
Radon-Nikodym theorem

Posted: February 1st, 2015, 7:19 pm
by Alan
Quote, originally posted by Cuchulainn:
    Quote: "I'm doubtful about applications."
    Maybe it has escaped everyone's attention... but the RN derivative _is_ the likelihood function in parameter estimation in SDEs. Or am I missing something here?

Exactly -- your point is under-appreciated.

Radon-Nikodym theorem

Posted: February 2nd, 2015, 3:42 am
by MiloRambaldi
Quote, originally posted by Cuchulainn:
    Quote: "I'm doubtful about applications."
I think the quote was referring to the Radon-Nikodym *Theorem*.

Quote, originally posted by Cuchulainn:
    Maybe it has escaped everyone's attention... but the RN derivative _is_ the likelihood function in parameter estimation in SDEs. Or am I missing something here?
It wasn't obvious to me. RN derivative between which two measures?

Radon-Nikodym theorem

Posted: February 2nd, 2015, 10:50 am
by Cuchulainn
Quote, originally posted by MiloRambaldi:
    It wasn't obvious to me. RN derivative between which two measures?

An example is in "Parameter Estimation for Multiscale Diffusions", G.A. Pavliotis, INRIA-MICMAC Project Team and Department of Mathematics, Imperial College London (Lecture 1: 14/02/2012, Lecture 2: 21/02/2012).

P_X == measure corresponding to the process X
P_W == measure corresponding to the Wiener process
dP_X/dP_W == the likelihood function, to be optimized for its parameters.

That's my understanding. Feels kind of important. I am sure Alan knows this inside out.

Radon-Nikodym theorem

Posted: February 2nd, 2015, 2:29 pm
by Alan
For continuous-time inference, you have to imagine the very idealized setting where you can observe an entire path of a diffusion: [$]\{X_t: 0 \le t \le T\}[$]. Then, suppose this path sample is generated by a parameterized SDE associated to a measure [$]P^{\theta}[$]: [$]dX_t = b(X_t;\theta)\, dt + \sigma(X_t)\, dW_t[$], where [$]W_t[$] is a [$]P^{\theta}[$]-BM. The idea is you want to make inferences about the drift parameter(s) [$]\theta[$], given the path sample.

To do this, the statistician would construct what one might call the `Statistician's Likelihood Ratio' [$]\Lambda_T[$], which is the ratio of the likelihood of observing the path under [$]P^{\theta}[$] to the likelihood of observing it under a reference system labeled [$]P^0[$], under which [$]dX_t = \sigma(X_t)\, dW_t[$]. This [$]\Lambda_T[$] can be shown to be the Girsanov Theorem likelihood ratio, i.e., the RN derivative [$]L_t = dP^{\theta}_t/dP^0_t[$], but expressed in the system under which it is *not* a martingale.

I have a whole chapter on Continuous-time Inference in my forthcoming `Option Valuation under Stochastic Volatility II', where I elaborate and have various worked (novel) examples. The basic idea above is a standard result -- but under-appreciated -- as I commented.
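For concreteness, under the usual integrability conditions this Girsanov likelihood ratio has the standard explicit form

[$]\Lambda_T = \exp\left( \int_0^T \frac{b(X_t;\theta)}{\sigma^2(X_t)}\, dX_t - \frac{1}{2}\int_0^T \frac{b^2(X_t;\theta)}{\sigma^2(X_t)}\, dt \right)[$],

and the maximum likelihood estimate of [$]\theta[$] maximizes [$]\log \Lambda_T[$] over the observed path.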

Radon-Nikodym theorem

Posted: February 11th, 2015, 9:44 am
by Cuchulainn
Quote, originally posted by Alan:
    For continuous-time inference, you have to imagine the very idealized setting where you can observe an entire path of a diffusion: [$]\{X_t: 0 \le t \le T\}[$]. [...] The basic idea above is a standard result -- but under-appreciated -- as I commented.

This is nice. When can we expect the book?

I have some comments, which can be seen as questions (to try to understand this stuff :-)) at the same time. IMO there are other scenarios/variants on your test case:

1. It is possible to do a Nelson-Ramaswamy transformation to get an SDE like dX = b(X;theta) dt + dW and apply RN to get the likelihood function as before. It might help make it more easily computable. But NR will break down for systems, while your approach will work.
2. Solve the nonlinear log-likelihood equations, usually by simulating the path of X (Euler, time series?) and thus solving for theta (asymptotically unbiased).
3. Sanity check: simulate an Euler path for X with a given theta; compute the MLE to see if you get theta back (see the sketch below).
4. The Neyman-Pearson lemma uses the quotient of two RN derivatives (theta_0 vs theta_1), so we can test hypotheses on theta using this lemma.
5. Your approach should work with Poisson processes as well, I reckon (Cont/Tankov).
6. The approach should work on a drift with two unknown parameters?
7. Is there any (pseudo) code floating around?
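A minimal sketch of the sanity check in point 3, assuming a hypothetical OU-type drift b(x; theta) = -theta*x with known constant sigma, for which the continuous-time MLE reduces to theta_hat = -(integral of X dX)/(integral of X^2 dt), discretized on the Euler grid:

[code]
import numpy as np

rng = np.random.default_rng(42)
theta_true, sigma = 2.0, 0.5   # hypothetical "true" values for the sanity check
T, n = 50.0, 200_000
dt = T / n

# Simulate an Euler-Maruyama path of dX = -theta*X dt + sigma dW
X = np.empty(n + 1)
X[0] = 1.0
dW = rng.normal(0.0, np.sqrt(dt), n)
for i in range(n):
    X[i + 1] = X[i] - theta_true * X[i] * dt + sigma * dW[i]

# Discretized continuous-time MLE for the drift parameter
dX = np.diff(X)
theta_hat = -np.sum(X[:-1] * dX) / (np.sum(X[:-1] ** 2) * dt)
print(f"theta_true = {theta_true}, theta_hat = {theta_hat:.3f}")
[/code]

With a long enough path (large T), theta_hat should land close to theta_true, which is exactly the check described in point 3.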

Radon-Nikodym theorem

Posted: February 11th, 2015, 5:22 pm
by Alan
Quote, originally posted by Cuchulainn:
    This is nice. When can we expect the book? [...] 7. Is there any (pseudo) code floating around?

Thanks, I'm working on the last chapter, so hopefully the book will be out by summer. It's easier to give the flavor of what I'm doing from the chapter, so I've prepared an excerpt, which is attached.

Re some questions not answered by the excerpt:
- I'll guess that if you do the NR transformation, you end up with the same point estimates for the drift parms as in the absence of the transformation. (Whether or not it simplifies some computations related to that, I'm not sure.)
- Although most chapters come with codes, this one doesn't have any. (My other chapter on inference is much more practical, less theoretical, and has detailed algorithms.)
- I haven't really thought about jump-diffusions here, as my issues in this chapter are related to diffusions.
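For reference (a standard form of the transformation mentioned in point 1, not taken from the excerpt): with [$]Y_t = F(X_t)[$] where [$]F(x) = \int^x du/\sigma(u)[$], Ito's lemma turns [$]dX_t = b(X_t;\theta)\, dt + \sigma(X_t)\, dW_t[$] into [$]dY_t = \left( \frac{b(X_t;\theta)}{\sigma(X_t)} - \frac{1}{2}\sigma'(X_t) \right) dt + dW_t[$], i.e., an SDE with unit diffusion coefficient, to which the same Girsanov likelihood machinery applies directly.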

Radon-Nikodym theorem

Posted: February 27th, 2015, 9:02 am
by savr
At this point it seems important to express a stern disagreement with the below (italicized):

Quote:
    Let D(t, s) == discount factor at s as seen from t (present value of one unit of currency)

when visibly there is everything future and not much present about the quantity D(t,s) so defined.

Radon-Nikodym theorem

Posted: February 28th, 2015, 10:45 am
by Cuchulainn
Quote, originally posted by Alan:
    For continuous-time inference, you have to imagine the very idealized setting where you can observe an entire path of a diffusion: [$]\{X_t: 0 \le t \le T\}[$]. [...] The basic idea above is a standard result -- but under-appreciated -- as I commented.

General question: would this topic be a good topic for an MSc/MFE thesis in computational finance?

Radon-Nikodym theorem

Posted: February 28th, 2015, 1:37 pm
by Alan
Sure -- parameter inference is always a good topic. There are various ways to take it.

First, there are computations fleshing out the continuous-time inference theory itself in particular models.

Second, in real life, even if the CT stochastic processes were valid, the observations are discretely spaced. So there is the issue of how the CT inference relations might be helpful for the associated discrete-time maximum likelihood theory. I didn't pursue that aspect, but it is a natural direction for somebody's thesis. In other words, the topic would be: what does CT inference teach us about real-life inference? There should be many computational opportunities in that.
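One concrete bridge between the two settings (a standard approximation, not from the chapter): with discrete observations [$]X_{t_i}[$] on a grid of spacing [$]\Delta[$], the Euler pseudo-log-likelihood [$]\sum_i \left[ -\frac{(X_{t_{i+1}} - X_{t_i} - b(X_{t_i};\theta)\Delta)^2}{2\sigma^2(X_{t_i})\Delta} - \frac{1}{2}\log\left(2\pi\sigma^2(X_{t_i})\Delta\right) \right][$] has the same [$]\theta[$]-dependent part, in the limit [$]\Delta \to 0[$], as the continuous-time log-likelihood [$]\log \Lambda_T[$], which is one way the CT theory informs discrete-time MLE.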

Radon-Nikodym theorem

Posted: March 3rd, 2015, 11:05 am
by Cuchulainn
Quote, originally posted by savr:
    ...when visibly there is everything future and not much present about the quantity D(t,s) so defined.

DF(t,T) == discount factor at t for maturity T.

Better?
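For illustration (an assumption about the intended convention, with a deterministic short rate [$]r[$]): [$]DF(t,T) = \exp\left(-\int_t^T r(s)\, ds\right)[$], the value at time [$]t[$] of one unit of currency paid at [$]T[$].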