
 
User avatar
JohnLeM
Topic Author
Posts: 379
Joined: September 16th, 2008, 7:15 pm

Numerical results concerning the curse of dimensionality

June 23rd, 2016, 6:36 am

...are accessible here or here. [Edit: linked the submitted arXiv paper.] Any comments from this site are welcome. These are first results: small results, but a big research time.
Last edited by JohnLeM on July 3rd, 2016, 10:00 pm, edited 1 time in total.
 
User avatar
Cuchulainn
Posts: 22933
Joined: July 16th, 2004, 7:38 am

Numerical results concerning the curse of dimensionality

June 23rd, 2016, 9:01 am

Quote, originally posted by JohnLeM:
"...are accessible here or here. These are first results, small, but research time, big (ten years :( ). Any comments welcomed from this site."

Link fixed.
 
User avatar
JohnLeM
Topic Author
Posts: 379
Joined: September 16th, 2008, 7:15 pm

Numerical results concerning the curse of dimensionality

June 23rd, 2016, 9:33 am

Oops... thanks for the fix, Cuchulainn.
Last edited by JohnLeM on June 22nd, 2016, 10:00 pm, edited 1 time in total.
 
User avatar
Cuchulainn
Posts: 22933
Joined: July 16th, 2004, 7:38 am

Numerical results concerning the curse of dimensionality

July 2nd, 2016, 8:36 am

I read the paper, but I find it hard to see how the curse has been exorcised. Maybe a second, scoped version is needed with, let's say, a 3d FP PDE, worked from A to Z. With PDEs the complexity is exponential in the dimension, so I don't see how to get round that problem.
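The grid-size explosion referred to here is easy to make concrete. A minimal sketch (a generic illustration only; the 100 points per axis is an arbitrary assumption, not a figure from the paper):

```python
# Node count of a Cartesian PDE grid: N points per axis in d dimensions
# gives N**d nodes, i.e. exponential growth in the dimension d.
def cartesian_nodes(points_per_axis: int, dim: int) -> int:
    return points_per_axis ** dim

for d in (1, 2, 3, 6, 10):
    print(d, cartesian_nodes(100, d))
# At 100 points per axis, d = 10 already needs 10**20 nodes,
# far beyond what any grid-based solver can store.
```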
Last edited by Cuchulainn on July 1st, 2016, 10:00 pm, edited 1 time in total.
 
User avatar
JohnLeM
Topic Author
Posts: 379
Joined: September 16th, 2008, 7:15 pm

Numerical results concerning the curse of dimensionality

July 3rd, 2016, 6:10 pm

Quote, originally posted by Cuchulainn:
"I read the paper but I find it hard to see how the curse has been exorcised. Maybe a second, scoped version is needed with let's say a 3d FP PDE and then from A to Z. With PDE complexity is exponential, so I don't see how to get round that problem."

Hello, Cuchulainn. You're right: this is called the curse of dimensionality, a problem identified by Bellman 60 years ago. However, this paper claims to have slaughtered it: the complexity is no longer exponential, but cubic in the number of risk sources. This result could change a few things. We are talking about mathematics, an exact science: everything stated in this paper can be checked. To be more precise, we are today looking for partners who could help us check this result on concrete applications. Do not hesitate to forward this request :)
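To put the cubic claim in perspective, some simple arithmetic (assuming "cubic" means roughly d**3 work in the number of risk sources d, and taking 50 grid points per axis as an arbitrary example; both are assumptions for illustration):

```python
# Contrast the two scalings in the dimension d (orders of magnitude only;
# constant factors and the exact meaning of "cubic" are assumptions here).
def exponential_cost(d: int, points_per_axis: int = 50) -> int:
    # Grid-based PDE solver: one unknown per Cartesian grid node.
    return points_per_axis ** d

def cubic_cost(d: int) -> int:
    # The scaling claimed in the thread, up to constant factors.
    return d ** 3

for d in (3, 5, 10):
    print(d, exponential_cost(d), cubic_cost(d))
```

Even at d = 3 the gap is five orders of magnitude, which is why a genuinely cubic method would be a major result.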
Last edited by JohnLeM on July 2nd, 2016, 10:00 pm, edited 1 time in total.
 
User avatar
Cuchulainn
Posts: 22933
Joined: July 16th, 2004, 7:38 am

Numerical results concerning the curse of dimensionality

July 3rd, 2016, 7:08 pm

Quote, originally posted by JohnLeM (replying to Cuchulainn's post above):
"Cuchulainn, hello. You're right, this is called the curse of dimensionality, a problem identified by Bellman 60 years ago. However this paper claims having slaughtered it: it is not exponential anymore, but cubic in the number of risk sources. This result could change some little things. We are talking about mathematics, an exact science. All that is stated in this paper can be checked. To be more precise, we are looking today for partners that could help us check this result with concrete applications. Do not hesitate to forward this request :)"

Sure, I know that Bellman phrased it. And I know that mathematics (BTW, some would argue that it is not a science) is precise. But it does not help my understanding of the paper. Maybe I am missing something, but in which part/formula does the 'magic' happen? Is it something along the lines of how Longstaff/Schwartz do it? Maybe you could expand on (4), because that seems to be the essence of the model, yes?

Quote: "All that is stated in this paper can be checked."

How can it be checked? I think an algorithm or pseudocode is needed, at the very least, especially for someone (like me) who does not have the necessary background. I don't understand the last 4 lines and eq. (4) of section 2.1. It would be nice to see a 2-factor transported map worked out in the same way as has been done with GBM and CIR (April 2014 paper), to see what the numerics look like. If complexity is now cubic, as you claim, then this would be a major result indeed. One sure way is to provide the source code and we can test it ourselves.
Last edited by Cuchulainn on July 2nd, 2016, 10:00 pm, edited 1 time in total.
 
User avatar
JohnLeM
Topic Author
Posts: 379
Joined: September 16th, 2008, 7:15 pm

Numerical results concerning the curse of dimensionality

July 3rd, 2016, 8:55 pm

Hello again. (4) is phrasing the problem. It is in effect stating two problems:
- We are not given the Kolmogorov operator.
- Even if we knew this operator, the problem is stated with a big number of risk sources, and numerical PDE methods cannot compute it due to the curse of dimensionality (CoD).

The first problem has already been solved in this draft paper: if you are able to sample your process with a Monte-Carlo method (that is, if you know a quantile of your process), then I can compute this operator for you. If you don't know how to compute a quantile, I can compute it for you using the calibration algorithm presented in the paper.

I did my best to describe how the second problem is tackled in the arXiv paper; please tell me if it was unclear. The main idea is as follows: we know that Monte-Carlo methods are not affected by the CoD, because they use random sequences. PDE methods rely on grids, usually Cartesian ones. If you are able to compute a PDE over a randomly generated grid, then you kill the CoD; the idea is as simple as that. Obviously, computing a PDE over randomly generated grids is quite a headache: all the know-how is there, and it took me some years to be able to compute with such grids. Furthermore, as pointed out in another thread, it is not perfect, but to my perception it already works: I know that the scheme is stable and convergent.

Concerning your test requests, I am already facing other demands, and since I am alone to do the job, it might take some time to answer all of them, so we need to prioritize. I have already tested thoroughly with log-normals, but I agree with you: a good way to be convincing could be to test against known 1D and 2D processes such as CIR, Heston or SABR, where benchmarks are known (note that for 1D processes, this 2009 paper already presented results and benchmarks using this method), and then to raise the dimension with special cases. The point is that above 3 dimensions there are no known benchmarks... that's why the paper proposed a first one in higher dimensions. Hope this sheds some light.
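The dimension-independence of Monte-Carlo sampling that the argument leans on can be checked directly. A minimal sketch (a standard textbook illustration, not the paper's algorithm):

```python
import random

def mc_mean(f, dim, n, seed=0):
    """Plain Monte-Carlo estimate of E[f(X)] for X ~ Uniform([0,1]^dim).

    The cost is n evaluations of f whatever dim is: there is no grid,
    hence no exponential blow-up in the dimension."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = [rng.random() for _ in range(dim)]
        total += f(x)
    return total / n

# E[x_1 + ... + x_d] over the unit cube is d/2 in every dimension d;
# the same 20000 samples give a good estimate whether d is 2 or 50.
for d in (2, 10, 50):
    print(d, round(mc_mean(sum, d, 20000), 2))
```

The statistical error decays like O(1/sqrt(n)) regardless of dim, which is exactly why random point sets are attractive as PDE "grids" in high dimension.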
Last edited by JohnLeM on July 3rd, 2016, 10:00 pm, edited 1 time in total.
 
User avatar
Cuchulainn
Posts: 22933
Joined: July 16th, 2004, 7:38 am

Numerical results concerning the curse of dimensionality

July 5th, 2016, 6:01 am

A good 'litmus test' is an unambiguous description of the algorithms that can be handed to a developer who will program it up. Even better, IMO, is to make your software open source (GitHub/QuantLib) and let others add value to it.
Last edited by Cuchulainn on July 4th, 2016, 10:00 pm, edited 1 time in total.
 
User avatar
JohnLeM
Topic Author
Posts: 379
Joined: September 16th, 2008, 7:15 pm

Numerical results concerning the curse of dimensionality

July 5th, 2016, 1:58 pm

Hello, Cuchulainn. I agree with you: open science is better and more efficient. However, I can't open it up any more than this, sorry.
Last edited by JohnLeM on July 5th, 2016, 10:00 pm, edited 1 time in total.