- April 7th, 2019, 11:38 am
- Forum: Numerical Methods Forum
- Topic: Are Artificial Intelligence methods (AKA Neural Networks) for PDEs about to rediscover the wheel ?
- Replies: **327** - Views: **12604**

Here is a seminal paper on PDE with discontinuous payoff (initial condition). This was known in our group by 1974 (it's in the FEM book by Gil Strang and George Fix). https://wwwf.imperial.ac.uk/~ajacquie/IC_Num_Methods/IC_Num_Methods_Docs/Literature/DuffyCN.pdf So, I'm wondering what new revela...

- April 7th, 2019, 9:01 am
- Forum: Numerical Methods Forum
- Topic: Are Artificial Intelligence methods (AKA Neural Networks) for PDEs about to rediscover the wheel ?
- Replies: **327** - Views: **12604**

Then I'm still confused. Can you clarify the relation between BV functions, convergence rates and derivative payoffs? ISayMoo, you are not confused at all, you are clearly sharp-minded. This is a very pertinent question; it is exactly the same question as the one you asked concerning the constant...

- April 6th, 2019, 6:02 pm
- Forum: Numerical Methods Forum
- Topic: Are Artificial Intelligence methods (AKA Neural Networks) for PDEs about to rediscover the wheel ?
- Replies: **327** - Views: **12604**

OK, so just so that I get it straight:
1. call payoff (S - K)+ is not BV and there is NO sampling sequence for it which converges faster than 1/N
2. binary payoff Heaviside(S - K) is BV and there IS a sampling sequence for it which converges faster than 1/N
Correct? Nope: Correct for binary (thank,...
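The BV distinction debated above can be checked numerically. A minimal sketch, not from the thread (the strike, interval and grid size are illustrative assumptions): it approximates the total variation of each payoff as the sum of absolute increments on a fine grid. On a bounded interval both payoffs come out with finite total variation, which is part of what the disagreement turns on.

```python
import numpy as np

def total_variation(f, a, b, n=10_000):
    """Approximate the total variation of f on [a, b] as sum |f(x_{i+1}) - f(x_i)|."""
    x = np.linspace(a, b, n)
    return np.abs(np.diff(f(x))).sum()

K = 1.0
call   = lambda s: np.maximum(s - K, 0.0)       # (S - K)^+
binary = lambda s: np.where(s > K, 1.0, 0.0)    # Heaviside(S - K)

# Both are finite on the bounded interval [0, 2]:
print(total_variation(call, 0.0, 2.0))    # slope 1 over [K, 2] -> TV = 1
print(total_variation(binary, 0.0, 2.0))  # single unit jump    -> TV = 1
```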

- April 6th, 2019, 12:37 pm
- Forum: Numerical Methods Forum
- Topic: Are Artificial Intelligence methods (AKA Neural Networks) for PDEs about to rediscover the wheel ?
- Replies: **327** - Views: **12604**

>> And why is this train of thought relevant here? e.g. why do you want to differentiate a payoff. I was trying to explain to @ISayMoo that Calls are somehow one derivative smoother than Autocalls. Hence a sampling method, meaning a Monte Carlo-like method of the kind [$] \int_{R^D} P(x) d\mu(x) \sim \...
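The sampling method referred to above, approximating [$] \int_{R^D} P(x) d\mu(x) [$] by an equal-weight average of the payoff over draws from [$]\mu[$], can be sketched in a few lines. The model and all parameters here (lognormal Black-Scholes terminal law, S0, K, sigma, T) are illustrative assumptions, not taken from the post:

```python
import numpy as np

rng = np.random.default_rng(0)

# E[P(X)] ~ (1/N) * sum_i P(x_i), with x_i drawn from the measure mu.
# Here mu is the terminal lognormal law of Black-Scholes (illustrative parameters).
S0, K, r, sigma, T = 100.0, 100.0, 0.0, 0.2, 1.0
N = 1_000_000
Z = rng.standard_normal(N)
S_T = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * Z)

call_price   = np.exp(-r * T) * np.maximum(S_T - K, 0.0).mean()   # call payoff
binary_price = np.exp(-r * T) * (S_T > K).mean()                  # binary payoff
print(call_price, binary_price)
```

The same averaging works for any payoff P; only the smoothness of P changes the attainable convergence rate, which is the point being argued in the thread.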

- April 6th, 2019, 11:32 am
- Forum: Numerical Methods Forum
- Topic: Are Artificial Intelligence methods (AKA Neural Networks) for PDEs about to rediscover the wheel ?
- Replies: **327** - Views: **12604**

Fair enough. Aka indicator/Kronecker function. https://en.wikipedia.org/wiki/Dirac_measure The question is still: "the derivative of a call option is a barrier option". Is that what you mean? (x-K)^+ ' = {0, x < K, 1, x > K}. Can't I call this a barrier option? I only see the derivative of a call p...

- April 6th, 2019, 11:30 am
- Forum: Numerical Methods Forum
- Topic: Are Artificial Intelligence methods (AKA Neural Networks) for PDEs about to rediscover the wheel ?
- Replies: **327** - Views: **12604**

There's at least one person from the AI community here who's trying to tell you that you're mistaken about the equivalence and explaining clearly why (however difficult it is to pin down what you mean). I am trying to argue with you, on a mathematical basis, concerning this equivalence. But maybe s...

- April 6th, 2019, 11:01 am
- Forum: Numerical Methods Forum
- Topic: Are Artificial Intelligence methods (AKA Neural Networks) for PDEs about to rediscover the wheel ?
- Replies: **327** - Views: **12604**

(x-K)^+ ' = {0, x < K, 1, x > K}. Can't I call this a barrier option? Fair enough. Aka indicator/Kronecker function.

https://en.wikipedia.org/wiki/Dirac_measure

The question is still: "the derivative of a call option is a barrier option". Is that what you mean?

- April 6th, 2019, 10:03 am
- Forum: Numerical Methods Forum
- Topic: Are Artificial Intelligence methods (AKA Neural Networks) for PDEs about to rediscover the wheel ?
- Replies: **327** - Views: **12604**

What is a bounded variation function? It is basically a function whose derivative is a measure (even if it is a little bit more complex than that). Consider a call payoff (x-K)^+. Take its derivative: {0, x < K, 1, x > K}, that is a Heaviside function, also called a barrier. Then take a second derivative:...
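The chain described above, call payoff -> Heaviside step -> Dirac mass at the strike, can be seen with finite differences. A small sketch; the strike and the grid step are illustrative assumptions:

```python
import numpy as np

K, h = 1.0, 1e-3
x = np.arange(0.0, 2.0, h)
payoff = np.maximum(x - K, 0.0)   # call payoff (x - K)^+

d1 = np.diff(payoff) / h          # ~ Heaviside(x - K): 0 to the left of K, 1 to the right
d2 = np.diff(d1) / h              # ~ Dirac mass at K: a single spike of height ~ 1/h

print(d1[100], d1[-100])          # far left ~ 0, far right ~ 1
print(d2.max() * h)               # the spike integrates to ~ 1, like the measure delta_K
```

As the grid is refined (h -> 0) the second difference blows up pointwise but keeps unit mass, which is exactly the "derivative is a measure" statement.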

- April 5th, 2019, 6:21 pm
- Forum: Numerical Methods Forum
- Topic: Are Artificial Intelligence methods (AKA Neural Networks) for PDEs about to rediscover the wheel ?
- Replies: **327** - Views: **12604**

It takes being a physicist to know that translating every problem to PDEs is not necessarily a good approach ;-) I have found some papers that could interest you, emanating from the Artificial Intelligence community, pointing out the link between PDE and neural network methods. For instance this o...

- April 5th, 2019, 6:02 pm
- Forum: Numerical Methods Forum
- Topic: Are Artificial Intelligence methods (AKA Neural Networks) for PDEs about to rediscover the wheel ?
- Replies: **327** - Views: **12604**

A call option does not have a gradient... it has a derivative which is a Heaviside function???? I see a disconnect between ISM's question and your answer (caveat: I have a bad dose of flu...) // Off-topic: I recommend FW in French (published by Gallimard): Finnegan le Constructeur, stathouder de sa m...

- April 5th, 2019, 11:19 am
- Forum: Numerical Methods Forum
- Topic: Are Artificial Intelligence methods (AKA Neural Networks) for PDEs about to rediscover the wheel ?
- Replies: **327** - Views: **12604**

You seem to be contradicting your LinkedIn post now (or I don't understand something). In it you wrote that "a bounded variation function, a function class for which we know that the convergence rates of a sampling method can not exceed 1 / N, not 1/N^2." The payoff of a call option, (S - K)+, is a...

- April 5th, 2019, 11:16 am
- Forum: Numerical Methods Forum
- Topic: Are Artificial Intelligence methods (AKA Neural Networks) for PDEs about to rediscover the wheel ?
- Replies: **327** - Views: **12604**

Well, my criticism was even simpler: it's just not accurate or correct to say "0.2% (relative error on price), which corresponds to a convergence factor of rate 1 / N with N = 512" even if, in fact, 1/512 ~= 0.002. You CANNOT estimate the convergence rate from a single value of N. Either you prove i...
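The point that a rate cannot be read off a single N can be made concrete by measuring error at several sample sizes and fitting the slope of log error against log N. A toy sketch with a known answer (the integrand E[Z^2] = 1 and the sample sizes are my choices, not from the post):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy integrand with known mean: E[Z^2] = 1 for Z ~ N(0, 1).
def mc_error(N, reps=200):
    """Average absolute MC error over `reps` independent runs of size N."""
    Z = rng.standard_normal((reps, N))
    return np.abs((Z**2).mean(axis=1) - 1.0).mean()

Ns = np.array([2**k for k in range(8, 15)])
errs = np.array([mc_error(N) for N in Ns])

# Empirical rate = slope of log(error) vs log(N).
slope = np.polyfit(np.log(Ns), np.log(errs), 1)[0]
print(round(slope, 2))
```

The fitted slope comes out close to -0.5, i.e. plain MC converges like 1/sqrt(N); a single (N, error) pair is consistent with any rate at all.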

- April 5th, 2019, 11:13 am
- Forum: Numerical Methods Forum
- Topic: Are Artificial Intelligence methods (AKA Neural Networks) for PDEs about to rediscover the wheel ?
- Replies: **327** - Views: **12604**

Computational time is very important for people who need to deliver risk numbers to the boss at 7am every morning. These are the people you're selling your stuff to. And you didn't answer my question (again :D) - did you run the MC pricer separately for every option, or once for the whole portfolio...
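The portfolio-versus-per-option question above matters precisely for the 7am risk run: simulating paths once and reusing them for every instrument makes the expensive part (path generation) independent of the book size. A sketch under assumed Black-Scholes dynamics (all parameters and the strike grid are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulate terminal prices ONCE, then price every option in the book off the
# same paths; only the cheap payoff evaluation is per-option.
S0, r, sigma, T, N = 100.0, 0.0, 0.2, 1.0, 500_000
S_T = S0 * np.exp((r - 0.5 * sigma**2) * T
                  + sigma * np.sqrt(T) * rng.standard_normal(N))

strikes = np.arange(80.0, 125.0, 5.0)
disc = np.exp(-r * T)
prices = {K: disc * np.maximum(S_T - K, 0.0).mean() for K in strikes}
for K, p in prices.items():
    print(K, round(p, 3))
```

A side benefit of sharing paths is that prices across strikes are perfectly correlated, so the estimated price curve is monotone in K path-by-path rather than only in expectation.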

- April 5th, 2019, 10:01 am
- Forum: Numerical Methods Forum
- Topic: Are Artificial Intelligence methods (AKA Neural Networks) for PDEs about to rediscover the wheel ?
- Replies: **327** - Views: **12604**

your method has the same convergence properties as Sobol numbers... Interesting remark. Yes, you are correct, these methods also converge at rate (ln N)^{D-1}/N if you consider a kernel generating the functional space corresponding to the Koksma-Hlawka inequality. Don't expect a miracle: Koksma-Hla...
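A quick way to see the Sobol behaviour invoked above is to compare plain MC against a scrambled Sobol sequence on a smooth integrand with known integral, using scipy.stats.qmc; the integrand and sample size here are my choices, not from the post:

```python
import numpy as np
from scipy.stats import qmc

# f has exact integral 1 over [0, 1]^2: each factor exp(u) - (e - 1)
# integrates to 0, so the product integrates to 0 and we add 1.
f = lambda u: np.prod(np.exp(u) - (np.e - 1.0), axis=1) + 1.0

N = 2**12
rng = np.random.default_rng(3)
err_mc = abs(f(rng.random((N, 2))).mean() - 1.0)            # plain Monte Carlo

sob = qmc.Sobol(d=2, scramble=True, seed=3).random(N)        # scrambled Sobol
err_qmc = abs(f(sob).mean() - 1.0)
print(err_mc, err_qmc)
```

For smooth, bounded-variation integrands like this one the Sobol error is typically orders of magnitude below the MC error at the same N; for the discontinuous payoffs discussed earlier in the thread the Koksma-Hlawka bound degrades, which is the "don't expect a miracle" caveat.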
