This is an interesting set of questions. In principle, the answer is yes. Can you answer the questions in the case of your lattices?

The big issue is whether you can separate the underlying lattice machinery from the payoff algorithms. If the common data is read-only, then parallel processing is on the cards.
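A minimal sketch of that separation (all names here are illustrative, not taken from anyone's actual code): the lattice is built once and treated as read-only, and each payoff runs its own backward induction over it, so the payoffs can be dispatched in parallel.

```python
from concurrent.futures import ThreadPoolExecutor

def build_lattice(s0, u, d, steps):
    """Recombining binomial lattice of spot levels; built once, then read-only."""
    return [[s0 * u**j * d**(i - j) for j in range(i + 1)] for i in range(steps + 1)]

def price_on_lattice(lattice, payoff, p, disc):
    """Backward induction for one payoff; never mutates the shared lattice."""
    values = [payoff(s) for s in lattice[-1]]
    for level in reversed(lattice[:-1]):
        values = [disc * (p * values[j + 1] + (1 - p) * values[j])
                  for j in range(len(level))]
    return values[0]

# Shared read-only data; illustrative (not calibrated) parameters.
lattice = build_lattice(s0=100.0, u=1.01, d=1 / 1.01, steps=200)
p, disc = 0.5, 0.9998
payoffs = [lambda s, k=k: max(s - k, 0.0) for k in (90.0, 100.0, 110.0)]

# Threads suffice to show the structure; in CPython a ProcessPoolExecutor
# (with top-level functions) would give true parallelism despite the GIL.
with ThreadPoolExecutor() as pool:
    prices = list(pool.map(lambda f: price_on_lattice(lattice, f, p, disc), payoffs))
```

Since nothing writes to the shared lattice, no locking is needed; only the per-payoff value vectors are thread-local.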

But a more precise specification is needed.

Statistics: Posted by Cuchulainn — July 21st, 2017, 5:13 pm


Book 2 has been out for about a year now. If you're interested, there is pre-sales material and support at the link below: the full table of contents, code files, etc.

Statistics: Posted by Alan — July 21st, 2017, 2:14 pm


geodesic,

Alan wrote: Things get more interesting in 2D and up. I've got one chapter in a recent book that may interest you: Ch. 5, "Stochastic Volatility as a Hidden Markov Model", in the book "Option Valuation under Stochastic Volatility II". This turns out to be kind of a mixed lattice and continuum representation.

The virtue of convergent lattice methods is that they provide an explicit Markov chain approximation to a continuum process -- an approximation that has manifestly non-negative transition probabilities and weak convergence to the continuum process. PDE discretizations typically do not offer this interpretation.

If you've ever tried to solve a PDE for a tricky transition probability in 2D or higher, you've probably seen cases where convergence to a strictly positive result can be quite computationally demanding. And simply truncating a small negative PDE result to zero can sometimes be grossly wrong (in maximum likelihood estimation, for example). So, there is a role for both approaches.

I haven't seen that particular problem, but I have seen "bouncing ghosts", where diffusing probability bounces off the "grid wall" at "infinity". I've also seen the same thing happen off the faux wall at zero in spherical coordinates for a spherically symmetric function (where r is restricted to be non-negative). So yeah, I've definitely seen gotchas in PDE approaches, but I don't think I've ever solved a numerical PDE in finance -- only in physics (my dissertation research was solving and interpreting a complex-valued diffusion equation).

If you're the author of "Option Valuation under Stochastic Volatility I" may I just say how amazing your book is? I didn't know book 2 existed -- it seems like you covered so much in book 1!

Statistics: Posted by geodesic — July 21st, 2017, 12:25 pm


mtsm wrote: You are not asking the right question. Lattice methods are a poor man's PDEs.

Admittedly, lattice methods played a huge role in quantitative finance, as they brought Black-Scholes theory to ... poor men. And I am not saying they aren't being used; they are.

But beyond this, look no further than advanced books on PDEs and you shall find.

Also notice that you will find a lot more unpublished documentation at dealers than you will in textbooks and papers. That's not just true of lattice methods; it's generally the case. It's hard for people outside practitioner circles to even come up with the relevant problems, because you hit so many roadblocks in practical, industrial-strength applications that you either solve them or you just stop.

Thanks. If one wants to trade accuracy for speed (on the proviso that the accuracy is "good enough" for a risk system, if not a trading system), what are the benefits of recasting the problem as a PDE?

Let's say I have a few hundred convertible bonds that need to get valued every day for a daily market risk report. I'm not out to get dead-accurate trading accuracy. I just need to get them valued in a decent amount of time on a system that many people share to do valuations. What's the benefit of a PDE? Can a PDE system be "reused" for different instruments living on the same interest rate lattice?

Statistics: Posted by geodesic — July 21st, 2017, 12:14 pm


Cuchulainn wrote: Related to above, what do people think of this statement? To re-iterate, for the two-dimensional Black-Scholes PDE there is no consistent and monotone discretisation with fixed stencil on a uniform mesh, for any [$]\rho \ne 0[$].

Is the problem truly that [$]\rho \ne 0 [$], or that the diffusion coefficients become unbounded? If the statement is true, it needs a proof to clarify the unstated assumptions. For example, what are the computational domain and the numerical boundary conditions?

The quote is taken from this article

https://arxiv.org/pdf/1605.06348.pdf

The Holy Grail of FDM is finding monotone schemes for all correlation/diffusion/convection terms.
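The obstruction in the quoted statement can be seen directly from the stencil weights. A sketch of my own (just the textbook four-corner central difference for the cross term, not code from the article): discretising [$]\rho a_1 a_2 V_{xy}[$] this way puts weights of both signs on the corner nodes whenever [$]\rho \ne 0[$], so the off-diagonal coefficients cannot all be non-negative.

```python
def corner_weights(rho, a1, a2, h1, h2):
    """Corner weights (NE, NW, SE, SW) from central differencing the
    cross term rho * a1 * a2 * V_xy on a uniform h1 x h2 mesh."""
    w = rho * a1 * a2 / (4.0 * h1 * h2)
    return {"NE": +w, "NW": -w, "SE": -w, "SW": +w}

# Whatever the sign of rho, two corners carry negative weight: no M-matrix.
for rho in (-0.5, 0.5):
    ws = corner_weights(rho, a1=1.0, a2=1.0, h1=0.1, h2=0.1)
    assert min(ws.values()) < 0 < max(ws.values())
```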

1. For correlation, for the unprocessed BS PDE, constant meshes [$]h_1[$] and [$]h_2[$] will indeed not lead to an M-matrix (as elegantly shown by Samarskii/Shishkin/O'Riordan). However, the transform [$]x = \log S[$] does produce an M-matrix using their approximation to the correlation derivative [$]V_{xy}[$]. I have verified this mathematically for the Heston PDE and I expect a similar conclusion for the transformed BS PDE.

2. Convection can destroy monotonicity, but we can resolve this by 1) upwinding or, better, 2) exponential fitting.

3. Degenerate diffusion is also an issue of course, but Fichera/Feller theory comes to the rescue. And using the method of lines (MOL) makes life easier.
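As a sketch of point 2 (my own illustration of the standard Il'in/Allen-Southwell fitting, not code from this thread): the fitted scheme replaces the diffusion coefficient [$]\sigma[$] in [$]-\sigma u'' + b u'[$] by [$](bh/2)\coth(bh/(2\sigma))[$], which tends to [$]\sigma[$] as the cell Peclet number vanishes and to the upwind artificial diffusion [$]bh/2[$] in the convection-dominated limit, giving monotonicity in both regimes.

```python
import math

def fitted_diffusion(sigma, b, h):
    """Il'in/Allen-Southwell effective diffusion (b*h/2) * coth(b*h/(2*sigma))
    for the convection-diffusion operator -sigma*u'' + b*u' (requires b != 0)."""
    beta = b * h / (2.0 * sigma)          # cell Peclet number / 2
    return (b * h / 2.0) / math.tanh(beta)

# Diffusion-dominated limit: the fitted coefficient is essentially sigma itself.
assert abs(fitted_diffusion(1.0, 1.0, 1e-4) - 1.0) < 1e-6
# Convection-dominated limit: it tends to b*h/2, i.e. first-order upwinding.
assert abs(fitted_diffusion(1e-6, 1.0, 0.1) - 0.05) < 1e-6
```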

Statistics: Posted by Cuchulainn — July 20th, 2017, 12:43 pm


Cuchulainn wrote: Related to above, what do people think of this statement?

Is the problem truly that [$]\rho \ne 0 [$], or that the diffusion coefficients become unbounded? If the statement is true, it needs a proof to clarify the unstated assumptions. For example, what are the computational domain and the numerical boundary conditions?

Statistics: Posted by Alan — July 19th, 2017, 10:44 pm


This is a universal issue.

Non-monotone schemes can produce precisely the problems described. And the initial delta is tricky. I don't have Vol II here, but I reckon lattices track the exact solution very well; many FDM schemes do not.

So wrong.

Statistics: Posted by Cuchulainn — July 19th, 2017, 8:41 pm


Things get more interesting in 2D and up. I've got one chapter in a recent book that may interest you: Ch. 5 "Stochastic Volatility as a Hidden Markov Model" in the book "Option Valuation under Stochastic Volatility II". This turns out to be kind of a mixed lattice and continuum representation.

The virtue of convergent lattice methods is that they provide an explicit Markov chain approximation to a continuum process -- an approximation that has manifestly non-negative transition probabilities and weak convergence to the continuum process. PDE discretizations typically do not offer this interpretation.
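For a concrete instance (a minimal CRR sketch of my own, not taken from the book): one time step of the lattice is a two-state Markov transition whose entries are manifestly non-negative and sum to one, provided the time step is small enough.

```python
import math

# Illustrative CRR parameterisation of one lattice step.
r, sigma, dt = 0.05, 0.2, 1.0 / 252
u = math.exp(sigma * math.sqrt(dt))      # up factor
d = 1.0 / u                              # down factor
p = (math.exp(r * dt) - d) / (u - d)     # risk-neutral up-probability

# The step is an explicit Markov kernel: the row (p, 1 - p) of the
# transition matrix is non-negative and sums to 1.
assert 0.0 < p < 1.0
assert abs(p + (1.0 - p) - 1.0) < 1e-15
```

A PDE discretization offers no such guarantee: its weights may go negative while still converging to the right value.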

If you've ever tried to solve a PDE for a tricky transition probability in 2D or higher, you've probably seen cases where convergence to a strictly positive result can be quite computationally demanding. And simply truncating a small negative PDE result to zero can sometimes be grossly wrong (in maximum likelihood estimation, for example). So, there is a role for both approaches.

Statistics: Posted by Alan — July 19th, 2017, 7:53 pm


Admittedly, lattice methods played a huge role in quantitative finance, as they brought Black-Scholes theory to ... poor men. And I am not saying they aren't being used; they are.

But beyond this, look no further than advanced books on PDEs and you shall find.

Also notice that you will find a lot more unpublished documentation at dealers than you will in textbooks and papers. That's not just true of lattice methods; it's generally the case. It's hard for people outside practitioner circles to even come up with the relevant problems, because you hit so many roadblocks in practical, industrial-strength applications that you either solve them or you just stop.

Statistics: Posted by mtsm — July 19th, 2017, 1:17 pm


The books I've seen go on endlessly about the same topics: binomial, trinomial, CRR. I'm looking for an author who treats more advanced topics in this field.

Statistics: Posted by geodesic — July 19th, 2017, 5:31 am


Statistics: Posted by Cuchulainn — June 28th, 2017, 9:16 am


Chan, Jiun Hong and Joshi, Mark S. and Tang, Robert and Yang, Chao, Trinomial or Binomial: Accelerating American Put Option Price on Trees (September 1, 2008). Available at SSRN: https://ssrn.com/abstract=1261745 or http://dx.doi.org/10.2139/ssrn.1261745

Richardson extrapolation (RE) works if the payoff is smoothed or the tree is adapted to the strike.

Statistics: Posted by mj — June 28th, 2017, 1:17 am


Statistics: Posted by frolloos — June 20th, 2017, 6:59 pm


Anyone have a copy of "Very poor banksters' stochastic volatility model" by Andreasen & Huge, Danske Bank working paper, 2009?

Them?

Statistics: Posted by gatarek — June 20th, 2017, 10:36 am


Why not go the whole way and use Crank-Nicolson? It also allows you to calculate the Greeks in one sweep.

https://mhittesdorf.wordpress.com/2013/11/17/introducing-quantlib-american-option-pricing-with-dividends/

I don't see any (e.g. mathematical) reason to use trinomial trees rather than a decent FD scheme.
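To make the "one sweep" point concrete, here is a minimal Crank-Nicolson sketch for a European call (my own illustration; nothing here is taken from the linked post). Price, delta and gamma are all read off the same time-0 grid, with no extra solves.

```python
import math

def cn_call_grid(s_max, strike, r, sigma, T, m, n):
    """Crank-Nicolson for the Black-Scholes PDE on [0, s_max] with m space
    steps and n time steps; returns (grid of time-0 values, mesh width ds)."""
    ds, dt = s_max / m, T / n
    v = [max(i * ds - strike, 0.0) for i in range(m + 1)]              # payoff
    a = [0.25 * dt * (sigma**2 * i * i - r * i) for i in range(m + 1)]
    b = [-0.50 * dt * (sigma**2 * i * i + r) for i in range(m + 1)]
    c = [0.25 * dt * (sigma**2 * i * i + r * i) for i in range(m + 1)]
    for step in range(1, n + 1):
        tau = step * dt                                                # time to expiry
        lo, hi = 0.0, s_max - strike * math.exp(-r * tau)              # Dirichlet BCs
        rhs = [a[i] * v[i - 1] + (1 + b[i]) * v[i] + c[i] * v[i + 1]
               for i in range(1, m)]
        rhs[0] += a[1] * lo
        rhs[-1] += c[m - 1] * hi
        # Thomas algorithm for the tridiagonal system (-a, 1 - b, -c).
        cp, dp = [0.0] * (m - 1), [0.0] * (m - 1)
        cp[0] = -c[1] / (1 - b[1])
        dp[0] = rhs[0] / (1 - b[1])
        for k in range(1, m - 1):
            i = k + 1
            denom = (1 - b[i]) + a[i] * cp[k - 1]
            cp[k] = -c[i] / denom
            dp[k] = (rhs[k] + a[i] * dp[k - 1]) / denom
        new = [0.0] * (m + 1)
        new[0], new[m] = lo, hi
        new[m - 1] = dp[-1]
        for k in range(m - 3, -1, -1):
            new[k + 1] = dp[k] - cp[k] * new[k + 2]
        v = new
    return v, ds

v, ds = cn_call_grid(s_max=300.0, strike=100.0, r=0.05, sigma=0.2,
                     T=1.0, m=300, n=100)
i = round(100.0 / ds)                                  # node nearest spot 100
price = v[i]
delta = (v[i + 1] - v[i - 1]) / (2 * ds)               # Greeks from the same
gamma = (v[i + 1] - 2 * v[i] + v[i - 1]) / ds**2       # grid, one sweep
```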

//

One issue I've noticed: I coded Richardson extrapolation for the binomial method but did not get what I expected. I am rather sceptical of the mathematical applicability of RE to lattice models. You just cannot ignore discontinuous payoffs...

See

https://kluedo.ub.uni-kl.de/frontdoor/index/index/year/2010/docId/2166
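For what it's worth, the effect is easy to probe with a minimal CRR sketch (my own illustration, not the code referred to above). The classical Richardson step [$]2P_{2n} - P_n[$] presumes a clean [$]C/n[$] leading error term; the oscillatory convergence produced by an unsmoothed kinked payoff violates that assumption, so the extrapolated value need not improve on [$]P_{2n}[$].

```python
import math

def crr_call(s0, k, r, sigma, T, n):
    """Plain CRR binomial price of a European call."""
    dt = T / n
    u = math.exp(sigma * math.sqrt(dt))
    d = 1.0 / u
    p = (math.exp(r * dt) - d) / (u - d)
    disc = math.exp(-r * dt)
    v = [max(s0 * u**j * d**(n - j) - k, 0.0) for j in range(n + 1)]
    for i in range(n, 0, -1):
        v = [disc * (p * v[j + 1] + (1 - p) * v[j]) for j in range(i)]
    return v[0]

p1 = crr_call(100.0, 100.0, 0.05, 0.2, 1.0, 100)
p2 = crr_call(100.0, 100.0, 0.05, 0.2, 1.0, 200)
re_price = 2.0 * p2 - p1   # Richardson step, valid only if the error is ~ C/n
```

Comparing `p1`, `p2` and `re_price` against the closed-form Black-Scholes value for various strikes shows that the extrapolation is only reliable when the error expansion is smooth, e.g. after payoff smoothing or strike-adapted trees.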

Statistics: Posted by Cuchulainn — June 14th, 2017, 11:33 am
