wilmott.com Serving the Quantitative Finance Community 2021-06-17T20:04:51 https://forum.wilmott.com/feed 2021-06-17T20:04:51 2021-06-17T20:04:51 https://forum.wilmott.com/viewtopic.php?t=102281&p=866764#p866764 <![CDATA[Off Topic • Re: Bad things that come out of this pandemic]]>

Statistics: Posted by tagoma — 41 minutes ago

]]>
2021-06-17T19:59:02 2021-06-17T19:59:02 https://forum.wilmott.com/viewtopic.php?t=95212&p=866763#p866763 <![CDATA[General Forum • Re: Philosophy of Mathematics]]>
Newton and Euler were the same person??
The acorn doesn’t fall far from the tree.
But Lagrange is closer.

Statistics: Posted by Cuchulainn — 47 minutes ago

]]>
2021-06-17T18:42:44 2021-06-17T18:42:44 https://forum.wilmott.com/viewtopic.php?t=99702&p=866762#p866762 <![CDATA[Technical Forum • Re: Breakthrough in the theory of stochastic differential equations and their simulation]]> I suppose that we have already done a Cholesky or SVD decomposition of the correlation matrix and have written each of the N SDEs with its volatility coefficients stated explicitly after the decomposition. I will write just one of the N SDEs in the basket as

[$]dX_1 = \mu_1 \, {X_1}^{\beta_1} \, dt + \sum_{n=1}^{N} \, \sigma_{1n} \, {X_1}^{\gamma_1} \, dZ_n[$]

In particular, note that [$]\sum_{n=1}^{N} \, {\sigma_{1n}}^2 \, = \, {\sigma_1}^2 [$].
Please note that after the Cholesky or SVD decomposition, all of the above Brownian motions [$]Z_n[$] are orthogonal.

Since I am playing with only one SDE, I will drop the subscript "1" from each variable (except the volatility coefficients [$]\sigma_{1n}[$]), since it is unnecessary when dealing with a single SDE.
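As a quick illustration of the setup (my sketch, not the post's code; the 3-asset correlation matrix and vols are hypothetical), the Cholesky step and the identity [$]\sum_{n=1}^{N} {\sigma_{1n}}^2 = {\sigma_1}^2[$] look like:

```python
# Sketch (hypothetical 3-asset example): Cholesky of the correlation matrix
# gives loadings sigma[i, n] on the orthogonal Brownian motions Z_n.
import numpy as np

corr = np.array([[1.0, 0.5, 0.3],
                 [0.5, 1.0, 0.4],
                 [0.3, 0.4, 1.0]])
vols = np.array([0.20, 0.30, 0.25])        # sigma_i of each asset

L = np.linalg.cholesky(corr)               # corr = L @ L.T (lower triangular)
sigma = vols[:, None] * L                  # sigma[i, n] multiplies dZ_n in SDE i

# Row-wise identity from the post: sum_n sigma_{in}^2 = sigma_i^2
print(np.allclose((sigma ** 2).sum(axis=1), vols ** 2))   # True
```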

Expanding the above SDE as we have previously done, and applying Ito's lemma repeatedly, we write the equation as
[$]dX = \mu \, {X}^{\beta} \, \int_0^t ds + \, {X}^{\gamma} \, \sum_{n=1}^{N} \, \sigma_{1n} \, \int_0^t dZ_n(s) [$]
[$]+(\mu \, \beta {X}^{\beta-1} ) \, (\mu \, {X}^{\beta}) \, \int_0^t \int_0^s dv \, ds[$]
[$]+(\mu \, \beta {X}^{\beta-1} ) \, ({X}^{\gamma} ) \, \sum_{n=1}^{N} \, \sigma_{1n} \, \int_0^t \int_0^s dZ_n(v) ds[$]
[$]+.5 (\mu \, \beta \, (\beta-1) \, {X}^{\beta-2} ) \, ({X}^{2 \gamma} ) \, \sum_{n=1}^{N} \, {\sigma_{1n}}^2 \, \int_0^t \int_0^s dv \, ds[$]
[$]+(\gamma {X}^{\gamma-1} ) \, (\mu \, {X}^{\beta}) \, \sum_{n=1}^{N} \, \sigma_{1n} \, \int_0^t \int_0^s dv \, dZ_n(s)[$]
[$]+(\gamma {X}^{\gamma-1} ) \, ({X}^{\gamma}) \, \sum_{n=1}^{N} \, \sum_{m=1}^{N} \, \sigma_{1m} \, \sigma_{1n} \, \int_0^t \int_0^s dZ_m(v) \, dZ_n(s)[$]
[$]+.5 (\gamma (\gamma-1) \, {X}^{\gamma-2} ) \, ({X}^{2 \gamma}) \, \sum_{n=1}^{N} \, \sum_{m=1}^{N} \, {\sigma_{1m}}^2 \, \sigma_{1n} \, \int_0^t \int_0^s dv \, dZ_n(s)[$]

Since [$]\sum_{n=1}^{N} \, {\sigma_{1n}}^2 \, = \, {\sigma_1}^2 [$] in the quadratic-variation terms, we can slightly simplify the above equation: this identity turns the double summations in the quadratic-variation terms into single summations, and we write the equation again as

[$]dX = \mu \, {X}^{\beta} \, \int_0^t ds +\, {X}^{\gamma} \, \sum_{n=1}^{N} \, \sigma_{1n} \, \int_0^t dZ_n(s) [$]
[$]+(\mu \, \beta {X}^{\beta-1} ) \, (\mu \, {X}^{\beta}) \, \int_0^t \int_0^s dv \, ds[$]
[$]+(\mu \, \beta {X}^{\beta-1} ) \, ({X}^{\gamma} ) \, \sum_{n=1}^{N} \, \sigma_{1n} \, \int_0^t \int_0^s dZ_n(v) ds[$]
[$]+.5 (\mu \, \beta \, (\beta-1) \, {X}^{\beta-2} ) \, ({X}^{2 \gamma} ) \, {\sigma_1}^2 \, \int_0^t \int_0^s dv \, ds[$]
[$]+(\gamma {X}^{\gamma-1} ) \, (\mu \, {X}^{\beta}) \, \sum_{n=1}^{N} \, \sigma_{1n} \, \int_0^t \int_0^s dv \, dZ_n(s)[$]
[$]+(\gamma {X}^{\gamma-1} ) \, ({X}^{\gamma}) \, \sum_{n=1}^{N} \, \sum_{m=1}^{N} \, \sigma_{1m} \, \sigma_{1n} \, \int_0^t \int_0^s dZ_m(v) \, dZ_n(s)[$]
[$]+.5 (\gamma (\gamma-1) \, {X}^{\gamma-2} ) \, ({X}^{2 \gamma}) \, {\sigma_1}^2\, \sum_{n=1}^{N} \, \sigma_{1n} \, \int_0^t \int_0^s dv \, dZ_n(s)[$]

We can solve all of the above iterated integrals analytically. Note that the first representation below holds in distribution (both sides have variance [$]t^3/3[$]):
[$]\int_0^t \int_0^s dZ_n(v) ds= \frac{1}{\sqrt{3}} t Z_n(t)[$]
[$]\int_0^t \int_0^s dZ_n(v) dZ_n(s)= \frac{1}{2} ({Z_n(t)}^2-t)=H_2(Z_n(t))[$]
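A quick Monte Carlo sanity check (my code, not the post's) of the second identity, [$]\int_0^t Z_n(s)\, dZ_n(s) = H_2(Z_n(t))[$], via the left-point Ito sum on a single simulated path:

```python
# Monte Carlo sanity check (mine, not the post's) of
# int_0^t int_0^s dZ_n(v) dZ_n(s) = (Z_n(t)^2 - t)/2 = H2(Z_n(t)).
import numpy as np

rng = np.random.default_rng(0)
t, m = 1.0, 100_000
dt = t / m
dZ = rng.standard_normal(m) * np.sqrt(dt)  # Brownian increments
Z = np.concatenate(([0.0], np.cumsum(dZ))) # path of Z_n

ito_sum = np.sum(Z[:-1] * dZ)              # left-point (Ito) Riemann sum
h2 = 0.5 * (Z[-1] ** 2 - t)                # H2(Z_n(t))
print(abs(ito_sum - h2) < 0.05)            # True up to discretization error
```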
The only difficult integral is
[$]\int_0^t \int_0^s dZ_m(v) dZ_n(s)= Z_n(t) \sqrt{[{Z_m(t)}^2-t] (1-\frac{\sqrt{2}}{2}) +\frac{t}{2} }[$]
We obtain this integral from the Ito isometry as
[$]\int_0^t \int_0^s dZ_m(v) dZ_n(s)[$]
[$]=\int_0^t Z_m(s) dZ_n(s)[$]
and its variance is given as
[$]=\int_0^t {Z_m(s)}^2 ds[$]
[$]=\int_0^t d[{Z_m(s)}^2 s]-\int_0^t s d[{Z_m(s)}^2][$]
[$]= t \, {Z_m(t)}^2 - \int_0^t 2 s \, Z_m(s) \, dZ_m(s)- \int_0^t s \, ds [$]
[$]= t \, H_2(Z_m(t)) (1-\frac{\sqrt{2}}{2}) + t^2/2[$]
Its distributional representation is then given as
[$] \sqrt{(H_2(Z_m(t)) (1-\frac{\sqrt{2}}{2}) + t/2)} \, \sqrt{t} \, N_n[$]
where [$]N_n[$] is the standard normal associated with Brownian motion [$]Z_n[$].
Writing [$]\sqrt{t} \, N_n = Z_n(t)[$] in the above equation, we get
[$] \sqrt{(H_2(Z_m(t)) (1-\frac{\sqrt{2}}{2}) + t/2)} \, Z_n(t)[$]
[$]= Z_n(t) \sqrt{[{Z_m(t)}^2-t] (1-\frac{\sqrt{2}}{2}) +\frac{t}{2} }[$]

So we can write the integral as
[$]\int_0^t \int_0^s dZ_m(v) dZ_n(s)= Z_n(t) \sqrt{[{Z_m(t)}^2-t] (1-\frac{\sqrt{2}}{2}) +\frac{t}{2} }[$]

We can now solve all the integrals and easily simulate a basket option to second order in the Ito-Taylor expansion.
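As a minimal scaffold for such a simulation (my sketch, not the post's code: first-order Euler only, with hypothetical two-asset parameters; the second-order terms above would be added to the step analogously), the basket loop with orthogonalized drivers looks like:

```python
# Minimal scaffold (mine; Euler/first order, hypothetical parameters) for
# dX_i = mu_i X_i^beta_i dt + X_i^gamma_i sum_n sigma_{in} dZ_n,
# with orthogonal drivers Z_n obtained from the Cholesky step.
import numpy as np

rng = np.random.default_rng(1)
corr = np.array([[1.0, 0.5],
                 [0.5, 1.0]])
vols = np.array([0.20, 0.30])
sigma = vols[:, None] * np.linalg.cholesky(corr)   # loadings sigma[i, n]

mu = np.array([0.05, 0.03])
beta = np.array([0.80, 0.90])
gamma = np.array([0.70, 0.85])

X = np.array([100.0, 100.0])               # initial asset levels
t, m = 1.0, 500
dt = t / m
for _ in range(m):
    dZ = rng.standard_normal(2) * np.sqrt(dt)      # orthogonal increments
    X = X + mu * X ** beta * dt + X ** gamma * (sigma @ dZ)

basket = X.mean()                          # equally weighted basket level
```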

Statistics: Posted by Amin — Today, 6:42 pm

]]>
2021-06-17T16:59:33 2021-06-17T16:59:33 https://forum.wilmott.com/viewtopic.php?t=95212&p=866761#p866761 <![CDATA[General Forum • Re: Philosophy of Mathematics]]> Statistics: Posted by katastrofa — Today, 4:59 pm

]]>
2021-06-17T15:47:40 2021-06-17T15:47:40 https://forum.wilmott.com/viewtopic.php?t=95212&p=866760#p866760 <![CDATA[General Forum • Re: Philosophy of Mathematics]]>
https://en.wikipedia.org/wiki/Newton%27s_method

It is an iterative method and the nasty surprise is that it is the Euler method!

I suppose that the old-timers didn't have recourse to ODE solvers? Same with Cauchy's gradient descent.
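A minimal sketch of the observation: the Newton iteration is forward Euler with step h = 1 applied to the continuous Newton flow [$]dx/dt = -f(x)/f'(x)[$] (the example function is mine):

```python
# Sketch: Newton's method is forward Euler with step h = 1 on the
# continuous Newton flow dx/dt = -f(x) / f'(x).
def euler_newton(f, fprime, x0, h=1.0, steps=20):
    x = x0
    for _ in range(steps):
        x = x - h * f(x) / fprime(x)   # h = 1 gives the classical Newton step
    return x

# Example (mine): root of f(x) = x^2 - 2
root = euler_newton(lambda x: x * x - 2.0, lambda x: 2.0 * x, x0=1.0)
print(abs(root - 2 ** 0.5) < 1e-12)    # True
```

Taking h < 1 gives damped Newton, i.e. a smaller Euler step along the same flow.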

Statistics: Posted by Cuchulainn — Today, 3:47 pm

]]>
2021-06-17T12:23:58 2021-06-17T12:23:58 https://forum.wilmott.com/viewtopic.php?t=101120&p=866759#p866759 <![CDATA[Off Topic • Re: Apocalypse (Revelation) Biblical explanation.]]>
I suspect his "Goodbye" will be followed by "And another thing..."
I don't think that word means what he thinks it means.
He's a bit like Columbo minus the humour.

Statistics: Posted by Cuchulainn — Today, 12:23 pm

]]>
2021-06-17T12:23:32 2021-06-17T12:23:32 https://forum.wilmott.com/viewtopic.php?t=92714&p=866758#p866758 <![CDATA[Off Topic • Re: Those equatorial Ecuadorians]]>

Statistics: Posted by ExSan — Today, 12:23 pm

]]>
2021-06-17T11:48:39 2021-06-17T11:48:39 https://forum.wilmott.com/viewtopic.php?t=101120&p=866757#p866757 <![CDATA[Off Topic • Re: Apocalypse (Revelation) Biblical explanation.]]> Statistics: Posted by Paul — Today, 11:48 am

]]>
2021-06-17T10:56:35 2021-06-17T10:56:35 https://forum.wilmott.com/viewtopic.php?t=101120&p=866756#p866756 <![CDATA[Off Topic • Re: Apocalypse (Revelation) Biblical explanation.]]>
God says no mercy for doggists.
Goodbye.
How often does God talk to you?
Statistics: Posted by Cuchulainn — Today, 10:56 am

]]>
2021-06-17T10:51:50 2021-06-17T10:51:50 https://forum.wilmott.com/viewtopic.php?t=101662&p=866755#p866755 <![CDATA[Numerical Methods Forum • Re: Asymptotic behaviour of ODE/PDE (large time)]]>
Let's skip the zillion fixes and patches for GD. Enough blogs already. One example however is

The learning rate is a hyperparameter that controls how much to change the model in response to the estimated error each time the model weights are updated. ... Momentum can accelerate training and learning rate schedules can help to converge the optimization process.

There is much ado about this, and a whole cottage industry has grown around it, e.g. grid search, ugh.

GD is really an FD scheme for a dissipative gradient ODE (Lagrange, Poincaré); the learning rate is the step size in the ODE solver. Equality and inequality constraints are easy (try that with GD..)
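A minimal sketch of that point (my example, with a hypothetical quadratic objective): gradient descent is exactly forward Euler applied to the gradient flow [$]dx/dt = -\nabla f(x)[$], with the learning rate as the step size:

```python
# Sketch (hypothetical quadratic objective): gradient descent is forward
# Euler on the gradient flow dx/dt = -grad f(x); the learning rate lr is
# exactly the ODE step size.
import numpy as np

def gd_as_euler(grad, x0, lr, steps):
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x - lr * grad(x)           # one forward Euler step
    return x

# f(x) = 0.5 x^T A x, grad f(x) = A x, minimum at the origin
A = np.array([[3.0, 0.0],
              [0.0, 1.0]])
xmin = gd_as_euler(lambda x: A @ x, x0=[1.0, 1.0], lr=0.1, steps=200)
print(np.linalg.norm(xmin) < 1e-6)     # True: stable since lr < 2 / max eig
```

The explicit-scheme stability bound lr < 2/λ_max is precisely the familiar "learning rate too large, training diverges" phenomenon.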

Statistics: Posted by Cuchulainn — Today, 10:51 am

]]>
2021-06-17T09:52:18 2021-06-17T09:52:18 https://forum.wilmott.com/viewtopic.php?t=101120&p=866754#p866754 <![CDATA[Off Topic • Re: Apocalypse (Revelation) Biblical explanation.]]>
www.youtube.com/watch?v=uvPxz5apXN8

Statistics: Posted by Cuchulainn — Today, 9:52 am

]]>
2021-06-17T09:13:27 2021-06-17T09:13:27 https://forum.wilmott.com/viewtopic.php?t=101120&p=866753#p866753 <![CDATA[Off Topic • Re: Apocalypse (Revelation) Biblical explanation.]]> Statistics: Posted by Paul — Today, 9:13 am

]]>
2021-06-17T05:39:14 2021-06-17T05:39:14 https://forum.wilmott.com/viewtopic.php?t=94796&p=866752#p866752 <![CDATA[Off Topic • Re: How to safeguard my research]]> Statistics: Posted by Amin — Today, 5:39 am

]]>
2021-06-17T05:37:09 2021-06-17T05:37:09 https://forum.wilmott.com/viewtopic.php?t=99702&p=866751#p866751 <![CDATA[Technical Forum • Re: Breakthrough in the theory of stochastic differential equations and their simulation]]> Statistics: Posted by Amin — Today, 5:37 am

]]>
2021-06-16T22:56:06 2021-06-16T22:56:06 https://forum.wilmott.com/viewtopic.php?t=92714&p=866750#p866750 <![CDATA[Off Topic • Re: Those equatorial Ecuadorians]]>
Fantastic story! United we stand, divided we fall: https://fablesofaesop.com/the-four-oxen ... -lion.html
Cute. They weren't Irish oxen.

Funny, but I was in Soweto and Sandton, Jo'burg on 16 June 2006, the 30th anniversary of the murder of Hector Pieterson in 1976 and the start of the Soweto riots.

The boycott against Apartheid was started in Dublin by working-class women in 1984. Of course, the middle-class liberals came on board when it became trendy to be anti-Apartheid.

https://en.wikipedia.org/wiki/Hector_Pieterson

https://www.youtube.com/watch?v=M6cBDusBiHE

Seamus Heaney @2.22. They used to hang poets in Zuid Afrika.

// and it was Bloomsday 16th June

Statistics: Posted by Cuchulainn — Yesterday, 10:56 pm

]]>