https://www.datacenterdynamics.com/en/n ... last-year/

Statistics: Posted by Cuchulainn — Today, 3:13 pm


"Philosophy is a battle against the bewitchment of our intelligence by means of language."

It's all falling apart.

The 70’s called. The connection wasn’t very good, but something about the benefits of punched cards.

Statistics: Posted by bearish — Today, 1:50 pm


It's all falling apart.

Statistics: Posted by Cuchulainn — Today, 1:08 pm


Friends, going into the year 2020, I made a lot of effort to find analytic methods for the evolution of stochastic differential equations, but I had very limited success.

I was later able to make some solid progress on the evolution of densities of SDEs with transition probabilities in early 2021. I borrowed high-order discretization schemes from earlier work on Monte Carlo simulations and inverted them to estimate the transition normal and its associated probabilities between grid cells on the first time grid and grid cells on the second time grid. I also accounted for the width of the originating grid cell using Taylor expansions, to make sure that the resulting estimate is a valid transition probability from one complete grid cell to another complete grid cell.
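As a rough illustration of the grid-to-grid transition idea (a sketch, not the author's program: a one-step Euler transition kernel is assumed, and each source cell is collapsed to a point mass at its center rather than Taylor-expanded over its width), here is a Python/NumPy version that advances a density one time step:

```python
import numpy as np
from math import erf, sqrt

def cell_masses(mean, std, edges):
    # probability mass of N(mean, std^2) falling in each cell of the grid
    cdf = np.array([0.5 * (1.0 + erf((e - mean) / (std * sqrt(2.0)))) for e in edges])
    return np.diff(cdf)

def advance_density(mass, centers, edges, mu, sigma, dt):
    # move the mass of each source cell through a one-step Euler transition
    # kernel N(x_i + mu(x_i) dt, sigma(x_i)^2 dt) and re-bin it on the grid
    new = np.zeros_like(mass)
    for m_i, x_i in zip(mass, centers):
        new += m_i * cell_masses(x_i + mu(x_i) * dt, sigma(x_i) * np.sqrt(dt), edges)
    return new

# sanity check: zero drift, unit diffusion, a gaussian start stays gaussian,
# so N(0, 0.5^2) advanced by dt = 0.09 should give N(0, 0.5^2 + 0.09)
edges = np.linspace(-6.0, 6.0, 241)
centers = 0.5 * (edges[:-1] + edges[1:])
mass0 = cell_masses(0.0, 0.5, edges)
mass1 = advance_density(mass0, centers, edges, lambda x: 0.0, lambda x: 1.0, 0.09)
exact = cell_masses(0.0, np.sqrt(0.5**2 + 0.09), edges)
```

The double loop is O(cells²) per step; the author's posts additionally correct for the finite width of the originating cell, which this sketch omits.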

Here in post 1129, made on Wed Apr 21, 2021 12:41 pm, I posted a good program that advances the densities of SDEs using the transition-probabilities framework. Here is the link to the post: https://forum.wilmott.com/viewtopic.php?f=4&t=99702&start=1125#p865466

For this program's dependencies, you will have to download code from 2-3 posts earlier. Friends who want to understand the context and theory fully should read the relevant posts between post 1059 and post 1129. Here is the web address of post 1059: https://forum.wilmott.com/viewtopic.php?f=4&t=99702&start=1050#p864474

After completing the work on transition-probability densities, I started playing with different topics. I posted a C++ program that constructs a limit order book from Nasdaq ITCH-format data; I had written a basic version of this program several years earlier. The C++ program was distributed in post 1162, where you can download it: https://forum.wilmott.com/viewtopic.php?f=4&t=99702&start=1155#p866534

Later I posted code for Monte Carlo simulation of SDEs with 8th- and 12th-order expansions of the SDEs. The original Monte Carlo simulation code I had written went only to fourth order and employed the fourth Hermite polynomial as the maximum-order polynomial. I wrote new functions that can simulate using 8th- or 12th-order expansions of the SDE at each time step. I thought these very high-order expansions would be helpful for friends who need very high accuracy for strongly non-linear SDEs. This code was distributed on Tue Jun 15, 2021 2:51 pm, in posts 1168 and 1169: https://forum.wilmott.com/viewtopic.php?f=4&t=99702&start=1155#p866726
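To make the "polynomial in the Gaussian increment per time step" idea concrete at low order, here is the standard Milstein scheme for geometric Brownian motion, where each step maps Z through a quadratic polynomial (this is a textbook scheme, not the author's 8th/12th-order expansion; all parameters are illustrative):

```python
import numpy as np

def milstein_gbm(x0, mu, sigma, T, steps, paths, seed=0):
    # each step maps the Gaussian increment Z through a quadratic polynomial:
    # X += mu X dt + sigma X sqrt(dt) Z + 0.5 sigma^2 X dt (Z^2 - 1)
    rng = np.random.default_rng(seed)
    dt = T / steps
    x = np.full(paths, x0, dtype=float)
    for _ in range(steps):
        z = rng.standard_normal(paths)
        x += mu * x * dt + sigma * x * np.sqrt(dt) * z \
             + 0.5 * sigma**2 * x * dt * (z * z - 1.0)
    return x

xT = milstein_gbm(1.0, 0.05, 0.2, 1.0, 100, 200_000)
# the exact terminal mean for GBM is exp(mu * T)
```

Higher-order schemes extend the per-step polynomial in Z to higher degree; the structure of the simulation loop stays the same.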

Later I found a simpler way to solve the Fokker-Planck equation. This is a new method based on the principle of conservation of probability mass: the grid expands in time so as to conserve the probability mass associated with each grid point, while simultaneously following the dynamics of the Fokker-Planck equation associated with the SDE. In other words, the grid expands along constant-CDF lines.

We model this method by setting the time derivative of the CDF integral up to a grid point equal to zero. This gives an equation in terms of the time derivative of the boundary grid point and the integral of the Fokker-Planck equation of the SDE. It turns out that the expression under the integral sign is a complete integral, leaving a first-order partial differential equation that determines the evolution of the constant-CDF points. I then make a change of probability density, converting the density of the SDE into the density of a standard normal together with the associated change-of-probability derivative. After simplification, the whole thing can be solved like a first-order ODE for each constant-probability point on the grid.
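For the simplest possible case this can be checked in closed form: setting [$]\frac{d}{dt} \int_{-\infty}^{x(t)} p(u,t)\,du = 0[$] for zero drift and unit diffusion from a [$]N(0,\sigma_0^2)[$] start gives the ODE [$]\frac{dx}{dt} = \frac{x}{2(\sigma_0^2+t)}[$], whose solution [$]x(t) = x(0)\sqrt{(\sigma_0^2+t)/\sigma_0^2}[$] indeed tracks constant quantiles of [$]N(0,\sigma_0^2+t)[$]. A minimal Python check of this special case (plain Euler time-stepping; illustrative only, not the author's general program):

```python
import numpy as np

def evolve_constant_cdf_points(x0, sigma0, T, steps):
    # for dX = dW from a N(0, sigma0^2) start, the density stays N(0, sigma0^2 + t)
    # and the constant-CDF ODE reduces to dx/dt = x / (2 (sigma0^2 + t))
    dt = T / steps
    x = np.array(x0, dtype=float)
    t = 0.0
    for _ in range(steps):
        x += dt * x / (2.0 * (sigma0**2 + t))
        t += dt
    return x

quantile_pts = np.array([-1.5, -0.5, 0.5, 1.5])   # initial constant-CDF grid points
num = evolve_constant_cdf_points(quantile_pts, 1.0, 1.0, 10_000)
exact = quantile_pts * np.sqrt((1.0**2 + 1.0) / 1.0**2)
```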

You can download the program and details from posts around post 1198, written on Wed Jul 28, 2021 5:49 pm: https://forum.wilmott.com/viewtopic.php?f=4&t=99702&start=1185#p867692

A more detailed explanation is in post 1202: https://forum.wilmott.com/viewtopic.php?f=4&t=99702&start=1200#p867718

This was the first time I conceived the idea of a Z-series, and it worked very well to represent the random variables associated with SDEs.

Suppose we have a random variable X that belongs to the very large class of distributions that can be represented as a power series in a standard normal random variable Z. We can then represent the random variable X as

[$]X \, = \, a_0 \, + \, a_1 \, Z \, + \, a_2 \, Z^2 \, +\, a_3 \, Z^3 \, +\, a_4 \, Z^4 \, +\, a_5 \, Z^5 \, +\, a_6 \, Z^6 \, + \, \ldots [$]

We also want the reader to know that for every Z-series there is an equivalent orthogonal polynomial series with different coefficients. We know the relationships between the coefficients, so we can easily convert a Z-series to a Hermite polynomial series and vice versa. Here is a general representation of a Hermite polynomial series:

[$]X \, = \, \mu \, + \, c_1 \, H_1(Z) \, + \, c_2 \, H_2(Z) \, +\, \, c_3 \, H_3(Z) \, +\, c_4 \, H_4(Z) \, +\, c_5 \, H_5(Z) \, +\, c_6 \, H_6(Z) \, + \, \ldots [$]

For a given random variable, we know how to convert from the coefficients [$]a_n[$] of the Z-series to the coefficients [$]c_n[$] of the Hermite polynomial series and vice versa.
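Assuming the [$]H_n[$] above are the probabilists' Hermite polynomials, this conversion is an ordinary change of polynomial basis, and NumPy ships both directions, so a round trip can be checked directly:

```python
import numpy as np
from numpy.polynomial.hermite_e import poly2herme, herme2poly

# Z-series coefficients a_n of X = a0 + a1 Z + a2 Z^2 + a3 Z^3 (illustrative values)
a = np.array([0.1, 1.0, 0.05, 0.02])

c = poly2herme(a)        # coefficients c_n of the He_n(Z) (probabilists' Hermite) series
a_back = herme2poly(c)   # converting back recovers the Z-series coefficients

# small concrete case: Z^2 = He_2(Z) + 1, so [0, 0, 1] maps to [1, 0, 1]
```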

Here in post 1272, written on Thu Nov 11, 2021 7:19 pm, I conceived the idea of a semi-analytic solution of the Fokker-Planck equation: insert a series in powers of a standard normal random variable with undetermined coefficients, and then find those coefficients by matching the coefficients of each power of Z on the left- and right-hand sides of the equation. Here I described the idea: https://forum.wilmott.com/viewtopic.php?f=4&t=99702&start=1260#p868341

This was the first time I used Z-series in the context of stochastic differential equations. You can download a program in post 1288. You will need to look at posts 1272-1296 for the development of this topic; especially read post 1291 and later posts: https://forum.wilmott.com/viewtopic.php?f=4&t=99702&start=1290#p868483

Later I had the idea of using the Z-series representation of SDE random variables independently of the Fokker-Planck equation, and decided to use addition of cumulants to find the analytic evolution of SDEs without any pseudo-random numbers.

Later, in January 2022, I proposed to directly represent both the initial distribution of the SDE at each time step and the distribution of its diffusion over that step as independent Z-series random variables. Since in Bessel coordinates the evolution of the SDE is almost independent of its initial value, I could use cumulant addition to advance the Z-series representation of the densities of SDEs. You can read the relevant posts 1338, 1339, 1343, 1345, 1349, 1378, 1629 and 1640. Here is post 1640, where a final program is distributed: https://forum.wilmott.com/viewtopic.php?f=4&t=99702&start=1635#p873577

Please note that for mean-reverting and many other types of SDEs there is still some dependence between the initial distribution of the SDE and the distribution of its generator, even in Bessel coordinates, but the method worked perfectly well over large enough time steps.
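The cumulant-addition step rests on a standard fact: cumulants of independent random variables add. A small Python sketch (first four cumulants only; the moment-cumulant formulas are the classical ones) checks this for two independent Gaussians:

```python
import numpy as np

def moments_to_cumulants(m1, m2, m3, m4):
    # classical raw-moment -> cumulant formulas, first four orders
    k1 = m1
    k2 = m2 - m1**2
    k3 = m3 - 3*m1*m2 + 2*m1**3
    k4 = m4 - 4*m1*m3 - 3*m2**2 + 12*m1**2*m2 - 6*m1**4
    return np.array([k1, k2, k3, k4])

def cumulants_to_moments(k1, k2, k3, k4):
    # inverse map: cumulants -> raw moments
    m1 = k1
    m2 = k2 + k1**2
    m3 = k3 + 3*k1*k2 + k1**3
    m4 = k4 + 4*k1*k3 + 3*k2**2 + 6*k1**2*k2 + k1**4
    return np.array([m1, m2, m3, m4])

def normal_moments(mu, s):
    # raw moments of N(mu, s^2)
    return np.array([mu, mu**2 + s**2, mu**3 + 3*mu*s**2,
                     mu**4 + 6*mu**2*s**2 + 3*s**4])

# X ~ N(2, 3^2) and Y ~ N(1, 2^2) independent, so X + Y ~ N(3, 13)
k_sum = (moments_to_cumulants(*normal_moments(2, 3))
         + moments_to_cumulants(*normal_moments(1, 2)))
m_sum = cumulants_to_moments(*k_sum)
```

The author's programs carry this to higher order with Z-series coefficients; the additivity being exploited is the same.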

You can read this post for the basic similarity between the Taylor series and the method of iterated integrals used to find solutions of ODEs and SDEs: https://forum.wilmott.com/viewtopic.php?f=4&t=99702&start=1605#p873291

Most densities with continuous derivatives and vanishing tails can be represented in terms of a Z-series or, alternatively, a series in Hermite orthogonal polynomials. This is true for the lognormal density, the generalized gamma density and its various variants, and the chi-squared density and its variants.

Here is a post on finding the Z-series representation of lognormal random variables: https://forum.wilmott.com/viewtopic.php?f=4&t=99702&start=1635#p873551

This method generates densities of random variables so that the first eight moments of the density match the first eight moments input to the method. In most cases, the match of the densities is essentially exact. In this method, we find coefficients of the Z-series or Hermite orthogonal polynomial series so that the moments of the resulting density precisely match the input moments. Though Edgeworth and Gram-Charlier type methods have been very useful, this new method is a very significant improvement upon the classic methods for constructing densities from given moments.

Here I distributed a program that finds the Z-series representation of a random variable when its first eight moments are input to the program. It is in post 1653: https://forum.wilmott.com/viewtopic.php?f=4&t=99702&start=1650#p873803

Please also read post 1662, where I have written a small program that analytically constructs the generalized gamma density. The first eight moments of the generalized gamma density are input to our Z-series density-construction program. As it turns out, when graphed on the same axes, the newly constructed Z-series density is indistinguishable from the analytic graph of the generalized gamma density. Post 1662: https://forum.wilmott.com/viewtopic.php?f=4&t=99702&start=1650#p873846
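The forward half of such a construction, computing raw moments of a Z-series random variable, is mechanical, because E[Zⁿ] is 0 for odd n and (n−1)!! for even n; moment matching then amounts to inverting this map numerically. A Python sketch (a four-moment toy, not the author's eight-moment program; the starting guess and solver choice are my own illustrative assumptions):

```python
import numpy as np
from scipy.optimize import least_squares

def standard_normal_moments(n):
    # E[Z^k] for k = 0..n: zero for odd k, double factorial (k-1)!! for even k
    m = np.zeros(n + 1)
    m[0] = 1.0
    for k in range(2, n + 1, 2):
        m[k] = m[k - 2] * (k - 1)
    return m

def zseries_moments(a, num_moments):
    # raw moments E[X^k] of X = sum_j a_j Z^j via polynomial powers in Z
    em = standard_normal_moments((len(a) - 1) * num_moments)
    p = np.array([1.0])
    out = []
    for _ in range(num_moments):
        p = np.convolve(p, a)
        out.append(np.dot(p, em[:len(p)]))
    return np.array(out)

# forward check: a plain normal a0 + a1 Z has the textbook normal moments
assert np.allclose(zseries_moments([2.0, 3.0], 4), [2.0, 13.0, 62.0, 475.0])

# moment matching: recover coefficients whose moments equal a given target
a_true = np.array([0.1, 1.0, 0.08, 0.02])
target = zseries_moments(a_true, 4)
fit = least_squares(lambda a: zseries_moments(a, 4) - target,
                    x0=[target[0], 1.0, 0.0, 0.0])
```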

Later I did a Monte Carlo simulation of a system of asset and SV SDEs and found the Z-series representation of the asset density from the asset moments calculated in the Monte Carlo. Posts 1685, 1686 and 1688: https://forum.wilmott.com/viewtopic.php?f=4&t=99702&start=1680#p874055

This Monte Carlo program led me to think about calculating correlations between orthogonal Hermite polynomials of the same degree across two correlated variables. I calculated correlations between Hermite polynomials of the same order across two correlated variables in a Monte Carlo setting in a MATLAB program, in posts 1691 and 1693: https://forum.wilmott.com/viewtopic.php?f=4&t=99702&start=1680#p874075

This led me to think that the prevalent classic calculation of the Pearson correlation between two random variables is highly suboptimal when the densities of the variables are non-Gaussian; it is optimal only for perfectly Gaussian random variables. In general, the greater the deviation from Gaussianity in the two random variables, the more suboptimal the classic Pearson correlation calculation becomes. I found in my research that a proper way to calculate correlations between two non-Gaussian random variables is to represent each as a series in Hermite orthogonal polynomials and then find correlations independently between Hermite polynomials of the same order across the two variables. This discovery means that most of the classic models and numerical methods in stochastics and time series that are based on calculation of correlations or covariances between random variables have to be altered accordingly: one gross Pearson correlation estimate is replaced by multiple estimates of correlations between Hermite polynomials of the same order across the variables whenever the random variables under consideration are non-Gaussian.
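One concrete benchmark for these per-order correlations: if the two variables are jointly Gaussian with Pearson correlation ρ, a classical identity gives Corr(He_n(X), He_n(Y)) = ρⁿ, so departures from that geometric pattern are one way to see non-Gaussian dependence. A Monte Carlo sketch in Python (sample size and ρ are illustrative):

```python
import numpy as np
from numpy.polynomial.hermite_e import hermeval

rng = np.random.default_rng(0)
rho, n = 0.7, 500_000

# jointly gaussian pair with correlation rho
z1 = rng.standard_normal(n)
z2 = rho * z1 + np.sqrt(1 - rho**2) * rng.standard_normal(n)

per_order = []
for k in range(1, 4):
    c = np.zeros(k + 1)
    c[k] = 1.0                          # coefficient vector selecting He_k
    hk1, hk2 = hermeval(z1, c), hermeval(z2, c)
    per_order.append(np.corrcoef(hk1, hk2)[0, 1])
# in the jointly gaussian case per_order[k-1] should sit near rho**k
```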

The observation that Pearson correlations have to be calculated independently for each Hermite polynomial order led to a new, ground-breaking concept: analytic Hermite polynomial regression between two or more non-Gaussian variables. This orthogonal regression breaks every non-Gaussian random variable into a series in Hermite polynomials; Hermite polynomials of each order are then regressed on each other independently, according to their own Pearson correlations, and the results of the various per-order regressions are added to give high-dimensional curved surfaces with smooth derivatives that fit the regression data. As opposed to the straight lines that fit the variables in linear regression, Hermite orthogonal regression uses polynomials to fit the data, and the fit takes the form of high-dimensional, analytic, curved surfaces with smooth derivatives. Please note that the Hermite orthogonal regression I am referring to is a very new discovery that I made in February 2023; it is different from, far better than, and unrelated to the classic method known as Hermite polynomial regression, which has been known for decades. This new method is a strong competitor to, and usually better than, most AI and machine-learning regression methods that fit a curve to data. Whereas classic regression depends on the correlation between first Hermite polynomials, the new method generalizes it to a regression that depends on correlations between orthogonal Hermite polynomials of the same order across different variables.
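A loose sketch of the order-by-order idea (my reading, not the author's exact construction): with a roughly Gaussian standardized regressor, He_j(x) and He_k(x) are nearly uncorrelated for j ≠ k, so each order's coefficient can be estimated by its own one-dimensional projection and the per-order fits summed into one curved fit:

```python
import numpy as np
from numpy.polynomial.hermite_e import hermeval

def hermite_orthogonal_fit(x, y, max_order=3):
    # standardize the regressor; project y on each Hermite order separately
    xs = (x - x.mean()) / x.std()
    yc = y - y.mean()
    betas = []
    for k in range(1, max_order + 1):
        c = np.zeros(k + 1)
        c[k] = 1.0
        hk = hermeval(xs, c)                 # He_k of the standardized x
        betas.append(hk @ yc / (hk @ hk))    # per-order projection coefficient
    return y.mean(), betas

def hermite_orthogonal_predict(x_fit, intercept, betas, x_new):
    # sum of the per-order fits gives the curved regression surface
    xs = (x_new - x_fit.mean()) / x_fit.std()
    yhat = np.full_like(xs, intercept, dtype=float)
    for k, b in enumerate(betas, start=1):
        c = np.zeros(k + 1)
        c[k] = 1.0
        yhat += b * hermeval(xs, c)
    return yhat

# smoke test on synthetic data: y = 1 + 2 He_1(x) + 0.5 He_2(x) + noise
rng = np.random.default_rng(1)
x = rng.standard_normal(100_000)
y = 1 + 2 * x + 0.5 * (x**2 - 1) + 0.1 * rng.standard_normal(100_000)
intercept, betas = hermite_orthogonal_fit(x, y, max_order=3)
yhat = hermite_orthogonal_predict(x, intercept, betas, x)
```

For a Gaussian regressor this coincides with ordinary least squares in the Hermite basis; the author's method extends the idea to non-Gaussian variables via their Z-series representations.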

Below I show three graphs of Hermite orthogonal regressions that try to explain Apple stock returns versus Microsoft and Nvidia stock returns. All three graphs use only nine data observations over non-overlapping periods.

The next three posts, 1791, 1792 and 1793, describe the calculation of correlations, covariances and regressions between non-Gaussian random variables when their Z-series representations are given.

https://forum.wilmott.com/viewtopic.php?f=4&t=99702&start=1770#p875136

https://forum.wilmott.com/viewtopic.php?f=4&t=99702&start=1785#p875153

https://forum.wilmott.com/viewtopic.php?f=4&t=99702&start=1785#p875155

The last post in particular analytically explains the concept of regression between two Z-series random variables.

Please also read posts 1731, 1732 and 1734 on Hermite polynomial correlations and regressions:

https://forum.wilmott.com/viewtopic.php?f=4&t=99702&start=1710#p874492

https://forum.wilmott.com/viewtopic.php?f=4&t=99702&start=1725#p874542

https://forum.wilmott.com/viewtopic.php?f=4&t=99702&start=1725#p874578

Here I distributed a program for multivariate Hermite regression, in post 1722: https://forum.wilmott.com/viewtopic.php?f=4&t=99702&start=1710#p874497

I presented results of one-dimensional Hermite regression in post 1705 here: https://forum.wilmott.com/viewtopic.php?f=4&t=99702&start=1680#p874075

The hermite regression program is given in post 1706 here: https://forum.wilmott.com/viewtopic.php?f=4&t=99702&start=1695#p874270

I also learnt that, in order to do robust Hermite orthogonal regressions, we needed to apply a positivity condition on the derivatives of the density. I have described this positivity condition in post 1719: https://forum.wilmott.com/viewtopic.php?f=4&t=99702&start=1710#p874464

After proper application of the positivity condition, the Hermite regressions worked in an excellent fashion. I showed some results in post 1721: https://forum.wilmott.com/viewtopic.php?f=4&t=99702&start=1710#p874492

Please read posts 1783 and 1784 for the calculation of the variance of a sum of two non-Gaussian random variables whose Hermite polynomials are correlated:

https://forum.wilmott.com/viewtopic.php?f=4&t=99702&start=1770#p875135

https://forum.wilmott.com/viewtopic.php?f=4&t=99702&start=1770#p875136
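For jointly Gaussian drivers the bookkeeping behind such a variance-of-sum calculation closes in a few lines: cross-order covariances vanish, Var(He_k) = k!, and Cov(He_k(Z1), He_k(Z2)) = ρᵏ k!. A Monte Carlo sketch with illustrative coefficients (the posts may treat more general per-order correlations; this only checks the Gaussian-driver case):

```python
import numpy as np
from numpy.polynomial.hermite_e import hermeval

rng = np.random.default_rng(2)
rho, n = 0.6, 400_000
z1 = rng.standard_normal(n)
z2 = rho * z1 + np.sqrt(1 - rho**2) * rng.standard_normal(n)

c = np.array([0.0, 1.0, 0.3, 0.1])    # X as a He series in z1 (illustrative)
d = np.array([0.0, 0.8, 0.2, 0.05])   # Y as a He series in z2 (illustrative)
x = hermeval(z1, c)
y = hermeval(z2, d)

# per-order formula: cross-order terms vanish for jointly gaussian drivers,
# Var(He_k) = k! and Cov(He_k(z1), He_k(z2)) = rho^k k!
fact = np.array([1.0, 1.0, 2.0, 6.0])
var_formula = sum(fact[k] * (c[k]**2 + d[k]**2 + 2 * c[k] * d[k] * rho**k)
                  for k in range(1, 4))
var_mc = np.var(x + y)
```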

Please disregard post 1785; it has some errors.

A Z-series option pricing formula is given in post 1830 here: https://forum.wilmott.com/viewtopic.php?f=4&t=99702&start=1815#p875619

Statistics: Posted by Amin — Today, 12:04 pm


From this, the next question is: if we wanted to do a long-dated bond forward, for instance, what repo rate would we use? Simply flat extrapolation from short-dated observable repo rates?

Thanks for reading.

Statistics: Posted by woodsdevil — Today, 12:31 am


Comparison to Standard Prompting

With

+ toy examples..

Statistics: Posted by Cuchulainn — Yesterday, 6:48 pm


Requirements engineering for dummies?

Just a first impression. Nothing new under the sun. Ecclesiastes 1:9.

Statistics: Posted by Cuchulainn — Yesterday, 6:29 pm


what's the issue? Can you post your code?

BTW what's prompt engineering?

Seems Julia and PDE go well together.

https://www.deeplearning.ai/short-cours ... evelopers/

Statistics: Posted by bearish — Yesterday, 1:40 pm


what's the issue? Can you post your code?

BTW what's prompt engineering?

Seems Julia and PDE go well together.

Statistics: Posted by Cuchulainn — Yesterday, 9:57 am


I have not presented the proof of those formulas, but I will give friends an idea of how to solve those iterated stochastic integrals. It is very embarrassing to admit, but when I started out to solve those integrals, I very naively thought that in iterated stochastic integrals with repeated dz(t) and dt, the integrals commute. Only later, when I simulated the lognormal and other SDEs to higher order, did I realize that these integrals do not commute.

Here in post 689, I told friends that I had made an error: my earlier claim that stochastic integrals commute, and the calculations resting on it, were wrong.

https://forum.wilmott.com/viewtopic.php?f=4&t=99702&start=675#p826905

Then I went to the blackboard again and found that my method of evaluating dz-integrals with an Ito-isometry-like formula using variances was right, but my treatment of stochastic integrals that ended with dt was wrong.

I wrote this post four days later and gave the correct formulas. Post 690 here: https://forum.wilmott.com/viewtopic.php?f=4&t=99702&start=675#p827224

For reference, here is the original post 32, where I claimed that stochastic integrals should commute. The method suggested in that post works only for dz-integrals, not for dt-integrals. Here is the original post: https://forum.wilmott.com/viewtopic.php?f=4&t=99702&start=30#p782089

Sometime after writing that post, I started working on ODEs and stopped the work on SDEs, which I restarted later. I was forcefully detained several times during that period.

The dt-integrals that could not be directly evaluated had to first be converted to dz-integrals, as below; then we could use Ito-isometry-like formulas to solve them and get exact results.

We start from the integration-by-parts identity [$]\int_0^{t} z(s) \, ds \, =\,\int_0^{t} \, d[\, s \, z(s)]\, - \int_0^{t} \, s \, dz(s) \, [$]

and I applied the same identity to Hermite polynomials as

[$]\, \int_0^{t} H_n(z(s)) \, ds \, =\,\int_0^{t} \, d[\, s \, H_n(z(s))]\, - \int_0^{t} \, s \, dH_n(z(s)) \, [$]

where the second integral on the RHS can be solved with the Ito-isometry-like formula. So I had to replace dt-integrals with [$]dz(t)[$] or [$]dH_n(z(t))[$] integrals, which could be solved easily.

Using the recursions given in post 693, you can solve a very large class of stochastic integrals by representing arbitrary polynomial expressions of z(t) in terms of Hermite polynomials and then applying the recursions.
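The integration-by-parts identity itself can be checked pathwise on a discretized Brownian path; with the right-endpoint convention in the dz-sum, the discrete summation-by-parts version holds exactly (a quick Python sanity check, not part of the original posts):

```python
import numpy as np

rng = np.random.default_rng(3)
T, steps = 1.0, 100_000
dt = T / steps
t = np.linspace(0.0, T, steps + 1)

# discretized brownian path z with z(0) = 0
dz = np.sqrt(dt) * rng.standard_normal(steps)
z = np.concatenate([[0.0], np.cumsum(dz)])

lhs = np.sum(z[:-1] * dt)             # left-point Riemann sum of ∫ z(s) ds
rhs = T * z[-1] - np.sum(t[1:] * dz)  # t z(t) - Σ s_{i+1} Δz_i (right endpoint)
# discrete summation by parts makes these equal up to floating-point error
```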

Here in post 697, I have given a toy example of how to solve the integral [$]\, \int_0^t \, z(s)^4 \, ds[$]. Following the logic in that example, you can easily solve a very large number of stochastic integrals: https://forum.wilmott.com/viewtopic.php?f=4&t=99702&start=690#p827488

I recall reading a scholarly paper in which the author had calculated the above integral [$]\, \int_0^t \, z(s)^4 \, ds[$] with great difficulty, but you can solve it very quickly by representing it in Hermite polynomial form and then applying the stochastic-integral solution recursions of post 697.
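As an independent sanity check on that toy integral: pointwise, [$]z^4 = H_4(z) + 6 H_2(z) + 3[$] (probabilists' Hermite polynomials), which is the representation the recursions act on; and since [$]E[z(s)^4] = 3 s^2[$] for Brownian motion, the mean of [$]\int_0^t z(s)^4 \, ds[$] must equal [$]t^3[$]. A small Monte Carlo (path count and step size are illustrative) reproduces this:

```python
import numpy as np

rng = np.random.default_rng(4)
paths, steps, T = 20_000, 200, 1.0
dt = T / steps

# brownian paths sampled at s_1, ..., s_N
dz = np.sqrt(dt) * rng.standard_normal((paths, steps))
z = np.cumsum(dz, axis=1)

# pathwise Riemann sums of ∫_0^T z(s)^4 ds
integral = (z**4).sum(axis=1) * dt

# E[z(s)^4] = 3 s^2, so the mean of the integral should be T^3 = 1
mean_mc = integral.mean()
```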

Another thing I want to mention: the iterated-integral formula with repeated application of Ito's lemma is completely general and can be applied to any univariate or multivariate SDE or system of SDEs, but you have to generalize it according to the problem. For example, in a stochastic-volatility setting, a single term can contain two different variables: the volatility term of the asset can contain a power of the asset and also a power of the volatility. In such cases, you apply the Ito product rule instead of the Ito change-of-variable formula; successive application of the product rule converts the original SDE into a large number of terms with constant integrands evaluated at the initial time, and then all we need is to solve the appropriate stochastic integrals. It is a bit tedious but straightforward.

I was not going to write a full-fledged stochastic-volatility program, thinking most people would do it on their own after looking at my research on strictly one-dimensional SDEs. I wrote the stochastic-volatility program only because I needed a correct reference program that can be used to find the true distribution of the SV SDEs, so that I could compare results from my experiments with other analytic methods that do not use any pseudo-random numbers.

Again here is the link to general Stochastic volatility SDEs program: https://forum.wilmott.com/viewtopic.php?f=4&t=99702&start=1200#p867829

Statistics: Posted by Amin — Yesterday, 9:49 am


Statistics: Posted by bearish — Yesterday, 3:19 am
