Cuchulainn
Posts: 20256
Joined: July 16th, 2004, 7:38 am
Location: 20, 000

Re: Bivariate Normal Integral paper

March 26th, 2017, 1:07 pm

Instead of a PDE solution, we can examine the dual problem and integrate with various versions of Romberg integration as a 2d loop. This works, and shows that the integrand is well-behaved.

Looking at 26.3.3, we can see that the inner integral can be written directly in terms of C++11 erf(x). Then in the other direction we can use Romberg/midpoint to get accuracy. In this case we don't need to build a matrix, and C++11 takes care of all the edge cases in erf(x) and exp(x). Just in case, you can add Kahan summation.

http://people.math.sfu.ca/~cbm/aands/page_936.htm
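A minimal sketch of the idea (the function names and the truncation point -8 are my own choices, not from the thread): the inner integral of 26.3.3 collapses to the univariate normal CDF written with std::erfc, the outer direction is composite midpoint on the truncated domain, and the loop accumulates with Kahan compensation as the optional safeguard mentioned above.

```cpp
#include <cmath>

// Phi(x) via C++11 std::erfc (erfc handles the far-tail edge cases).
double normalCdf(double x)
{
    return 0.5 * std::erfc(-x / std::sqrt(2.0));
}

// Sketch of P(X <= a, Y <= b) for the standard bivariate normal with
// correlation rho. The inner integral of A&S 26.3.3 reduces to
// Phi((b - rho*x) / sqrt(1 - rho^2)); the outer integral is composite
// midpoint on [-8, a] (truncation point is an assumption), with Kahan
// summation of the terms.
double bivariateNormalCdf(double a, double b, double rho, int N)
{
    const double pi = 3.141592653589793;
    const double s = std::sqrt(1.0 - rho * rho);
    const double lo = -8.0;                 // domain truncation
    const double h = (a - lo) / N;

    double sum = 0.0, c = 0.0;              // Kahan accumulator + compensation
    for (int i = 0; i < N; ++i)
    {
        double x = lo + (i + 0.5) * h;      // midpoint of cell i
        double phi = std::exp(-0.5 * x * x) / std::sqrt(2.0 * pi);
        double term = phi * normalCdf((b - rho * x) / s);

        double y = term - c;                // compensated summation
        double t = sum + y;
        c = (t - sum) - y;
        sum = t;
    }
    return sum * h;
}
```

With zero correlation the result factorises into a product of univariate CDFs, which gives an easy sanity check.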

Some results, starting with a modest N = 10, are below (the results have seven digits of accuracy without too much effort). N = 50, 100, 400 are stress tests.

So we can get accurate results directly in C++11 without external libraries. It's another option.

a, b, rho: 1.93185,1.78999, -0.411002
*Genz West                  : 0.9366216225051415
*Tanh 2d Extrap/Adaptive    : 0.9366219402518232
*Midpoint 2d                : 0.9366238179854843
*Tanh 2d 100X100            : 0.9367276920000295
*A&S 26.3.3                 : 0.9366704332207069
 
*Genz QuantLib 1.8          : 0.9366216225051415
*A&S 26.3.3 Extrap (N=10)   : 0.9366210429626975
*A&S 26.3.3 Extrap (N=50)   : 0.9366216224739409
*A&S 26.3.3 Extrap (N=100)  : 0.9366216225046546
*A&S 26.3.3 Extrap (N=400)  : 0.9366216225051428
 
a, b, rho: 1.337236842557504,1.264162992646285, 0.486393246573539
*Genz West                  : 0.8363779098801795
*Tanh 2d Extrap/Adaptive    : 0.8363726930947498
*Midpoint 2d                : 0.836377419248175
*Tanh 2d 100X100            : 0.8365230433697367
*A&S 26.3.3                 : 0.8364483519381961
 
*Genz QuantLib 1.8          : 0.8363779098801796
*A&S 26.3.3 Extrap (N=10)   : 0.8363776712097363
*A&S 26.3.3 Extrap (N=50)   : 0.836377909871765
*A&S 26.3.3 Extrap (N=100)  : 0.8363779098800628
*A&S 26.3.3 Extrap (N=400)  : 0.8363779098801919
 
a, b, rho: 1.955091230607987,0.941464704072475, -0.4825543785261079
*Genz West                  : 0.80173352113734
*Tanh 2d Extrap/Adaptive    : 0.8017309528547933
*Midpoint 2d                : 0.8017282455937149
*Tanh 2d 100X100            : 0.8018595575746997
*A&S 26.3.3                 : 0.8017798723119439
 
*Genz QuantLib 1.8          : 0.80173352113734
*A&S 26.3.3 Extrap (N=10)   : 0.8017329230670884
*A&S 26.3.3 Extrap (N=50)   : 0.8017335211054371
*A&S 26.3.3 Extrap (N=100)  : 0.8017335211368406
*A&S 26.3.3 Extrap (N=400)  : 0.8017335211373599
 
a, b, rho: 1.860920640238858,0.3513236324632962, -0.09898276912896131
*Genz West                  : 0.614801617049458
*Tanh 2d Extrap/Adaptive    : 0.6147996574002191
*Midpoint 2d                : 0.6147982757717783
*Tanh 2d 100X100            : 0.6148746733453729
*A&S 26.3.3                 : 0.6148381569839125
 
*Genz QuantLib 1.8          : 0.6148016170494581
*A&S 26.3.3 Extrap (N=10)   : 0.6148012461785963
*A&S 26.3.3 Extrap (N=50)   : 0.6148016170304943
*A&S 26.3.3 Extrap (N=100)  : 0.6148016170491677
 
Cuchulainn

Re: Bivariate Normal Integral paper

March 29th, 2017, 11:29 am

One more thing...

Speed Test
 
a, b, rho, N: 1.19473, 0.10332, -0.666994, 814657
Genz QL: 1.02002
Genz West: 1.19002
26.3.3: 5.40011
 
a, b, rho, N: 0.167764, -1.08659, 0.436802, 301351
Genz QL: 0.450009
Genz West: 0.430009
26.3.3: 1.31303
 
a, b, rho, N: 0.371799, 0.321643, 0.20042, 417296
Genz QL: 0.334011
Genz West: 0.350007
26.3.3: 1.53003
 
a, b, rho, N: -0.386024, -0.825493, 0.544039, 769328
Genz QL: 0.98002
Genz West: 1.11002
26.3.3: 4.56005
 
 
Cuchulainn

Re: Bivariate Normal Integral paper

March 31st, 2017, 1:54 pm

For 26.3.3, using 20-point Gauss-Legendre is very accurate and performs much better than Romberg. It is [1.1, 3] times slower than QuantLib Genz. The performance cannot be improved much, since we are using std::erf(x) as a black box; we don't know how it is implemented.
It's always a trade-off: tweak ad infinitum or choose a general, robust method.
Fin.
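A hedged sketch of the Gauss-Legendre variant (names and the truncation at -8 are my assumptions; rather than hard-coding a 20-point table, the nodes and weights are generated by Newton iteration on the Legendre polynomial, the classic "gauleg" construction):

```cpp
#include <cmath>
#include <vector>

// Phi(x) via std::erfc.
double normalCdf(double x)
{
    return 0.5 * std::erfc(-x / std::sqrt(2.0));
}

// Gauss-Legendre nodes/weights on [-1, 1] by Newton iteration on P_n,
// so no hard-coded table is needed.
void gaussLegendre(int n, std::vector<double>& x, std::vector<double>& w)
{
    const double pi = 3.141592653589793;
    x.assign(n, 0.0); w.assign(n, 0.0);
    int m = (n + 1) / 2;
    for (int i = 0; i < m; ++i)
    {
        double z = std::cos(pi * (i + 0.75) / (n + 0.5)); // initial root guess
        double z1, pp;
        do
        {   // three-term recurrence for P_n(z), then one Newton step
            double p0 = 1.0, p1 = 0.0;
            for (int j = 0; j < n; ++j)
            {
                double p2 = p1; p1 = p0;
                p0 = ((2.0 * j + 1.0) * z * p1 - j * p2) / (j + 1.0);
            }
            pp = n * (z * p0 - p1) / (z * z - 1.0);       // P_n'(z)
            z1 = z; z = z1 - p0 / pp;
        } while (std::fabs(z - z1) > 1e-14);
        x[i] = -z; x[n - 1 - i] = z;
        w[i] = w[n - 1 - i] = 2.0 / ((1.0 - z * z) * pp * pp);
    }
}

// Outer integral of the 26.3.3 reduction with n-point Gauss-Legendre on
// the truncated interval [-8, a] (truncation point is an assumption).
double bivariateNormalCdfGL(double a, double b, double rho, int n = 20)
{
    const double pi = 3.141592653589793;
    const double s = std::sqrt(1.0 - rho * rho);
    const double lo = -8.0;
    std::vector<double> x, w;
    gaussLegendre(n, x, w);
    double mid = 0.5 * (a + lo), half = 0.5 * (a - lo), sum = 0.0;
    for (int i = 0; i < n; ++i)
    {   // map node from [-1, 1] to [lo, a]
        double t = mid + half * x[i];
        double phi = std::exp(-0.5 * t * t) / std::sqrt(2.0 * pi);
        sum += w[i] * phi * normalCdf((b - rho * t) / s);
    }
    return sum * half;
}
```

Since the integrand is smooth, 20 nodes are enough for far more digits than the midpoint rule at comparable cost.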
 
AVt
Posts: 90
Joined: December 29th, 2001, 8:23 pm

Re: Bivariate Normal Integral paper

April 2nd, 2017, 7:46 pm

I looked at the example in post #106, a, b, rho: 1.93185,1.78999, -0.411002 and get 0.936622085501903 (15 decimals)

PS: What time units are meant in the speed test in post # 107?
 
Cuchulainn

Re: Bivariate Normal Integral paper

April 3rd, 2017, 8:13 am

AVt wrote:
I looked at the example in post #106, a, b, rho: 1.93185,1.78999, -0.411002 and get 0.936622085501903 (15 decimals)

PS: What time units are meant in the speed test in post # 107?
I ran the experiment again with these precise (truncated) numbers, plus some extra methods for comparison, and a performance test of 10^7 trials (time is in seconds).

What I particularly like is 12-digit accuracy for 26.3.3 using 20-point GL and standard C++11 (no extra files or libraries needed). I did 10-point GL as well; it has lower accuracy, as expected, but it will be faster.

a, b, rho: 1.93185,1.78999, -0.411002
*Goursat Classico 200x200   : 0.9366486100496062
*Goursat Extrap             : 0.9366220856907832
*Genz West                  : 0.9366220855019028
*Drezner 1978 Quantlib 1.8  : 0.9366220580788491
*A&S 26.3.3 Extrap (N=10)   : 0.936621505959591
*A&S 26.3.3 Extrap (N=950)  : 0.9366220855018679
*Genz QuantLib 1.8          : 0.9366220855019028
*A&S 26.3.3 Gauss Legendre  : 0.9366220855178885
 
 
Speed Test (in seconds)
 
a, b, rho; NTrials: 1.93185, 1.78999, -0.411002; 10000000
Genz QL: 15.0989
Genz West: 13.9828
26.3.3 GauLeg: 31.6778
26.3.3 Midpt Extrap N = 10: 92.4843
 
a, b, rho; NTrials: 1.93185, 1.78999, -0.411002; 10000000
Genz QL: 17.072
Genz West: 15.8739
26.3.3 GauLeg: 33.3149
26.3.3 Midpt Extrap N = 10: 105.156
 
a, b, rho; NTrials: 1.93185, 1.78999, -0.411002; 10000000
Genz QL: 16.912
Genz West: 15.4709
26.3.3 GauLeg: 38.7462
26.3.3 Midpt Extrap N = 10: 105.854
 
Cuchulainn

Re: Bivariate Normal Integral paper

July 24th, 2017, 8:25 am

Concluding:

. The West and QuantLib implementations of the Genz algorithm give the same accuracy up to machine precision. We used the finite difference method to check them, to ensure that they do not both return the same incorrect value for certain parameter values; we generated a, b, rho randomly and computed 10^10 trials.
. The added value of the finite difference approach is that it can be tuned to suit our accuracy needs. The basic scheme is second-order accurate, and we can use Richardson extrapolation to achieve fourth-order accuracy. Furthermore, these meshes deliver a matrix as part of the algorithm, which saves recomputing for certain classes of applications.
. The finite difference approach can be applied to a wide range of bivariate and trivariate distributions, for example the bivariate t distribution and the trivariate normal distribution. Genz becomes difficult in these cases.

. The Drezner 1978 algorithm is less accurate than the other schemes, and it degrades for correlations near 1 in absolute value.
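To make the finite difference remark concrete, here is a minimal sketch (the function name, grid sizes, and the truncation at -8 are my assumptions) of the basic second-order scheme: march u across the mesh with an inclusion-exclusion update and a midpoint evaluation of the bivariate normal density, then read off u(a, b).

```cpp
#include <cmath>
#include <utility>
#include <vector>

// Sketch: solve d2u/dxdy = f by marching, f = bivariate normal density,
// u = 0 on the lower/left edges of the truncated domain [-8, a] x [-8, b].
// The update integrates f over each cell with the midpoint rule, so the
// scheme is second-order accurate.
double bivariateNormalPde(double a, double b, double rho, int Nx, int Ny)
{
    const double pi = 3.141592653589793;
    const double lo = -8.0;                       // domain truncation
    const double hx = (a - lo) / Nx, hy = (b - lo) / Ny;
    const double s2 = 1.0 - rho * rho;
    auto f = [&](double x, double y)
    {   // standard bivariate normal density
        return std::exp(-(x * x - 2.0 * rho * x * y + y * y) / (2.0 * s2))
               / (2.0 * pi * std::sqrt(s2));
    };
    // u(i,j) = u(i-1,j) + u(i,j-1) - u(i-1,j-1) + hx*hy*f(cell midpoint);
    // only two columns of the mesh are needed at any one time.
    std::vector<double> prev(Ny + 1, 0.0), curr(Ny + 1, 0.0);
    for (int i = 1; i <= Nx; ++i)
    {
        double xm = lo + (i - 0.5) * hx;
        curr[0] = 0.0;
        for (int j = 1; j <= Ny; ++j)
        {
            double ym = lo + (j - 0.5) * hy;
            curr[j] = prev[j] + curr[j - 1] - prev[j - 1]
                    + hx * hy * f(xm, ym);
        }
        std::swap(prev, curr);
    }
    return prev[Ny];                              // u(a, b)
}
```

Keeping the full mesh instead of two columns is what delivers the reusable matrix mentioned above; halving h and Richardson-extrapolating gives the fourth-order variant.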
 
Cuchulainn

Re: Bivariate Normal Integral paper

September 28th, 2017, 6:46 am

Genz has an algorithm for the trivariate normal distribution and Graeme West has an implementation. The question is whether the PDE approach works in 3d, and if so, the PDE would be

[$]\frac{\partial^3 u}{\partial x \partial y \partial z} = f[$]  (BTW, does anyone know if this PDE has a name? It's the kind of thing that Goursat, Monge and Darboux would do.)

We discretised this as a second-order FDM scheme and tested against Genz. I generated some tests such as

      Values, error: 1/ 3.36056e-25,8.60479e-26,2.50008e-25
      Values, error: 2/ 5.55148e-15,5.55106e-15,4.2039e-19
      Values, error: 3/ 0.998061,0.998062,1.55523e-06
      Values, error: 4/ 0.667054,0.667056,2.38335e-06
      Values, error: 5/ 3.57559e-08,3.57549e-08,9.60296e-13
 
      // …
 
      Values, error: 96/ 7.25962e-15,7.21828e-15,4.13414e-17
      Values, error: 97/ 0.0118934,0.0118931,2.91526e-07
      Values, error: 98/ 6.82096e-17,6.19754e-17,6.23427e-18
      Values, error: 99/ 9.44464e-07,9.44434e-07,2.96727e-11
      Values, error: 100/ -4.16418e-25,1.23604e-73,4.16418e-25
 
      Max, Min error: 6.003088278805357e-06, 
                      1.36752501534285e-45

and tested for rho in the range [-0.5, 0.5] and b1, b2, b3 in the range [-8, 8]. I used domain truncation, but a more elegant approach is to transform to [-1,1]^n exactly.

The PDE approach can be applied to any distribution in principle. 
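A hedged sketch of the 3d marching scheme (names and the truncation point are my assumptions): the update is the 3d inclusion-exclusion stencil with midpoint evaluation of f, so it is second-order, and the density is passed in as a callable so any distribution can be plugged in, as noted above.

```cpp
#include <algorithm>
#include <cmath>
#include <functional>
#include <utility>
#include <vector>

// Sketch: solve d3u/dxdydz = f by marching z-planes. u = 0 on the lower
// faces of the truncated domain [-8, b1] x [-8, b2] x [-8, b3]; each cell
// contributes h1*h2*h3*f evaluated at its midpoint (second order).
double tripleIntegralPde(const std::function<double(double, double, double)>& f,
                         double b1, double b2, double b3, int N)
{
    const double lo = -8.0;                       // domain truncation
    const double h1 = (b1 - lo) / N, h2 = (b2 - lo) / N, h3 = (b3 - lo) / N;
    std::vector<std::vector<double>> prev(N + 1, std::vector<double>(N + 1, 0.0));
    auto curr = prev;                             // planes k-1 and k
    for (int k = 1; k <= N; ++k)
    {
        double zm = lo + (k - 0.5) * h3;
        for (auto& row : curr) std::fill(row.begin(), row.end(), 0.0);
        for (int i = 1; i <= N; ++i)
        {
            double xm = lo + (i - 0.5) * h1;
            for (int j = 1; j <= N; ++j)
            {   // 3d inclusion-exclusion stencil over the cell's corners
                double ym = lo + (j - 0.5) * h2;
                curr[i][j] = curr[i - 1][j] + curr[i][j - 1] + prev[i][j]
                           - curr[i - 1][j - 1] - prev[i - 1][j] - prev[i][j - 1]
                           + prev[i - 1][j - 1]
                           + h1 * h2 * h3 * f(xm, ym, zm);
            }
        }
        std::swap(prev, curr);
    }
    return prev[N][N];                            // u(b1, b2, b3)
}
```

With zero correlation the trivariate density factorises, which gives an easy sanity check against a product of univariate CDFs.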

I have also done it for a 4-dimensional integral with zero correlation (I did not feel like inverting a 4x4 covariance matrix by hand...)

[$]\frac{\partial^4 u}{\partial x \partial y \partial z \partial p} = f[$] 
 
Cuchulainn

Re: Bivariate Normal Integral paper

April 16th, 2023, 5:10 pm

A relevant remark is that the correlations rho21, rho31, rho32 must lie in the 'cushion' feasibility region in order to produce a positive-definite correlation matrix; otherwise things go haywire.

Thanks for the Mathematica jpg file!

i.e. [$] 1 - (x^2 + y^2 + z^2) + 2xyz > 0[$]
[$] x,y, z  \in [-1,1][$]
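For the record, the inequality is exactly the determinant condition for the 3x3 correlation matrix with off-diagonals x = rho21, y = rho31, z = rho32 (det = 1 - (x^2 + y^2 + z^2) + 2xyz), so together with |x| < 1 it characterises positive-definiteness by Sylvester's criterion. A one-line checker (the function name is mine):

```cpp
#include <cmath>

// The 'cushion' inequality is det(R) > 0 for the 3x3 correlation matrix R
// with off-diagonals x, y, z; with |x| < 1 this is Sylvester's criterion
// for positive-definiteness.
bool isValidCorrelation(double x, double y, double z)
{
    return std::fabs(x) < 1.0 && std::fabs(y) < 1.0 && std::fabs(z) < 1.0
        && 1.0 - (x * x + y * y + z * z) + 2.0 * x * y * z > 0.0;
}
```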



[Image attachments: sergio1.jpg, sergio2.jpg. Sergio Prego (San Sebastian/Bilbao museum)]
 
Cuchulainn

Re: Bivariate Normal Integral paper

April 16th, 2023, 5:14 pm

Thirteen to Centaurus
Envisioned by the artist Sergio Prego (San Sebastián, 1969) for the museum's galleries during its enlargement, and taking the opportunity afforded by removing the collection's artworks from this work zone, Thirteen to Centaurus uses an extraordinary situation to propose a sculptural experience in interaction with the museum's interior architecture.
A sequence of large pneumatic modules transforms the perception of ten adjacent galleries which comprise the architectural body of the museum’s old section. The installation is primarily made up of a series of fourteen modules located in space, sized to fit the galleries in which they are placed. These elements are articulated in relation to the symmetry lines of the different galleries, creating a pattern in which concatenated elements alternate either in physical continuity with one another or separated by the walls dividing the rooms or by empty spaces between geometrically similar edges which are parallel in space.
The membrane of the modules is based on the tetrahedron as an abstract model for their structure. It is the simplest regular solid with the greatest structural consistency: four equal triangular sides. Tetrahedrons are unique in that their edges do not match the axial edges of the orthonormal coordinate system, and they do not fit a cubic capacity system. That is, when these forms occupy a space they cannot completely fill it but instead leave interstitial spaces between them. As a result, their geometry is somewhat unsuited to the museum’s galleries as their container, whose structure becomes distorted and whose perception is hindered by the presence of the tetrahedrons. When the membrane is blown up, each of them transforms into a curved organic shape similar to a topology in which no other geometric elements can be identified except the two edges connecting the modules comprising the chain of tetrahedrons. The pneumatic structures can be described as organic shapes, such as organs or organisms, which consist in a membrane enclosed upon itself with orifices that regulate the relationship between inside and outside. The characteristics of these organic forms are determined by the plasticity of the surface tension of the membranes.
 
Cuchulainn

Re: Bivariate Normal Integral paper

April 16th, 2023, 5:15 pm

In the science-fiction story, Thirteen to Centaurus, J. G. Ballard describes the experiment in which some subjects’ lives unfold in utter isolation in a dome, simulating the conditions of intergenerational interstellar travel, without either contact with or knowledge of the outside world. The purpose is to consider the factors of human behaviour that have led past attempts at space colonisation to fail. Doctor Roger Francis is in charge of psychologically tracking the subjects of the study and secretly leaves on a regular basis to coordinate the progress of the project and the support and maintenance tasks with the outside team maintaining the facilities. After 50 years, the decline in public and political support endangers the project, and he desperately asks that the research be continued so it can be concluded in the very distant future:
‘… If the project ends it will be we who have failed, not them. We can’t rationalize by saying it’s cruel or unpleasant. We owe it to the fourteen people in the dome to keep it going.’
Chalmers watched him shrewdly. ‘Fourteen? You mean thirteen, don’t you, Doctor? Or are you inside the dome too?’

The obscure story interlinks utopian and dystopian images of the future associated with questioning the forms of progress that have characterised modernity, whose development has often collided with ethical and moral positions resulting from human experience. This association resonates in the attempts to use the pneumatic structures commonly found in the radical architecture experiments of the 1960s and 1970s by José Miguel de Prada Poole, Event Structure Research Group, Ant Farm and Hans Walter Muller, among others, who have been and continue to be touchstones for the artist. When working on them, pneumatic architecture has questioned the massive use of material resources to create other forms of inhabiting. With this experimental quest inserted within the genealogy of sculpture, the use of pneumatic membranes is connected to questioning material mass as an element constitutive of spatial experience.