
sitmo C++ normal distribution

Posted: December 24th, 2013, 4:24 pm
by Cuchulainn
Quote (outrun): btw I'm adding "sitmo" to the title - not for branding, but for version identification; it's a "namespace". We have (had?) many C++ and other language initiatives w.r.t. MC and random number generation, and I'm hoping more will come!

That was my idea originally.

sitmo C++ normal distribution

Posted: December 28th, 2013, 11:28 am
by Cuchulainn
Quote (outrun): This is a set of two C++11-compliant normal distributions named normal_distribution_inv and normal_distribution_inv_single. These C++ distribution samplers are used to generate normally distributed random numbers. C++11 has its own std::normal_distribution<> in the <random> library to do this, and that version uses the Box-Muller transform. It's the best plain-vanilla method to generate normally distributed random numbers in C++11. This version is different because it uses *inversion* (the inverse of the cumulative normal distribution function) to convert uniform random numbers to normally distributed ones. It has different properties, and so can be useful for some special cases:

* it might be a bit faster (I still need to benchmark it);
* it's common to use inversion when using low-discrepancy sequences. There is a worry that Box-Muller breaks the low-discrepancy structure because it consumes and mixes two random sample points/coordinates instead of one.

I have two versions:

* sitmo::normal_distribution_inv<>, which uses a full machine-precision inversion function;
* sitmo::normal_distribution_inv_single<>, which uses a faster 1.15E-9 precision inversion function.

I still need to do a lot of testing, and I'll post updates if people find bugs or have suggestions for improvement. I might also add different distribution methods later on, maybe Ziggurat? The code is based on the inversion function approximation by Peter John Acklam.

Usage:

    sitmo::normal_distribution_inv<> N;
    double e = N(eng);

(with eng some random engine...)

Do you have a 101 main()? Can I use it for my online students? They use 1) the Hull approximation and 2) the Boost implementation, among others, at the moment. BTW, using your prng for Box-Muller is good, although the polar form seems to be less accurate. Any ideas why?

sitmo C++ normal distribution

Posted: December 28th, 2013, 11:41 am
by Cuchulainn
Quote (outrun): Yes, I still need to document it; I will post it here tonight. The idea is that it conforms to the C++11 distribution concept - should be identical!

Brilliant.

sitmo C++ normal distribution

Posted: February 4th, 2014, 1:37 am
by MiloRambaldi
I've done this stuff (random normal variate generation using the inverse CDF) at a previous job. Did you try using Boost for the inverse normal CDF? It is extremely accurate, within a couple of machine epsilons, IIRC. Did you ever implement the Ziggurat algorithm? I'm looking to add more stuff to my "portfolio".
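The "couple of machine epsilons" accuracy mentioned here can be reached from any rough quantile approximation by polishing with Newton's method on Phi(x) - p, since std::erfc gives the exact normal CDF. A generic sketch (not Boost's or sitmo's actual code; norm_cdf and newton_inv_cdf are hypothetical names):

```cpp
#include <cmath>

// Standard normal CDF via the complementary error function.
double norm_cdf(double x) {
    return 0.5 * std::erfc(-x / std::sqrt(2.0));
}

// Refine a quantile guess to (near) machine precision with Newton's method:
// x <- x - (Phi(x) - p) / phi(x). Phi is smooth and monotone, so a handful
// of steps suffices from any reasonable starting point for moderate p.
double newton_inv_cdf(double p, double x0 = 0.0, int steps = 50) {
    const double inv_sqrt_2pi = 0.3989422804014327;  // 1/sqrt(2*pi)
    double x = x0;
    for (int i = 0; i < steps; ++i) {
        double err = norm_cdf(x) - p;                // residual in probability
        double pdf = inv_sqrt_2pi * std::exp(-0.5 * x * x);
        double dx  = err / pdf;
        x -= dx;
        if (std::fabs(dx) < 1e-15) break;            // converged
    }
    return x;
}
```

Boost.Math exposes the quantile directly as boost::math::quantile(boost::math::normal(), p), which is presumably what the post refers to.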

sitmo C++ normal distribution

Posted: April 25th, 2014, 11:39 pm
by MiloRambaldi
Quote (outrun): Boost recently switched from Box-Muller to ziggurat.

Very interesting. I was looking at QuantLib's implementation, and it is closely tied to the MT19937 (Mersenne Twister) generator. It only uses 24 bits of the MT output to avoid some correlation issue. Do you know if these issues have been resolved in Boost? I cannot find the post you once made here about ziggurat quality issues.
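The 24-bit trick described here can be sketched generically (this is not QuantLib's actual code, and uniform24 is a hypothetical name): discard the low-order bits of each 32-bit MT draw before mapping to [0, 1), since the low bits of some generator/consumer combinations have weaker statistical quality.

```cpp
#include <cstdint>
#include <random>

// Map one 32-bit MT19937 draw to a uniform in [0, 1) using only the top
// 24 bits; the low-order bits are discarded as a defensive measure against
// correlation artifacts in the low bits.
double uniform24(std::mt19937& eng) {
    std::uint32_t u = eng();                 // full 32-bit output
    return (u >> 8) * (1.0 / 16777216.0);    // keep high 24 bits, divide by 2^24
}
```

The resulting uniforms live on a grid of 2^24 points, which also matches single-precision float resolution.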

sitmo C++ normal distribution

Posted: April 28th, 2014, 5:11 am
by MiloRambaldi
Quote (outrun):

    Quote (MiloRambaldi): I was looking at QuantLib's implementation and it is closely tied to the MT19937 (Mersenne Twister) generator.

    That's a design flaw; they should have used orthogonal concepts, reused standard concepts, and abstracted away specifics as much as possible. Now they can only eat their own spaghetti food. It's an old project, and it's always easy to judge in retrospect, but e.g. Effective C++ by Scott Meyers has been around for ages.

    Quote (MiloRambaldi): It only uses 24 bits of the MT output to avoid some correlation issue. Do you know if these issues have been resolved in Boost? I cannot find the post you once made here about ziggurat quality issues.

    The issue seems to be with MT, not with ziggurat. MT is a standardised algorithm (with fixed constants) which should be *identical* across implementations, so Boost can't 'fix' MT, nor ziggurat; the combination is apparently invalid? It would be interesting if you have some more info about that issue.

To be fair, QuantLib's polar Box-Muller appears to be well-designed, with appropriate orthogonality and abstraction. Perhaps there is some reason that their Ziggurat hard-codes MT. I plan to look into this further very soon.
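For reference, the polar Box-Muller variant discussed in this thread (Marsaglia's method) can be sketched as follows. This is a generic illustration, not QuantLib's implementation, and polar_box_muller is a hypothetical name; it draws points uniformly in the unit disc by rejection, avoiding the sin/cos calls of the basic form, and caches the second variate of each pair:

```cpp
#include <cmath>
#include <random>

// Marsaglia's polar variant of Box-Muller: each accepted (u, v) pair in the
// unit disc yields two independent N(0,1) variates; the second is cached
// and returned on the next call.
class polar_box_muller {
public:
    template <class Engine>
    double operator()(Engine& eng) {
        if (has_cached_) { has_cached_ = false; return cached_; }
        std::uniform_real_distribution<double> unif(-1.0, 1.0);
        double u, v, s;
        do {
            u = unif(eng);
            v = unif(eng);
            s = u * u + v * v;
        } while (s >= 1.0 || s == 0.0);     // reject points outside the disc
        double m = std::sqrt(-2.0 * std::log(s) / s);
        cached_ = v * m;                     // save the second variate
        has_cached_ = true;
        return u * m;
    }
private:
    double cached_ = 0.0;
    bool has_cached_ = false;
};
```

Like plain Box-Muller, this consumes two uniforms per pair of normals, which is the property that worries people when combining it with low-discrepancy sequences.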