AD is easy to understand (do it by hand) and then you see the data structures. For large problems these will become yuge IMO.

For PDE I think AD will not be optimal. While the FDM truncation error is small, its derivative will not be small in general.
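To show how hand-workable forward-mode AD is, here is a minimal dual-number sketch (the `Dual` class and `derivative` helper are my own toy illustration, not any particular library's API):

```python
class Dual:
    """Forward-mode AD value: val + eps*dot, where eps**2 = 0.
    The dot slot carries the derivative through every operation."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__

    def __mul__(self, o):
        # Product rule: (u*v)' = u*v' + u'*v
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val,
                    self.val * o.dot + self.dot * o.val)
    __rmul__ = __mul__

def derivative(f, x):
    # Seed dot = 1 (dx/dx) and read off the propagated derivative.
    return f(Dual(x, 1.0)).dot

# d/dx (x*x + 3*x) at x = 2 is 2*2 + 3 = 7
d = derivative(lambda x: x*x + 3*x, 2.0)
```

The `Dual` objects are exactly the "data structures" in question: one extra slot per value in forward mode, a whole tape in reverse mode, which is where the memory blow-up for large problems comes from.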

Statistics: Posted by Cuchulainn — July 31st, 2018, 4:28 pm

C99 doesn't even have a native matrix type......

It didn't stop me from solving complex multidimensional Hamiltonians

Indeed. It's just the

Statistics: Posted by Cuchulainn — July 28th, 2018, 12:18 pm

Of course, the complex-step method breaks down for the Hessian, in which case it seems the multicomplex-step method is a solution(?)

The real pain is that C++ is miles behind Matlab and Fortran in this respect.

It breaks down - or rather, it simply wasn't designed to approximate higher-order derivatives. It is indeed naturally extended using multicomplex analysis when one needs to analyse the sensitivity or elasticity of non-linear systems, utility functions depending on several factors, uncertainty in multi-parameter models, etc. (I suppose many people on the forum have applied such methods, knowingly or not.)
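For what it's worth, a minimal bicomplex-number sketch of the multicomplex-step second derivative (class and function names are my own illustration, not any particular library; the step size is an arbitrary choice):

```python
import cmath

class Bicomplex:
    """Bicomplex number z1 + z2*i2: z1, z2 are ordinary complex numbers
    (carrying i1), and i2 is a second, commuting imaginary unit."""
    def __init__(self, z1, z2=0.0):
        self.z1, self.z2 = complex(z1), complex(z2)

    def _coerce(self, o):
        return o if isinstance(o, Bicomplex) else Bicomplex(o)

    def __add__(self, o):
        o = self._coerce(o)
        return Bicomplex(self.z1 + o.z1, self.z2 + o.z2)
    __radd__ = __add__

    def __mul__(self, o):
        # (a + b*i2)(c + d*i2) = (ac - bd) + (ad + bc)*i2, since i2**2 = -1
        o = self._coerce(o)
        return Bicomplex(self.z1 * o.z1 - self.z2 * o.z2,
                         self.z1 * o.z2 + self.z2 * o.z1)
    __rmul__ = __mul__

def bexp(w):
    # exp lifted to bicomplex: exp(a + b*i2) = exp(a)*(cos(b) + i2*sin(b))
    e = cmath.exp(w.z1)
    return Bicomplex(e * cmath.cos(w.z2), e * cmath.sin(w.z2))

def second_derivative(f, x, h=1e-6):
    """f''(x) via the step x + i1*h + i2*h: the i1*i2 component of the
    result equals h**2 * f''(x) + O(h**4), with no subtractive cancellation."""
    w = f(Bicomplex(complex(x, h), h))
    return w.z2.imag / (h * h)

# d^2/dx^2 of x^3 at x = 2 is 6*2 = 12
d2 = second_derivative(lambda w: w * w * w, 2.0)
```

The same pattern extends to mixed partials, one extra imaginary unit per differentiation direction, which is the multicomplex route to the Hessian.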

C99 doesn't even have a native matrix type......

It didn't stop me from solving complex multidimensional Hamiltonians

Statistics: Posted by katastrofa — July 28th, 2018, 11:11 am

I never understood why people pay for a scientific paper (between $10 and $30, and it's a PDF file!!!).

When I don't find the paper online (for free), I ask the author.

Sci-Hub is a breath of fresh air for a LOT of scientists around the world who can't afford the predatory prices (and practices) of the publishers.

FaridMoussaoui, IMHO scientific journals are just a lucrative business trading on their former prestige. Every scientist (or only the naive ones, or those with misplaced ambitions?) dreams of publishing in Nature the way a chav dreams of the new iPhone model. Not many succeed, because nepotism and power seem to be the main selection criteria, even if that often means publishing poor, trivial or outright incorrect work - they juice up the title, fudge the results, et voilà!

(E.g., there was recently a flashy title about an ML application in medicine outperforming medical diagnosis. I instantly wondered how a standard multinomial regression would perform on the same data. The authors had indeed performed this obvious comparison, but hid the outcome in the supplement - it was as good as their ML model, hence...)

I'd better warn you, though, that I hold a huge grudge against academia and scientists, so I may be exaggerating.

Statistics: Posted by katastrofa — July 28th, 2018, 11:07 am

The real pain is that C++ is miles behind Matlab and Fortran in this respect.

Statistics: Posted by Cuchulainn — July 27th, 2018, 4:44 pm

Sci-Hub is a breath of fresh air for a LOT of scientists around the world who can't afford the predatory prices (and practices) of the publishers.

Statistics: Posted by FaridMoussaoui — July 27th, 2018, 1:08 pm

katastrofa wrote: "Numerical algorithms based on the theory of complex variable", Lyness, 1967 (paywall)

How to get the article for free (as the author gets $0 anyway):

- get the DOI reference from the paywall web page.
- go to https://twitter.com/sci_hub
- click on the link on the left hand side. It is now: https://sci-hub.tw
- done

That's cheating! Just kidding. Many thanks!

Statistics: Posted by katastrofa — July 27th, 2018, 1:01 pm

How to get the article for free (as the author gets $0 anyway):

- get the DOI reference from the paywall web page.
- go to https://twitter.com/sci_hub
- click on the link on the left hand side. It is now: https://sci-hub.tw
- done

Statistics: Posted by FaridMoussaoui — July 27th, 2018, 12:46 pm

I am just amazed that it is used in statistics... any links?

Ouch. Check it out.

Statistics: Posted by Cuchulainn — July 27th, 2018, 10:27 am

"Numerical Differentiation of Analytic Functions", Lyness & Moler, 1967 (PDF)

Are you pulling my leg? Sensitivity analysis in various fields requires accurate estimation of derivatives, hence the use of the complex-step method instead of finite differencing, e.g. "The Complex-Step Derivative Approximation", Martins et al., 2003 (an example of an application and a simple estimation of the error).
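The approximation itself is a one-liner: for real-analytic f, f(x + ih) = f(x) + ih f'(x) - h^2 f''(x)/2 + ..., so Im f(x + ih)/h = f'(x) + O(h^2) with no difference of nearby values. A minimal sketch (function name and step size are my own illustrative choices):

```python
import cmath

def cstep_derivative(f, x, h=1e-20):
    # Complex-step approximation: Im f(x + i*h) / h = f'(x) + O(h^2).
    # No subtractive cancellation occurs, so h can be taken tiny.
    return f(complex(x, h)).imag / h

# d/dx sin(x) at x = 0.5; exact answer is cos(0.5)
d = cstep_derivative(cmath.sin, 0.5)
```

Unlike a central difference, there is no trade-off between truncation and round-off error here, which is exactly why h = 1e-20 is usable.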

Statistics: Posted by katastrofa — July 27th, 2018, 10:16 am

Interesting. So, statisticians discovered it?

BTW Fortran has had a complex type forever. C99 is playing catch-up.

Statistics: Posted by Cuchulainn — July 27th, 2018, 9:19 am

Higham's method looks v. appealing to me now.

Statistics: Posted by ISayMoo — July 26th, 2018, 6:15 pm

Simple question which stumps me: I have a complex square matrix H. There are some nice methods for calculating exp(H). What about calculating the derivative of exp(H) over elements of H? To be precise: let M = exp(H). I want to calculate dM_{jk} / dH_{mn} numerically, accurately and (relatively) quickly.

What about something along the lines of BCH?

https://math.stackexchange.com/question ... -of-matrix

A successful implementation would improve the code (e.g. maintainability and readability) for backpropagation etc.
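Besides BCH, one standard route from the matrix-function literature (it appears in Higham's work on matrix functions) is the block identity expm([[H, E], [0, H]]) = [[expm(H), L(H, E)], [0, expm(H)]], where L(H, E) is the directional (Fréchet) derivative of exp at H; taking E with a single nonzero entry at (m, n) yields all the dM_{jk}/dH_{mn} for that (m, n) at once. A pure-Python sketch with a naive truncated-Taylor expm (illustrative only; fine for modest-norm matrices, not production code):

```python
def mat_mul(A, B):
    # Plain triple-loop matrix product on lists of lists.
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def mat_add(A, B):
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def expm_taylor(A, terms=40):
    # Truncated Taylor series for expm; adequate when ||A|| is modest.
    n = len(A)
    M = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    T = [row[:] for row in M]
    for k in range(1, terms):
        T = [[t / k for t in row] for row in mat_mul(T, A)]
        M = mat_add(M, T)
    return M

def expm_frechet(H, E):
    """Directional derivative L(H, E) of expm at H, read off from the
    block identity expm([[H, E], [0, H]]) = [[expm(H), L(H, E)],
                                             [0,       expm(H)]]."""
    n = len(H)
    B = [[0.0] * (2 * n) for _ in range(2 * n)]
    for i in range(n):
        for j in range(n):
            B[i][j] = H[i][j]          # top-left block: H
            B[i][n + j] = E[i][j]      # top-right block: direction E
            B[n + i][n + j] = H[i][j]  # bottom-right block: H
    F = expm_taylor(B)
    return [[F[i][n + j] for j in range(n)] for i in range(n)]

# Sanity check: for diagonal H and a direction E that commutes with it,
# L(H, E) = E * expm(H), so the (0,0) entry below is exp(0.5).
H = [[0.5, 0.0], [0.0, -0.3]]
E = [[1.0, 0.0], [0.0, 0.0]]
L = expm_frechet(H, E)
```

One expm of a 2n-by-2n matrix per direction is not cheap, but it inherits the accuracy of whatever expm you already trust, which fits the maintainability goal above.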

Statistics: Posted by Cuchulainn — July 26th, 2018, 11:51 am
