I complexify [$]f(x)[$]: evaluate [$]f(z)[$] with [$]z = x + ih[$] and, e.g., [$]h = 0.001[$].
Then compute the imaginary part of [$]f(z)/h[$] and you are done.(*)
For [$]x = 1.87[$] I get 1.3684...e+289 from both the exact derivative and this method (12 digits of accuracy).
No subtractive cancellation as with classic finite differences.
(*) Squire and Trapp 1998.
The use of the complex step in calculating the derivative of an analytic function was introduced by Lyness & Moler:
Numerical Differentiation of Analytic Functions, SIAM J. Numer. Anal., Vol. 4, No. 2, 1967, available here: link to the pdf paper
PS: In a previous life, I used complex-step gradients to compute the sensitivity derivatives of an aerodynamic cost function.
The Lyness/Moler paper looks more computationally intensive than Squire & Trapp for [$]f'(x)[$]: it uses a series solution plus numerical integration(?)
I am examining the wider applicability of this method for gradients compared to AD (I exclude exact methods and classic FDM for various reasons). I tested first-order call option Greeks and bond sensitivities, and the results so far agree with the exact and FDM solutions. I even tried option speed by applying S&T to gamma.
For functions with several complex variables z1, z2, ... as arguments, I can now perturb each z_j (j = 1, 2, ...) independently and compute the partial derivatives with ease. Is that how it works?
I feel the method is most applicable with a small number of parameters, like MLE(?) (your aerodynamics problem had 18 parameters?)
Would the approach work for ML and more general optimisation, i.e. does it scale?
It was mentioned that the standard library needs to support erfc(z) for complex z. It is not in C++11, but I found that it can be written in terms of the Faddeeva function (see the "Faddeeva" package and you are good to go).
http://ab-initio.mit.edu/wiki/index.php ... va_Package