
- Cuchulainn
**Posts:** 63816 **Location:** Amsterdam

The blue line is the second derivative of delta. (taken from MathExchange as a requirements input).

Last edited by Cuchulainn on August 31st, 2020, 1:39 pm, edited 2 times in total.

My C++ Boost code gives

262537412640768743.999999999999250072597198185688879353856337336990862707537410378210647910118607313

http://www.datasimfinancial.com

http://www.datasim.nl


- Cuchulainn
**Posts:** 63816 **Location:** Amsterdam

And my solution is to use CSM for [$]\delta^{(1)}[$] and use centred fd for [$]\delta^{(2)}[$].

Cool



- katastrofa
**Posts:** 9853 **Location:** Alpha Centauri

> My solution to the ultimate shower-related relationship problem:

That sounds like a variation of a bad pickup line. "We'll accelerate our relationship exponentially if we take a shower together!"

Hard bristles - I like it rough!

BTW, thank you for explaining the idea of the "speed" to us :-)

Cuchulainn, it's usually fair to compare such insightful results against exact solutions.

- Cuchulainn
**Posts:** 63816 **Location:** Amsterdam

No kidding.


- katastrofa
**Posts:** 9853 **Location:** Alpha Centauri

When I mentioned modelling Dirac deltas by narrowing Gaussians (taking the standard deviation to zero), I thought that you modelled actual Gaussian random variables. Now that I see this gymnastics has no specific foundations, I'd simply suggest taking a piecewise-uniform distribution (f(x) = 1/a for -a/2 <= x <= a/2 and f(x) = 0 otherwise) and letting a go to 0.

CSM is a nice method indeed. I'm wondering about the computational efficiency of complex arithmetic operations, though.

BTW, thinking about Gaussian approximations I found this: https://en.cppreference.com/w/cpp/exper ... ns/hermite

I wasn't aware they added orthogonal polynomials to C++!


- Cuchulainn
**Posts:** 63816 **Location:** Amsterdam

I could have used that box function indeed, but my choice is another standard one and it is infinitely differentiable.

You know, if you give an alternative solution, it is useful to say *why* I should use it. Now it's just coffee-salon chit-chat! LOL

Seriously, I'll believe it when you plot [$]\delta^{(2)}[$].



- katastrofa
**Posts:** 9853 **Location:** Alpha Centauri

It is a tea-table talk. You're in the best position to find such alternative solutions. I don't even understand what you're doing.

- Cuchulainn
**Posts:** 63816 **Location:** Amsterdam

> It is a tea-table talk. You're in the best position to find such alternative solutions. I don't even understand what you're doing.

Alan is also working on this. To recall, it has all boiled down to:

In my words: Compute [$]\delta^{(n)}[$] for [$]n=0,1,2,3,...[$] Clear.

//

[$]\delta^{(2)}[$].

And it's finished.

[$]\delta^{(1)}[$] == speed.

Amen.


- katastrofa
**Posts:** 9853 **Location:** Alpha Centauri

I’m not Catholic.

If you post a nice cat photo I'll tell you how to do this faster and better.


- Cuchulainn
**Posts:** 63816 **Location:** Amsterdam

> If you post a nice cat photo I'll tell you how to do this faster and better.

That's what they all say. I suppose I'll just have to take your word for it.

I call your bluff.

Last edited by Cuchulainn on September 1st, 2020, 4:27 pm, edited 1 time in total.


- Cuchulainn
**Posts:** 63816 **Location:** Amsterdam

Now we solve the Fokker-Planck PDE based on the initial SDE [$]dX = dW[$] to see if my super-fast ADE FDM is up to it (ADE is probably one of the greatest schemes in the history of FDM). This FPE is a serious test case numerically, and ADE comes through with flying colours (and the integral == 1).


- katastrofa
**Posts:** 9853 **Location:** Alpha Centauri

No, cats or nothing. That's how I operate.

- Cuchulainn
**Posts:** 63816 **Location:** Amsterdam

Nice? It's (very) robust and accurate.

Efficiency is a non-issue; take an example and compare the number of function calls and operations in

[$](f(x+h) - f(x-h))/2h[$]

versus

[$]\text{Im}\, f(x + ih)/h[$]

But all the functions will be complexified; this can be easy (polynomials) or not. It becomes very easy to be drawn into functions of (several) complex variables.

// CSM is a well-kept secret.


- Cuchulainn
**Posts:** 63816 **Location:** Amsterdam

CSM for noobies


```
// TestComplexStep.cpp
//
// Complex-step method to compute approximate derivatives.
// Example is a scalar-valued function of a scalar argument.
//
// https://pdfs.semanticscholar.org/3de7/e8ae217a4214507b9abdac66503f057aaae9.pdf
// http://mdolab.engin.umich.edu/sites/default/files/Martins2003CSD.pdf
//
// (C) Datasim Education BV 2019-2020
// dduffy@datasim.nl
//
#include <functional>
#include <complex>
#include <iostream>
#include <iomanip>
#include <cmath>

// Notation and function spaces
using value_type = double;
using cvalue_type = std::complex<value_type>;

template <typename T>
using FunctionType = std::function<T(const T& c)>;
using CFunctionType = FunctionType<std::complex<value_type>>;

// Test case from Squire & Trapp 1998
template <typename T> T SquireTrapp(const T& t)
{
    T n1 = std::exp(t);
    T d1 = std::sin(t);
    T d2 = std::cos(t);
    return n1 / (d1*d1*d1 + d2*d2*d2);
}

template <typename T> T func2(const T& t)
{ // e^t, sanity check (derivative == function)
    return std::exp(std::pow(t, 1));
    // return std::exp(std::pow(t, 5));
}

value_type Derivative(const CFunctionType& f, value_type x, value_type h)
{ // df/dx at x using the complex-step method
  // f is evaluated at z = x + ih, i = sqrt(-1)
    return std::imag(f(cvalue_type(x, h))) / h;
}

int main()
{
    // Squire & Trapp
    double x = 1.5; double h = 0.1;
    do
    {
        std::cout << std::setprecision(12) << Derivative(SquireTrapp<cvalue_type>, x, h) << '\n';
        h *= 0.1;
    } while (h > 1.0e-300);

    // Exponential function (101 sanity check)
    x = 5.0;
    h = 1.0e-10;
    std::cout << "Exponential 1: " << std::setprecision(12) << Derivative(func2<cvalue_type>, x, h) << '\n';
    return 0;
}
```


- katastrofa
**Posts:** 9853 **Location:** Alpha Centauri

> CSM is a nice method indeed. I'm wondering about the computational efficiency of complex arithmetic operations, though.

Dude, do we need to start a flame over the complex step method for the THIRD time?


1 - viewtopic.php?f=10&t=101843&p=849194&hi ... ep#p849194

2 - viewtopic.php?f=34&t=101399&p=832095&hi ... ep#p832095

> But all the functions will be complexified; can be easy (polynomials) or not. It becomes very easy to be drawn into functions of (several) complex variables.

Obviously, this is not true, and it is surprising to hear in a discussion about differentiating holomorphic functions.

The method has not been a secret since Cauchy and Riemann. It's probably not a secret either that it can be viewed as a lazy-cat version of automatic differentiation.

Anyway, there's a better way to calculate your delta derivatives... but it requires a cat photo.

BTW, your CSM error is exp[-(x+i h)^2] = Re [...] - i exp[-x^2]
