SERVING THE QUANTITATIVE FINANCE COMMUNITY

 
Cuchulainn
Posts: 57335
Joined: July 16th, 2004, 7:38 am
Location: Amsterdam

Re: Adjoint and Automatic Differentiation (AAD) in computational finance

April 25th, 2018, 10:25 pm

.
 
ISayMoo
Posts: 1127
Joined: September 30th, 2015, 8:30 pm

Re: Adjoint and Automatic Differentiation (AAD) in computational finance

April 26th, 2018, 5:03 am

Why should I do all the work? :-)
 
Cuchulainn

Re: Adjoint and Automatic Differentiation (AAD) in computational finance

April 26th, 2018, 10:10 am

.
 
ISayMoo

Re: Adjoint and Automatic Differentiation (AAD) in computational finance

April 26th, 2018, 10:24 am

I do.
 
Cuchulainn

Re: Adjoint and Automatic Differentiation (AAD) in computational finance

April 26th, 2018, 10:58 am

.
 
outrun
Posts: 4573
Joined: April 29th, 2016, 1:40 pm

Re: Adjoint and Automatic Differentiation (AAD) in computational finance

April 26th, 2018, 4:31 pm

So the idea is to use higher-order gradients of the surrogate loss function (blue) to find the actual optimum (red)?


[image: surrogate loss (blue) vs. actual loss (red)]

I think the idea and the math seem very nice; it's a good new result. The "MagicBox" symbol is something my 7-year-old son would come up with (actually, I'm pretty sure he would draw a cat-face operator applied to x), very distracting...

It looks computationally easy to implement, but the practical relevance will really depend on how well the surrogate loss function approximates the real loss function, how well-behaved the higher-order gradient terms are, and how the computational costs scale (Hessians are usually avoided because of the practical cost/benefit). The surrogate loss functions I know of are only good local approximations, so you can't take a large step size anyway. These are, however, orthogonal issues: bad surrogate loss functions are not the topic of the paper, and I'm sure they will improve in the future too, although small batch sizes make gradients very noisy anyway.

It would be nice if someone ran a benchmark on Atari against e.g. PPO and compared computing time, sample efficiency and overall performance of the found solution.
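If the paper under discussion is the DiCE estimator (an assumption on my part, but its ⊡ "MagicBox" operator matches the description), the operator is defined as ⊡(x) = exp(x − stop_gradient(x)): it evaluates to exactly 1 but carries the gradient of x. A minimal sketch of that property, with stop_gradient imitated by passing a frozen copy of x (names here are illustrative, not from any library):

```python
import math

def magic_box(x, x_stopped):
    # MagicBox(x) = exp(x - stop_gradient(x)); the second argument plays
    # the role of stop_gradient(x): same value, but held constant.
    return math.exp(x - x_stopped)

x0 = 0.7
val = magic_box(x0, x0)  # exp(0) = 1.0 exactly, whatever x0 is

# Differentiate w.r.t. the first argument only (central difference),
# holding the stop-gradient copy fixed:
h = 1e-6
deriv = (magic_box(x0 + h, x0) - magic_box(x0 - h, x0)) / (2 * h)
print(val, deriv)  # 1.0 and approximately 1.0
```

So in value it is the identity, but it injects a factor into the gradient, which is what lets repeated differentiation produce the higher-order gradient terms.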
 
Cuchulainn

Re: Adjoint and Automatic Differentiation (AAD) in computational finance

April 27th, 2018, 12:16 pm

As described in various papers and conference presentations, the AAD approach can potentially reduce the computational cost of sensitivities by several orders of magnitude, while having no approximation error. It can be used either for computing the Greeks or for computing the exact (up to machine precision) gradient (or even the Hessian matrix), the latter being very useful with a gradient-based local optimizer. A framework based on AD can also be developed for automatic computation of Greeks/sensitivities from existing code, similar to work done in the last 15-20 years in areas such as fluid dynamics, meteorology and data assimilation. An overview of the approach and of the relevant literature is presented in http://papers.ssrn.com/sol3/papers.cfm? ... =1828503

If you know any other relevant references, please mention them in this thread. In case you have used any AD software recently, please share your experience with it. Thank you.
The complex-step method is really very cool; I can't understand why it doesn't get more exposure. No more catastrophic cancellation in the 1st derivative! It can even be used to compute Fréchet derivatives.
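For the record, a minimal sketch of the complex-step trick: f'(x) ≈ Im f(x + ih) / h. There is no subtraction of nearby values, hence no cancellation, so h can be taken absurdly small. It does require f to be analytic and implemented so that it accepts complex arguments (the function name below is mine):

```python
import cmath
import math

def complex_step_derivative(f, x, h=1e-20):
    # f'(x) ~= Im(f(x + i*h)) / h -- no difference of nearby values,
    # so no catastrophic cancellation even for tiny h.
    return f(complex(x, h)).imag / h

# d/dx sin(x) at x = 1 should equal cos(1) ~= 0.5403
d = complex_step_derivative(cmath.sin, 1.0)
print(d, math.cos(1.0))
```

Compare with a central difference at h = 1e-20, which would return exactly 0 from cancellation.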
 
Cuchulainn

Re: Adjoint and Automatic Differentiation (AAD) in computational finance

July 31st, 2018, 4:28 pm

AD is quite cute and very elegant, but is it not a CS solution (using graphs) to what is essentially a problem in numerical analysis, i.e. computing a derivative at a given point?

AD is easy to understand (and do by hand), and then you see the data structures. For large problems these will become yuge, IMO.

For PDEs I think AD will not be optimal: while the FDM truncation error is small, its derivative will not be small in general.
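To make the "do it by hand" point concrete, here is a toy forward-mode AD sketch using dual numbers, i.e. (value, derivative) pairs propagated together through arithmetic (purely illustrative; class and function names are mine, not from any library):

```python
class Dual:
    """Minimal forward-mode AD: a (val, dot) pair propagated together."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def _lift(self, other):
        # Promote plain numbers to constants (derivative zero).
        return other if isinstance(other, Dual) else Dual(other)

    def __add__(self, other):
        other = self._lift(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        other = self._lift(other)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

def f(x):
    return 3 * x * x + 2 * x + 1  # f'(x) = 6x + 2

y = f(Dual(2.0, 1.0))  # seed dx/dx = 1
print(y.val, y.dot)    # 17.0 14.0
```

Doing this by hand makes the data-structure point visible: every intermediate value drags its derivative along, and for a large PDE solver that bookkeeping is exactly what grows.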