Serving the Quantitative Finance Community

 
User avatar
trinity
Topic Author
Posts: 0
Joined: August 7th, 2003, 6:43 am

KL Minimisation = s-o-s minimisation ?

July 2nd, 2006, 9:38 am

Is KL minimisation = min(sum-of-sqr error between densities)?

Let's say I have a parametric way of describing a probability distribution, and I am trying to choose parameter values that minimise the Kullback-Leibler distance between the distribution implied by my parameter choice and some prior.

Suppose I am able to calculate the density of the parametrically specified distribution at values of the random variable with some regular spacing (-3.0, -2.9, -2.8, ..., 0.0, 0.1, 0.2, ..., 2.9, 3.0), and I just run some kind of nonlinear optimiser to minimise the sum-of-squared error between the column vector of density values implied by the parameter choices and the known density values. Could one say that is equivalent to KL minimisation?

Thanks
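For concreteness, the two objectives being compared can be sketched as follows. This is a hypothetical toy setup (the standard-normal prior, the normal parametric family, and the grid are all illustrative assumptions, not anything from the post):

```python
import numpy as np

# Grid with the regular 0.1 spacing described in the post.
x = np.arange(-3.0, 3.0 + 1e-9, 0.1)  # -3.0, -2.9, ..., 2.9, 3.0
dx = 0.1

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) evaluated on the grid."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

# Hypothetical prior: a standard normal.
prior = normal_pdf(x, 0.0, 1.0)

def kl_objective(params):
    """Discretised KL divergence KL(prior || model) on the grid."""
    q = normal_pdf(x, *params)
    return np.sum(prior * np.log(prior / q)) * dx

def sse_objective(params):
    """Sum-of-squared error between the two columns of density values."""
    q = normal_pdf(x, *params)
    return np.sum((prior - q) ** 2)
```

Both objectives vanish when the parameters reproduce the prior exactly, so they agree at the optimum in this idealised case; away from it they weight discrepancies differently, which is where the question bites.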
 
User avatar
scholar
Posts: 0
Joined: October 17th, 2001, 8:03 pm

KL Minimisation = s-o-s minimisation ?

July 2nd, 2006, 5:43 pm

No, it is not equivalent. A least-squares measure of the distance between two densities (weighted by the reciprocal of the density) is the leading term in the Taylor expansion of the KL divergence, which is something you can easily check yourself.
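The expansion in this reply is easy to verify numerically. Writing q = p + eps with eps small and summing to zero, KL(p||q) agrees to leading order with (1/2) * sum((p-q)^2 / p), a weighted sum of squares (half the chi-squared distance), while the plain sum of squares differs. A small sketch with made-up discrete densities (the numbers here are purely illustrative):

```python
import numpy as np

# Hypothetical discrete density and a small perturbation that sums to zero,
# so q is still a valid density.
p = np.array([0.1, 0.2, 0.4, 0.2, 0.1])
eps = 1e-4 * np.array([1.0, -2.0, 2.0, -2.0, 1.0])
q = p + eps

kl = np.sum(p * np.log(p / q))               # KL(p || q)
weighted_ss = 0.5 * np.sum((p - q) ** 2 / p)  # leading Taylor term: chi^2 / 2
plain_ss = np.sum((p - q) ** 2)               # the sum-of-squares objective
```

Here kl and weighted_ss agree to O(eps^3), while plain_ss is off by a factor set by the density weights, so minimising plain sum-of-squares is not the same problem as minimising KL.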
 
User avatar
trinity
Topic Author
Posts: 0
Joined: August 7th, 2003, 6:43 am

KL Minimisation = s-o-s minimisation ?

July 3rd, 2006, 7:59 am

Many thanks, makes sense. Thanks Scholar.