July 31st, 2009, 1:17 pm
I've been testing LevMar for a few weeks, fitting SABR curves to market data, and I don't have a very strong grasp of which damping parameters work best under which circumstances. It seems that a higher lambda (around 0.5) gives a worse fit in fewer iterations of the algorithm, while lower values (close to 0) take longer but give a better set of best-fit parameters. And the adaptive "increased damping" method recommended by Marquardt (on the Wikipedia page), where lambda is scaled up or down by a factor depending on whether a step reduces the error, doesn't appear to yield a noticeable difference in the speed/accuracy tradeoff either (see the sketch below for what I mean).

Have you found there to be much benefit in tuning the damping factor? Or do people generally just set it at something like 0.01 and concentrate on their exit conditions?
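For concreteness, here's a rough sketch of the adaptive scheme I'm describing. It is not the LevMar library itself, just a generic Python/NumPy implementation; the residual function in the toy usage, the starting lambda of 0.01, and the factor nu = 10 are made-up values for illustration.

```python
import numpy as np

def lm_fit(residuals, p0, lam=0.01, nu=10.0, max_iter=100, tol=1e-10):
    """Generic Levenberg-Marquardt with Marquardt-style adaptive damping.

    residuals: function mapping parameters p -> vector of residuals
    lam:       initial damping factor (the lambda under discussion)
    nu:        factor by which lambda is raised/lowered at each step
    """
    p = np.asarray(p0, dtype=float)
    r = residuals(p)
    cost = r @ r
    for _ in range(max_iter):
        # Numerical Jacobian of the residuals (forward differences).
        eps = 1e-8
        J = np.empty((r.size, p.size))
        for j in range(p.size):
            dp = np.zeros_like(p)
            dp[j] = eps
            J[:, j] = (residuals(p + dp) - r) / eps
        JtJ, Jtr = J.T @ J, J.T @ r
        while True:
            # Damped normal equations, using Marquardt's diagonal scaling:
            # (JtJ + lam * diag(JtJ)) step = -Jtr
            A = JtJ + lam * np.diag(np.diag(JtJ))
            step = np.linalg.solve(A, -Jtr)
            r_new = residuals(p + step)
            cost_new = r_new @ r_new
            if cost_new < cost:   # step accepted: relax the damping
                lam /= nu
                p, r, cost = p + step, r_new, cost_new
                break
            lam *= nu             # step rejected: increase the damping
            if lam > 1e12:        # give up if the damping explodes
                return p, lam
        if cost < tol:
            break
    return p, lam

# Toy usage: fit y = a*exp(b*x) to noisy data (not SABR, just illustration).
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = 2.0 * np.exp(1.5 * x) + 0.01 * rng.standard_normal(50)
p_fit, lam_final = lm_fit(lambda p: p[0] * np.exp(p[1] * x) - y, [1.0, 1.0])
```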