
 
gbhal
Topic Author
Posts: 1
Joined: May 16th, 2005, 1:41 am

Function fitting.

July 3rd, 2006, 12:36 pm

Hi all, let's say we have f = f(x,y,z) and values for the independent variables x, y, z and for f, with the functional form to be determined. Is there a scientific way to do this? For simplicity we might assume that f(x,y,z) = g(x)h(y)i(z). Thanks.
 
mutley
Posts: 20
Joined: February 9th, 2005, 3:51 pm

Function fitting.

July 3rd, 2006, 1:11 pm

That would impose huge restrictions on the fitting space. For example, if f(x,y) = (x+y)^2, there is no suitable solution of the separable form a(x)b(y).
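One quick way to check this (a worked illustration of the claim, evaluating the candidate factorisation at a few points):

```latex
% Assume (x+y)^2 = a(x) b(y) for all x, y and evaluate at a few points.
\[
f(1,1) = a(1)\,b(1) = 4 \;\Rightarrow\; a(1) \neq 0, \qquad
f(1,-1) = a(1)\,b(-1) = 0 \;\Rightarrow\; b(-1) = 0,
\]
\[
\text{but then } f(-1,-1) = a(-1)\,b(-1) = 0, \text{ whereas } (-1-1)^2 = 4,
\text{ a contradiction.}
\]
```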
 
zeta
Posts: 26
Joined: September 27th, 2005, 3:25 pm
Location: Houston, TX

Function fitting.

July 3rd, 2006, 1:50 pm

I'll give you the least fancy way. Say your intuition for the true functional form is F(x,y,z) and your data are G(x_i, y_i, z_i); then your objective to be minimized is

\[
\chi^2 = \sum_i \left[ G(x_i, y_i, z_i) - F(x_i, y_i, z_i) \right]^2,
\]

where the sum is really a triple sum and i labels the various independent/dependent data. This is just least squares, and the name of the game is that at each iteration of your algorithm (I'll mention a few shortly) you adjust the coefficients in the 'true' functional form. Depending on how many parameters you have to optimize, you may use something straightforward like Levenberg-Marquardt (N < 10) or, at the esoteric end of the scale, simulated annealing (N > 100).
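As a rough illustration of the above, a minimal sketch in Python (the separable trial form, the synthetic data and all parameter names are placeholders made up for this example, not anything from the thread):

```python
# Least-squares fit of a trial functional form F(x, y, z; params) to data G,
# using Levenberg-Marquardt via SciPy.  Everything below is illustrative.
import numpy as np
from scipy.optimize import least_squares

def F_trial(params, x, y, z):
    # Hypothetical separable trial form g(x) h(y) i(z) built from power laws.
    a, p, q, r = params
    return a * x**p * y**q * z**r

def residuals(params, x, y, z, G):
    # The quantities whose squares are summed in the chi^2 objective.
    return F_trial(params, x, y, z) - G

# Synthetic data standing in for the observed G(x_i, y_i, z_i).
rng = np.random.default_rng(0)
x, y, z = rng.uniform(0.5, 2.0, size=(3, 200))
G = 1.5 * x**2.0 * y**0.5 * z**1.0 + rng.normal(0.0, 0.01, size=200)

fit = least_squares(residuals, x0=[1.0, 1.0, 1.0, 1.0],
                    args=(x, y, z, G), method='lm')
print(fit.x)   # should land near (1.5, 2.0, 0.5, 1.0)
```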
Last edited by zeta on July 2nd, 2006, 10:00 pm, edited 1 time in total.
 
gbhal
Topic Author
Posts: 1
Joined: May 16th, 2005, 1:41 am

Function fitting.

July 3rd, 2006, 3:19 pm

Thanks mutley and zeta. I had in mind something similar to what you suggest, zeta, but I was looking for something fancier, like neural networks. The precise difficulty is that there will be some ad hoc guesswork in assuming the "true functional form". Not that I cannot assume any, but is there a very general way to go about this? Thanks.
 
zeta
Posts: 26
Joined: September 27th, 2005, 3:25 pm
Location: Houston, TX

Function fitting.

July 5th, 2006, 12:48 pm

Okay; with a little more info I may be able to help... In NMR, for instance, one might not know the analytic form of a lineshape, but you can still fit it using an approximation based on a sum of Dirac delta functions. Are you trying to fit a time series?
 
gbhal
Topic Author
Posts: 1
Joined: May 16th, 2005, 1:41 am

Function fitting.

July 6th, 2006, 6:59 pm

Nope, it's not a time series, just a physical relationship between one variable and some other physical variables.
 
MikeCrowe
Posts: 0
Joined: January 16th, 2006, 8:20 am

Function fitting.

July 17th, 2006, 9:15 am

There is a very scientific way of doing this, and that's to use Bayesian inference. You should create a functional form that is capable of representing ANY function that could be a solution. What you pick will depend on the exact nature of the problem; for example, if your f(x,y,z) can only take values from 0 to 1, then pick a function that does the same.

The next step is to make an assertion about the error on your data points. Usually we assume that the x, y, z are exact and that f has Gaussian noise, in which case use least squares as your measure of error (as explained by zeta - although I can't help being nitpicky about the use of the chi^2 symbol for least squares, which is easily confused with chi^2 proper, in which each term is divided by G).

Now you want to maximise the posterior of your function given your input data. To do this you need to work out the likelihood and the prior. You do not need the evidence in this case, unless you intend to compare two different trial functions (f(x,y,z) vs h(x,y,z), for example - you may wish to do this for reasons I'll explain later).

Your likelihood is pretty easy, since we've assumed Gaussian error (with either a varying or a constant standard deviation). Your prior should encompass your belief about the function. This is generally encoded in a regularising function, most commonly to stop the function from having too much curvature. You can limit the curvature by putting the second derivative into a least-squares cost function; this will stop you having sharp jumps and discontinuities in your function and will act to minimise the volatility. The regularising parameter itself can be found either by trial and error or, if you are hardcore, by inferring it from your data as well.

You may wish to implement this in a number of ways.

Firstly, you can use functions like Fourier series, Laplace series, RBFs, or polynomials (I suggest using Chebyshev, Hermite or similar for this, not the naive 1 + x + x^2 + ...). The series will probably need truncating, but the trick is to make it far, far longer than you need (cf. rounding decimals) and to use the regulariser to reduce the complexity, not the function. You should get the same answer whatever basis you use; some are just easier to work with than others. If the problem admits it, the exponential family is very tractable and worth a look.

Secondly, you can use neural networks, particularly MLPs, to act as your f(x,y,z). These are then trained using backpropagation, and you must include a regulariser to prevent overfitting. This isn't trivial.

Thirdly, and easiest, is to try a number of different simple models, none of which covers the whole fitting space, but which together cover almost all plausible f(x,y,z). You can maximise the posterior of each of these very easily and separately. If you then calculate the evidence for each model, you can decide which one is best. This is much easier because you do not need a regularising function (the models are too simple to permit overfitting). The downside is that you won't truly cover the fitting space, and the functions you fit will be subjective choices.
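As a rough 1-D sketch of the first route above (the Chebyshev basis size, the noise level, the regularising weight and the toy data are all assumptions chosen only for illustration), the MAP estimate with a Gaussian likelihood and a curvature-penalty prior reduces to a regularised linear solve:

```python
# Over-complete Chebyshev basis + Gaussian likelihood + curvature penalty.
# The posterior is quadratic in the coefficients, so its maximum is the
# solution of a single regularised linear system.
import numpy as np
from numpy.polynomial import chebyshev as C

rng = np.random.default_rng(1)
x = np.linspace(-1.0, 1.0, 60)
f_obs = np.sin(3 * x) + rng.normal(0.0, 0.05, x.size)   # noisy toy 'data'

n_basis = 30                           # deliberately far longer than needed
B = C.chebvander(x, n_basis - 1)       # basis functions evaluated at the data

# Second-difference operator on a fine grid: penalises curvature of the fit.
xg = np.linspace(-1.0, 1.0, 200)
Bg = C.chebvander(xg, n_basis - 1)
D2 = np.diff(Bg, n=2, axis=0) / (xg[1] - xg[0]) ** 2

sigma, lam = 0.05, 1e-6                # noise s.d. and regularising parameter
A = B.T @ B / sigma**2 + lam * D2.T @ D2
b = B.T @ f_obs / sigma**2
coef = np.linalg.solve(A, b)           # MAP coefficients

fit = B @ coef                         # regularised fit at the data points
print(np.max(np.abs(fit - np.sin(3 * x))))   # error roughly at the noise level
```

The regularising parameter lam is fixed by hand here; inferring it from the data, as described above, is the more principled route.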
 
MikeCrowe
Posts: 0
Joined: January 16th, 2006, 8:20 am

Function fitting.

July 17th, 2006, 9:22 am

The third method I gave is very useful if you have multiple theoretical models you could fit the data to. Failing that, you could try something like: linear, exponential, log, linearithmic, quadratic, cubic, sinusoidal, power. Those are all very quick to fit to data points, even in 4D, and you can then calculate the evidence fairly easily.

One final tip - log everything first. By that I mean work with the log of the posterior, so that it becomes log(likelihood) + log(prior) - log(evidence). The log of these functions is usually a sum, so differentiating becomes easier. Rely on the fact that, because the log is monotonic, the maximum (or minimum) of the log of a function sits at the maximum (or minimum) of the function itself.
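A minimal sketch of that recipe with a few of the candidates above (the toy data are made up, and BIC is used only as a crude large-sample stand-in for the log evidence):

```python
# Fit several simple candidate forms and compare them with BIC,
# which approximates -2 * log evidence up to constants in large samples.
import numpy as np
from scipy.optimize import curve_fit

candidates = {
    'linear':      lambda x, a, b: a * x + b,
    'quadratic':   lambda x, a, b, c: a * x**2 + b * x + c,
    'exponential': lambda x, a, b: a * np.exp(b * x),
    'power':       lambda x, a, b: a * x**b,
}

rng = np.random.default_rng(2)
x = np.linspace(0.5, 3.0, 40)
y = 2.0 * x**1.7 + rng.normal(0.0, 0.1, x.size)    # toy data: a power law

n = x.size
for name, f in candidates.items():
    k = f.__code__.co_argcount - 1                 # number of fit parameters
    try:
        p, _ = curve_fit(f, x, y, p0=np.ones(k), maxfev=10000)
    except RuntimeError:
        continue                                   # fit failed to converge
    rss = np.sum((y - f(x, *p)) ** 2)
    bic = n * np.log(rss / n) + k * np.log(n)      # lower is better
    print(f'{name:12s}  BIC = {bic:8.1f}')
```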
 
zeta
Posts: 26
Joined: September 27th, 2005, 3:25 pm
Location: Houston, TX

Function fitting.

July 18th, 2006, 12:10 pm

nice MikeC
 
Bazman2
Posts: 1
Joined: January 28th, 2004, 2:22 pm

Function fitting.

July 29th, 2009, 7:25 am

Hi Mike Crowe,

I am looking to fit a function and am very interested in the curve-fitting procedures you mention. Can you suggest some good references for background reading on these techniques?

Basically, I have already done a PCA analysis and have three basic modes of movement for my curve: parallel, steepener, butterfly. Now I want a method to calibrate these moves to different sets of data. The aim is to model the curve using a weighted linear combination of these functions; the curve-fitting process is in effect to find these weights.

Any references you can give would be very welcome.

Baz
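For what it's worth, if the three modes are stored as vectors sampled at the same tenors as an observed curve move, the weights of the linear combination come from an ordinary least-squares solve. A minimal sketch (the mode shapes and data below are invented purely for illustration):

```python
# Calibrate weights w so that observed_move ~ w1*parallel + w2*steepener + w3*butterfly.
import numpy as np

tenors = np.linspace(1.0, 10.0, 10)
parallel = np.ones_like(tenors)                        # level shift
steepener = (tenors - tenors.mean()) / tenors.std()    # slope-like mode
butterfly = steepener**2 - 1.0                         # curvature-like mode

# Hypothetical observed move: a known combination plus a little noise.
rng = np.random.default_rng(3)
observed_move = (0.8 * parallel - 0.3 * steepener + 0.1 * butterfly
                 + rng.normal(0.0, 0.02, tenors.size))

M = np.column_stack([parallel, steepener, butterfly])  # modes as columns
weights, *_ = np.linalg.lstsq(M, observed_move, rcond=None)
print(weights)   # should recover roughly (0.8, -0.3, 0.1)
```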
 
bojan
Posts: 0
Joined: August 8th, 2008, 5:35 am

Function fitting.

July 29th, 2009, 1:29 pm

I do not think Mike visits the forums very often... But I suspect I can guess what direction he would recommend: you should start by reading http://www.inference.phy.cam.ac.uk/itprnn/book.pdf
 
Bazman2
Posts: 1
Joined: January 28th, 2004, 2:22 pm

Function fitting.

July 29th, 2009, 8:15 pm

Thanks for this, it should keep me pretty busy!
 
trippel
Posts: 0
Joined: May 27th, 2009, 11:17 am

Function fitting.

July 30th, 2009, 5:04 pm

I took a course in machine learning, and we had the classic Bishop text as the course reference.