Serving the Quantitative Finance Community

 
rmb623
Topic Author
Posts: 0
Joined: March 16th, 2009, 3:02 am

Neural Nets

September 12th, 2010, 1:50 am

Why are neural nets such a thing of the past? What is the knock on them?
 
acastaldo
Posts: 14
Joined: October 11th, 2002, 11:24 pm

Neural Nets

September 12th, 2010, 3:32 am

They forecast very well in-sample and very poorly out-of-sample (overfitting).
 
Beachcomber
Posts: 2
Joined: May 25th, 2004, 5:56 pm

Neural Nets

September 12th, 2010, 1:41 pm

Many neural networks are set up more to replicate than to predict, but I've sort of wondered about the overfitting aspect. Isn't that something that is correctable? In neural-net speak, it is a matter of overtraining: simply limiting the training should at least help the overfitting problem. It seems there are various related ideas that could be useful in forecasting that have also been thrown out with the bathwater. I am wondering if they will come back into fashion some day soon.
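The "limit the training" idea is usually implemented as early stopping: monitor error on a held-out validation set and stop when it stops improving. A minimal numpy sketch (the data, network size, learning rate, and patience threshold here are all made-up illustrations, not a recommended setup):

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic nonlinear data: y = sin(3x) + noise, split into train/validation
x = rng.uniform(-1, 1, (80, 1))
y = np.sin(3 * x) + rng.normal(0, 0.2, (80, 1))
x_tr, y_tr, x_va, y_va = x[:60], y[:60], x[60:], y[60:]

# One hidden layer of tanh units, trained by plain gradient descent
H = 30
W1 = rng.normal(0, 1.0, (1, H)); b1 = np.zeros(H)
W2 = rng.normal(0, 0.1, (H, 1)); b2 = np.zeros(1)

def forward(x):
    h = np.tanh(x @ W1 + b1)
    return h, h @ W2 + b2

lr, patience = 0.05, 200
best_va, best_step = np.inf, 0
for step in range(5000):
    h, pred = forward(x_tr)
    err = pred - y_tr
    # Backpropagation for the two layers
    gW2 = h.T @ err / len(x_tr); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h ** 2)
    gW1 = x_tr.T @ dh / len(x_tr); gb1 = dh.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1
    va = np.mean((forward(x_va)[1] - y_va) ** 2)
    if va < best_va:
        best_va, best_step = va, step
    elif step - best_step > patience:
        break  # validation error stopped improving: stop training early
# (A fuller version would also restore the weights saved at best_step.)
```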
 
Traden4Alpha
Posts: 3300
Joined: September 20th, 2002, 8:30 pm

Neural Nets

September 12th, 2010, 5:03 pm

Limiting the training doesn't solve the overfitting problem, because it's a model-complexity problem, not a data problem. If the neural net is capable of expressing a function that is more complex than the true function being modeled, then the NN will find, in the noise and ill-sampling of the training data, some complex pattern that isn't there.

It's like feeding noisy but linear data into a 100th-order polynomial regression. The polynomial regression WILL find non-zero high-order coefficients, and some of them (probably about 5 of them) will be statistically significant at the 5% level.
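That polynomial analogy is easy to reproduce. A quick sketch (degree 15 rather than 100, so the least-squares problem stays numerically well-conditioned; the data and seed are made up): fit polynomials of increasing degree to noisy linear data and compare in-sample against out-of-sample error.

```python
import numpy as np

rng = np.random.default_rng(0)
# Noisy but truly linear data: y = 2x + noise
x_train = np.linspace(-1, 1, 50)
y_train = 2 * x_train + rng.normal(0, 0.5, x_train.size)
# Fresh draws from the same linear process for out-of-sample evaluation
x_test = rng.uniform(-1, 1, 50)
y_test = 2 * x_test + rng.normal(0, 0.5, x_test.size)

def fit_and_mse(degree):
    """Fit a polynomial of the given degree; return (in-sample, out-of-sample) MSE."""
    coeffs = np.polyfit(x_train, y_train, degree)
    mse_in = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    mse_out = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    return mse_in, mse_out

for d in (1, 15):
    mse_in, mse_out = fit_and_mse(d)
    print(f"degree {d:2d}: in-sample MSE {mse_in:.3f}, out-of-sample MSE {mse_out:.3f}")
```

The in-sample error of the high-degree fit is guaranteed to be at least as low as the linear fit's (it nests it), which is exactly why in-sample fit is such a misleading yardstick.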
 
Beachcomber
Posts: 2
Joined: May 25th, 2004, 5:56 pm

Neural Nets

September 12th, 2010, 6:46 pm

Point taken. But that is kind of the universal question of forecasting: what is predictable? What isn't? What's spurious?

Personally, I have only used neural networks for next-day forecasting (short term for my market), where I expect there to be highly nonlinear behavior, and I've had some success. I would stay far away from ANNs if I suspected linear behavior. I'm guessing that the pattern-recognition guys and the data-mining guys have to use ANNs at some point in the process.
 
spv205
Posts: 1
Joined: July 14th, 2002, 3:00 am

Neural Nets

September 12th, 2010, 11:17 pm

The problem with neural nets is that they were developed by computer scientists rather than statisticians, so all the basic theory and practice of statistics was ignored. [E.g. your mention of overtraining: the fact that the regression has not converged is a good thing!? You could do linear regression by gradient descent too, and talk about "overtraining".] A neural net is just a form of non-linear regression. But it was marketed as handling noisy data ("like the brain", etc.) when clearly the noisier your data, the simpler your model should be.
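The aside in brackets is worth making concrete: run to convergence, gradient descent on squared error gives exactly the closed-form least-squares answer, so stopping it early just means reporting an unconverged regression. A small sketch (synthetic data; the coefficients and step count are made up):

```python
import numpy as np

rng = np.random.default_rng(2)
# Design matrix with intercept column, plus noisy linear response
X = np.c_[np.ones(100), rng.normal(size=(100, 2))]
y = X @ np.array([1.0, 2.0, -0.5]) + rng.normal(0, 0.1, 100)

# Closed-form least-squares solution
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]

# The same problem solved by plain gradient descent on mean squared error
beta = np.zeros(3)
lr = 0.1
for _ in range(5000):
    beta -= lr * X.T @ (X @ beta - y) / len(y)

print(np.max(np.abs(beta - beta_ols)))  # essentially zero at convergence
```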
 
Beachcomber
Posts: 2
Joined: May 25th, 2004, 5:56 pm

Neural Nets

September 13th, 2010, 2:04 am

Quote, originally posted by spv205:
"The problem with neural nets is that they were developed by computer scientists rather than statisticians, so all the basic theory and practice of statistics was ignored. [E.g. your mention of overtraining: the fact that the regression has not converged is a good thing!? You could do linear regression by gradient descent too, and talk about "overtraining".] A neural net is just a form of non-linear regression. But it was marketed as handling noisy data ("like the brain", etc.) when clearly the noisier your data, the simpler your model should be."

I agree. This is why I think that ANNs replicate patterns better than they forecast. My point (that I never actually got around to making) was that we knew all this 10 or 15 years ago. I am just a little surprised that there hasn't been more statistical rigor added to the methodology in a decade. For all the faults of ANNs, they do have positives that are worth preserving.
 
Beachcomber
Posts: 2
Joined: May 25th, 2004, 5:56 pm

Neural Nets

September 13th, 2010, 2:07 am

Sorry... "pursuing", not "preserving".
 
crmorcom
Posts: 0
Joined: April 9th, 2008, 4:09 pm

Neural Nets

September 13th, 2010, 2:43 pm

Pattern replication can be very useful, if there is a pattern to replicate. Neural-net methods have many of the same problems that any "rigorous" statistical model does: if out-of-sample is not the same as in-sample, you are screwed, no matter how parsimonious and rigorous your model is.

I agree with Beachcomber to some extent: if you have a stable non-linear relationship, NN methods can be quite helpful, because they do curve fitting in a quite compact and generalizable way. The overfitting is also something that you can control: use fewer units and/or fewer layers, or restrict the weights. With a NN, you always know how many free parameters you have.

So long as you remember that NN methods are curve fitting, nothing more and nothing less, then you can use them safely. If you are constructing any model with, say, 100 free parameters and 120 data points, you are just a bad person, no matter whether you are doing a linear regression, a neural net, or reading tea leaves.
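"You always know how many free parameters you have" is literal for a fully connected net: it is just weights plus biases, layer by layer. A hypothetical helper (the function name and the example sizes are mine):

```python
def nn_param_count(layer_sizes):
    """Free parameters in a fully connected feed-forward net.

    Each layer contributes (inputs + 1 bias) * outputs parameters.
    """
    return sum((a + 1) * b for a, b in zip(layer_sizes, layer_sizes[1:]))

# 5 inputs, one hidden layer of 10 units, 1 output:
print(nn_param_count([5, 10, 1]))  # 71

# Even a one-input net with 33 hidden units already carries 100 free
# parameters, the kind of count to weigh against your 120 data points.
print(nn_param_count([1, 33, 1]))  # 100
```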
 
pnrodriguez
Posts: 1
Joined: December 19th, 2008, 1:12 pm

Neural Nets

September 14th, 2010, 12:04 pm

Quote, originally posted by spv205:
"The problem with neural nets is that they were developed by computer scientists rather than statisticians, so all the basic theory and practice of statistics was ignored."

Neural networks and projection pursuit regression have many similarities. In fact, a NN with one hidden layer has exactly the same "form" as a projection pursuit model.
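For reference, the correspondence written out (notation mine): a one-hidden-layer net is projection pursuit regression with the ridge functions restricted to scaled, shifted sigmoids.

```latex
\hat f_{\mathrm{NN}}(x) \;=\; \beta_0 + \sum_{j=1}^{M} \beta_j\,\sigma\!\left(a_j^{\top} x + b_j\right),
\qquad
\hat f_{\mathrm{PPR}}(x) \;=\; \sum_{j=1}^{M} g_j\!\left(\omega_j^{\top} x\right)
```

Taking $g_j(t) = \beta_j\,\sigma(t + b_j)$ (absorbing $\beta_0$ into one term) recovers the NN from the PPR form; PPR is the more flexible of the two, since it estimates each ridge function $g_j$ nonparametrically rather than fixing it to a sigmoid.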