Neural Nets
Posted: September 12th, 2010, 1:50 am
by rmb623
Why are neural nets such a thing of the past? What is the knock on them?
Neural Nets
Posted: September 12th, 2010, 3:32 am
by acastaldo
They forecast very well in-sample and very poorly out-of-sample (overfitting).
Neural Nets
Posted: September 12th, 2010, 1:41 pm
by Beachcomber
Many neural networks are set up more to replicate than to predict, but I've sort of wondered about the overfitting aspect. Isn't that something that is correctable? In neural-net speak, it is a matter of overtraining. Simply limiting the training should at least help the overfitting problem.

It seems there are various related ideas that could be useful in forecasting that have also been thrown out with the bath water. I am wondering if they will come back into fashion some day soon.
Neural Nets
Posted: September 12th, 2010, 5:03 pm
by Traden4Alpha
Limiting the training doesn't solve the overfitting problem because it's a model-complexity problem, not a data problem. If the neural net is capable of expressing a function more complex than the true function being modeled, then the NN will find, in the noise and ill-sampling of the training data, some complex pattern that isn't there.

It's like feeding noisy but linear data into a 100th-order polynomial regression. The polynomial regression WILL find non-zero high-order coefficients, and some of them (probably about 5) will appear statistically significant at the 5% level.
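The polynomial point above is easy to demonstrate. The sketch below is not anyone's actual code from this thread: it uses degree 25 rather than 100 (a raw degree-100 fit is numerically hopeless in floating point) and a Chebyshev basis for conditioning, but the idea is the same: fit a high-order polynomial to truly linear noisy data and count how many high-order terms pass a naive t-test.

```python
import numpy as np
from numpy.polynomial.chebyshev import chebvander

rng = np.random.default_rng(0)
n, degree = 200, 25  # degree 25 stands in for the "100th-order" example

x = np.linspace(-1.0, 1.0, n)
y = 2.0 * x + rng.normal(scale=0.5, size=n)  # truly linear + noise

# Chebyshev design matrix: column 0 is the constant, column 1 is linear
X = chebvander(x, degree)

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
sigma2 = resid @ resid / (n - X.shape[1])
cov = sigma2 * np.linalg.inv(X.T @ X)
t_stats = beta / np.sqrt(np.diag(cov))

# Count purely spurious "significant" terms beyond the linear one
spurious = int(np.sum(np.abs(t_stats[2:]) > 1.96))
print(f"high-order terms passing |t| > 1.96: {spurious}")
```

With 24 high-order terms tested at the 5% level, you expect roughly one false positive per fit on average; rerun with different seeds and different terms light up each time, which is exactly the pattern-in-noise problem described above.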
Neural Nets
Posted: September 12th, 2010, 6:46 pm
by Beachcomber
Point taken. But that is kind of the universal question of forecasting: what is predictable? What isn't? What's spurious?

Personally, I have only used neural networks for next-day forecasting (short term for my market), where I expect highly nonlinear behavior, and I have had some success. I would stay far away from ANNs if I suspected linear behavior. I'm guessing that the pattern-recognition guys and the data-mining guys have to use ANNs at some point in the process.
Neural Nets
Posted: September 12th, 2010, 11:17 pm
by spv205
The problem with neural nets is that they were developed by computer scientists rather than statisticians, so all the basic theory and practice of statistics was ignored. [E.g. your mention of overtraining: the fact that the regression has not converged is a good thing!? You could do linear regression by gradient descent too, and talk about overtraining!]

A neural net is just a form of non-linear regression. But it was marketed as handling noisy data ("like the brain", etc.) when clearly the noisier your data, the simpler your model should be.
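The bracketed aside can be made concrete: you can "train" ordinary least squares by gradient descent, and then "limiting the training" just means stopping before the regression has converged. A minimal sketch (my own toy example, not from the thread):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
x = rng.normal(size=n)
y = 3.0 * x + 1.0 + rng.normal(scale=0.5, size=n)

X = np.column_stack([np.ones(n), x])  # intercept + slope
w = np.zeros(2)                       # "untrained" starting point
lr = 0.1

snapshots = {}
for step in range(1, 501):            # "training" = gradient descent on MSE
    grad = 2.0 * X.T @ (X @ w - y) / n
    w = w - lr * grad
    if step in (5, 500):
        snapshots[step] = w.copy()

w_ols = np.linalg.lstsq(X, y, rcond=None)[0]  # closed-form least squares
print("stopped early:", snapshots[5])     # an unconverged regression
print("converged    :", snapshots[500])   # matches the closed form
print("closed form  :", w_ols)
```

Stopping at step 5 gives coefficients shrunk toward zero rather than the least-squares answer, which is the sense in which "overtraining" talk just relabels non-convergence (early stopping does act as a crude regularizer, but that is a statement about regularization, not about the brain).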
Neural Nets
Posted: September 13th, 2010, 2:04 am
by Beachcomber
Quote, originally posted by spv205: "The problem with neural nets is that they were developed by computer scientists rather than statisticians, so all the basic theory and practice of statistics was ignored. [...] A neural net is just a form of non-linear regression. But it was marketed as handling noisy data ('like the brain', etc.) when clearly the noisier your data, the simpler your model should be."

I agree. This is why I think that ANNs replicate patterns better than they forecast. My point (which I never actually got around to making) was that we knew all this 10 or 15 years ago. I am just a little surprised that there hasn't been more statistical rigor added to the methodology in a decade. For all the faults of ANNs, they do have positives that are worth pursuing.
Neural Nets
Posted: September 13th, 2010, 2:43 pm
by crmorcom
Pattern-replication can be very useful, if there is a pattern to replicate. Neural-net methods have many of the same problems that any "rigorous" statistical model does: if OOS is not the same as IS, you are screwed, no matter how parsimonious and rigorous your model is.

I agree with Beachcomber to some extent: if you have a stable non-linear relationship, NN methods can be quite helpful, because they do curve fitting in a quite compact and generalizable way.

The overfitting is also something that you can control: use fewer units and/or fewer layers, or restrict the weights. With a NN, you always know how many free parameters you have. So long as you remember that NN methods are curve fitting - nothing more and nothing less - then you can use them safely. If you are constructing any model with, say, 100 free parameters and 120 data points, you are just a bad person, no matter whether you are doing a linear regression, a neural net, or reading tea-leaves.
Neural Nets
Posted: September 14th, 2010, 12:04 pm
by pnrodriguez
Quote, originally posted by spv205: "The problem with neural nets is that they were developed by computer scientists rather than statisticians, so all the basic theory and practice of statistics was ignored."

Neural networks and Projection Pursuit Regression have many similarities. In fact, a NN with one hidden layer has exactly the same "form" as a projection pursuit model.
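The formal similarity is worth writing out. Both models express the response as a sum of ridge functions of linear projections of the inputs; the difference is that projection pursuit regression estimates each ridge function nonparametrically, while the neural net fixes a single activation (e.g. a sigmoid) and fits only the weights. In standard notation (the symbols below are mine, not from the thread):

```latex
% One-hidden-layer neural network with M hidden units:
f_{\mathrm{NN}}(x) = \beta_0 + \sum_{j=1}^{M} \beta_j \, g\!\left(w_j^\top x + b_j\right),
\qquad g \text{ fixed, e.g. } g(u) = \frac{1}{1 + e^{-u}}

% Projection pursuit regression with M terms:
f_{\mathrm{PPR}}(x) = \sum_{j=1}^{M} g_j\!\left(w_j^\top x\right),
\qquad g_j \text{ estimated nonparametrically}
```

Setting each $g_j(u) = \beta_j\, g(u + b_j)$ recovers the network from the PPR form, which is the sense in which the two have exactly the same "form".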