That's exactly what it is. It's a log likelihood!

I think the ability (or not) of the neural network to outperform traditional econometric methods is a critically important idea. So far, I have learned that your cross-entropy performance measure is close to a likelihood in MLE, so it indeed appears possible to compare things on essentially the same basis. But I am interested in learning these new methods, not the old ones! So I am trying to do that ... slowly ... while tanning.

Thanks Alan.
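A quick numerical sketch of why the two measures line up: for categorical outputs, the average cross-entropy a network minimizes is exactly the negative mean log-likelihood of the same categorical model. The labels and predicted probabilities below are made up purely for illustration.

```python
import numpy as np

# Three observations, four classes; labels and probabilities are hypothetical.
y = np.array([0, 2, 1])
probs = np.array([
    [0.7, 0.1, 0.1, 0.1],
    [0.2, 0.2, 0.5, 0.1],
    [0.1, 0.6, 0.2, 0.1],
])

# Average cross-entropy loss, as a neural network would minimize it:
cross_entropy = -np.mean(np.log(probs[np.arange(len(y)), y]))

# Negative mean log-likelihood of the same categorical model (the MLE view):
neg_log_lik = -np.mean([np.log(probs[i, yi]) for i, yi in enumerate(y)])

# The two are identical by construction, so models trained by cross-entropy
# minimization and by maximum likelihood can be compared on the same basis.
print(np.isclose(cross_entropy, neg_log_lik))
```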
Maybe it would be an idea to start a competition if we want to get an overview of methods and their relative performance? Adding all this a-priori knowledge in some manually designed model is precisely what I want to avoid and outsource to the neural network.
In ML the two most important elements (IMO) are the "cost function" and "how to check what performance will be on unseen data". Most deep learning models work by cost minimization, and there are simple rules that tell you which cost metric to use for which type of problem. If you pick the wrong metric you'll get a wrong solution (I ran into that a couple of times). Cross-entropy is really good for classification-type problems, like "from which of these 100 bins do you think tomorrow's return will be drawn?"
Here is a nice intro
Enjoy the sun!