Sorry -- really 6 parameters, I suppose: 4 V-process parameters + T, moneyness for the option.
So we're down to about 7 axis values per parameter.
Still, it might be an interesting comparison, maybe again using 10 values per parameter and allowing the extra time to make the million-entry table. (And we haven't even parallelized that!)
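For concreteness, here is a minimal sketch of what that Cartesian table build looks like. The pricer, the axis choices, and the ranges below are all placeholder assumptions, not anything from the thread; a real build would call an actual Heston vanilla pricer at each grid point.

```python
import itertools

def toy_price(v0, vbar, kappa, xi, T, moneyness):
    # Stand-in for a real Heston vanilla pricer; any callable of the
    # six parameters works here.
    return v0 * T + vbar * kappa + xi * moneyness

# Hypothetical axes: 4 values per parameter, 6 parameters -> 4**6 = 4096 entries.
# With 10 values per axis this becomes the million-entry (10**6) table.
axes = {
    "v0":        [0.01, 0.04, 0.09, 0.16],
    "vbar":      [0.01, 0.04, 0.09, 0.16],
    "kappa":     [0.5, 1.0, 2.0, 4.0],
    "xi":        [0.1, 0.3, 0.6, 1.0],
    "T":         [0.1, 0.5, 1.0, 2.0],
    "moneyness": [0.8, 0.9, 1.0, 1.2],
}

# Each grid point is independent, so this loop is embarrassingly parallel --
# chunking the product over a multiprocessing.Pool is the obvious first step.
table = {pt: toy_price(*pt) for pt in itertools.product(*axes.values())}

print(len(table))  # 4**6 = 4096
```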
Hi Alan, thanks, always good talking to you.
Why 6 parameters though and not 9? Namely S/K, T, r, d, v0, vBar, kappa, xi and rho. Don't we need all these to price a vanilla under Heston?
As for the 1000 secs, it's an arbitrary (curiously low) number that the student used. Since you only need to do this once, why be so stingy? What's stopping you from letting it run for a whole day or more?
So you could create 10 million entries if you want, or more. But with 9 dimensions here it would still only give you 6 points per parameter. I doubt that can cut it for any practical level of accuracy.
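The points-per-axis arithmetic is just the d-th root of the table size; a quick sanity check of the figures above:

```python
def points_per_axis(entries, dims):
    # A Cartesian grid with n points per axis in d dimensions has n**d
    # entries, so n is the d-th root of the table size (rounded to the
    # nearest whole number of points).
    return int(round(entries ** (1.0 / dims)))

print(points_per_axis(10**6, 6))  # 10 values per parameter -> million-entry table
print(points_per_axis(10**7, 9))  # ~6 points per parameter in 9 dimensions
```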
Of course there are more sophisticated ways of interpolating in high dimensions. One idea is to forgo the Cartesian grid and instead place the interpolation points à la Monte Carlo, or via quasi-random sequences. There was a relevant talk last week at the CQF conference, probably available online if you're interested; it was called something like "Alternatives to NN's in Finance".
I have no idea how these alternative interpolation methods would compare with the NNs' black magic. But they sure sounded more complicated to implement (though more "transparent"). And I may be wrong, but I got the impression that they wouldn't be an option for dimension > 10 (I wasn't paying 100% attention, I admit).
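A minimal, pure-Python sketch of the quasi-random placement idea: Halton points instead of a Cartesian grid, with inverse-distance weighting as a deliberately crude stand-in for the fancier scattered-data interpolants (RBF-type schemes). Everything here (the toy target function, the ranges, the weighting) is my own assumption for illustration, not what the talk proposed.

```python
def halton(i, base):
    # i-th element (1-indexed) of the van der Corput sequence in the given base.
    f, x = 1.0, 0.0
    while i > 0:
        f /= base
        x += f * (i % base)
        i //= base
    return x

PRIMES = [2, 3, 5, 7, 11, 13, 17, 19, 23]  # one prime base per dimension

def halton_points(n, dims):
    # Quasi-random points filling the unit cube [0,1]**dims far more evenly
    # than pseudo-random draws, with no fixed grid pitch to suffer from.
    return [[halton(i, PRIMES[d]) for d in range(dims)] for i in range(1, n + 1)]

def idw_interpolate(points, values, query, power=2.0, eps=1e-12):
    # Inverse-distance weighting: a simple scattered-data interpolant,
    # standing in here for more serious RBF / moving-least-squares methods.
    num = den = 0.0
    for p, v in zip(points, values):
        d2 = sum((a - b) ** 2 for a, b in zip(p, query))
        if d2 < eps:
            return v  # query coincides with a sample point
        w = 1.0 / d2 ** (power / 2.0)
        num += w * v
        den += w
    return num / den

# Toy 6-dimensional target standing in for the Heston pricer.
f = lambda x: sum(x)
pts = halton_points(2000, 6)
vals = [f(p) for p in pts]
q = [0.5] * 6
print(idw_interpolate(pts, vals, q))  # should land near f(q) = 3.0
```

Note the contrast with the Cartesian case: 2000 scattered points would only buy you about 3-4 points per axis on a 6-dimensional grid, whereas the quasi-random cloud spreads them over the whole box.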