August 29th, 2008, 3:26 pm
Interesting old thread... I use Python essentially exclusively where people use MATLAB and similar packages. Some comments:

I've used PyTables and found it to be very good, but the main use case is data with a lot of structure that you need to archive permanently or distribute widely. If you just have a few big arrays to dump to disk, you may as well do that directly with numpy.save/numpy.load.

Regarding large data structures, I've had no problems handling 8GB arrays on machines with 16GB of RAM. Somebody mentioned memory-mapped files as a way of dealing with truly large data sets; both numarray and numpy support this out of the box. Just do numpy.load("foo.npy", mmap_mode="r").

You can access most (all?) of QuantLib's functionality through the Python bindings. Or, if you are developing your own algorithms, it is very easy to expose them for use in Python. I use SWIG, but other choices are Boost.Python, f2py, etc.
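For exposing your own code via SWIG, a minimal interface file is enough; here is a sketch with a hypothetical one-line C function (the module and function names are made up for illustration):

```
/* example.i -- a minimal SWIG interface file */
%module example

%inline %{
/* Code in an %inline block is both compiled into the wrapper
   and exposed to Python directly. */
double call_payoff(double spot, double strike) {
    return spot > strike ? spot - strike : 0.0;
}
%}
```

Run "swig -python example.i", compile example_wrap.c into _example.so against your Python headers, and then "import example; example.call_payoff(105.0, 100.0)" works from the interpreter.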
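To make the array-dumping and memory-mapping points concrete, here is a minimal sketch (the filename "foo.npy" is just illustrative):

```python
import numpy as np

# Dump a big array to disk directly -- no PyTables needed for this case.
a = np.arange(1_000_000, dtype=np.float64)
np.save("foo.npy", a)

# Re-open it memory-mapped: only the pages you actually touch are
# paged into RAM, so this scales to arrays larger than physical memory.
m = np.load("foo.npy", mmap_mode="r")
print(m[42])     # 42.0 -- reading one element touches only one page
print(m.shape)   # (1000000,)
```

Note the keyword is mmap_mode (taking the usual "r", "r+", "w+", "c" modes), not a boolean flag.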
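As an example of driving QuantLib from the Python binding, here is a sketch that prices a plain European call with the analytic Black-Scholes engine (the dates, rates, and vol are made-up inputs, not from the thread):

```python
import QuantLib as ql  # the QuantLib-Python (SWIG-generated) bindings

today = ql.Date(29, 8, 2008)
ql.Settings.instance().evaluationDate = today

# European call, strike 100, one year to expiry
payoff = ql.PlainVanillaPayoff(ql.Option.Call, 100.0)
exercise = ql.EuropeanExercise(ql.Date(29, 8, 2009))
option = ql.VanillaOption(payoff, exercise)

# Flat market: spot 100, 5% risk-free, no dividends, 20% vol
spot = ql.QuoteHandle(ql.SimpleQuote(100.0))
rf = ql.YieldTermStructureHandle(
    ql.FlatForward(today, 0.05, ql.Actual365Fixed()))
div = ql.YieldTermStructureHandle(
    ql.FlatForward(today, 0.0, ql.Actual365Fixed()))
vol = ql.BlackVolTermStructureHandle(
    ql.BlackConstantVol(today, ql.TARGET(), 0.20, ql.Actual365Fixed()))

process = ql.BlackScholesMertonProcess(spot, div, rf, vol)
option.setPricingEngine(ql.AnalyticEuropeanEngine(process))
print(option.NPV())  # Black-Scholes price of the call
```

The pattern is the same as in the C++ library: build handles, a process, and a pricing engine, then ask the instrument for its NPV.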