Serving the Quantitative Finance Community

 
JosephFrank
Topic Author
Posts: 1
Joined: June 13th, 2003, 3:41 pm

Out of memory - Matlab

October 4th, 2003, 10:29 pm

Hello, when I was trying to build a 126600x633 matrix I received an out of memory message. My PC is a Pentium 4 with plenty of free disk space, and it has 1 GB of RAM. I am puzzled. This is the smallest file in my datasets, and I have files 30 times larger than this one. Does this mean that I can't use MATLAB for my research, or is there a problem that needs to be fixed? Any help would be appreciated. JF
 
mghiggins
Posts: 0
Joined: November 3rd, 2001, 1:38 pm

Out of memory - Matlab

October 4th, 2003, 11:59 pm

126,600 x 633 = 80.1MM numbers. If each one's a double, it takes up 8 bytes. That means your matrix will take up roughly 640 MB. Depending on how MATLAB represents a matrix internally, it could be significantly more than this. Not surprising you're running out of RAM (which is all that matters here). Something 30x larger will obviously be totally out of range of your machine's memory. It's not a problem with MATLAB - it's a problem with the size of your matrix, I'm afraid.

You could write a custom program that loads only small bits of the matrix into memory at any one time, I suppose. Or perhaps you can tell MATLAB to reduce the precision with which it stores each number in a matrix? I don't know MATLAB well enough to say.

Anyway, that's one big honking matrix. You'll have a tough time finding anything that can deal with it in a reasonable manner.
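The arithmetic above is easy to check directly (a quick sketch in Python; the dimensions are the ones from the post, and an 8-byte double per element is assumed):

```python
rows, cols = 126600, 633          # dimensions from the original post
elements = rows * cols            # ~80.1 million numbers
bytes_needed = elements * 8       # 8 bytes per double-precision value

print(elements)                   # 80137800
print(bytes_needed / 1e6)         # ~641 MB, before any per-matrix overhead
```

And that figure is just the raw data: whatever bookkeeping the interpreter adds comes on top.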
 
ebifry
Posts: 0
Joined: December 9th, 2001, 8:34 am

Out of memory - Matlab

October 5th, 2003, 11:13 am

Quote, originally posted by JosephFrank: "When I was trying to build a 126600x633 matrix I received an out of memory message."

Is the matrix sparse? If it is, have a look at using sparse in MATLAB, and you might be able to do what you want to do. Otherwise, buy lots of memory!

Cheers
Tony
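To illustrate the idea (a sketch using SciPy's sparse matrices, which play the role of MATLAB's sparse(); the matrix and its nonzero entries are made up):

```python
import numpy as np
from scipy import sparse

# Hypothetical mostly-zero matrix: dense storage pays for every element,
# sparse storage pays only for the nonzeros (plus index overhead).
dense = np.zeros((1000, 1000))
dense[0, 0] = 1.0
dense[42, 17] = 3.5
S = sparse.csr_matrix(dense)

print(dense.nbytes)   # 8,000,000 bytes for the dense doubles
print(S.nnz)          # only 2 values actually stored
```

If the matrix is mostly nonzero, though, sparse storage adds overhead rather than saving memory.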
 
DominicConnor
Posts: 41
Joined: July 14th, 2002, 3:00 am

Out of memory - Matlab

October 6th, 2003, 8:13 am

mghiggins' calculation represents a lower bound. In a simple compiled language like C++, that would also be close to the final number. I don't know the internals of MATLAB, but there will be overhead, and Tony's tip about sparse leads us to look at other problems. The worst case of a sparse array is one that is actually full of data. You may be talking about a pointer of 4 bytes for every value, plus memory management overhead which can easily be the same. Thus an array declared as "sparse" but actually used as fully populated may be more than twice as big as it would be as raw data.

The memory management in this sort of system is non-trivial, and you may well be simply hitting some sort of internal limit. The first such limit is NT itself. Depending upon your version and settings, you have 1-3 GB of address space for all data. This limit is independent of hardware, and it has to hold your 640 MB of numbers, MATLAB's own data, some of NT's, and any other apps running.

To even get to this limit, you must ensure that your swap file is big enough. You can set it in the System applet in the Control Panel; you'll find it in the Performance dialog under Advanced. More RAM may make your app go faster, but if making the swap file bigger doesn't help, then more RAM won't fix it.

An interesting test would be to create an array of the smallest item you can. By binary search you can find the maximum-sized array you can have of a given type of variable. If you get the same result regardless of the size of the element type, then you've hit some internal MATLAB limit that probably can't be fixed.
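The binary-search probe described above can be sketched like this (illustrative Python; `fake_alloc` is a made-up stand-in for a real allocator - in MATLAB you would wrap something like zeros(n,1) in try/catch instead):

```python
def max_alloc(alloc, lo=0, hi=2**40):
    """Binary-search the largest n for which alloc(n) succeeds."""
    while lo < hi:
        mid = (lo + hi + 1) // 2
        try:
            alloc(mid)        # succeeds: the answer is at least mid
            lo = mid
        except MemoryError:   # fails: the answer is below mid
            hi = mid - 1
    return lo

# Simulated allocator: pretend any allocation above 10^9 elements fails.
def fake_alloc(n):
    if n > 10**9:
        raise MemoryError

print(max_alloc(fake_alloc))  # 1000000000
```

If the limit found this way is the same for 1-byte and 8-byte element types, it is a count limit (some internal cap), not a byte limit.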
 
richg
Posts: 0
Joined: October 2nd, 2002, 1:52 pm

Out of memory - Matlab

October 6th, 2003, 9:40 am

If you don't need the MATLAB desktop and don't use the MATLAB Java-related stuff, you could try running MATLAB without the Java Virtual Machine (i.e. using the -nojvm option when starting MATLAB) - this will help you out with memory.

richg.
 
matthewcroberts
Posts: 1
Joined: October 18th, 2001, 7:52 pm

Out of memory - Matlab

October 6th, 2003, 2:47 pm

Joseph,

In addition to all of the comments above, even though your application may, in theory, be able to use 2 GB of memory (the theoretical limit for MATLAB under Win32), there are a couple of other constraints to remember. First, MATLAB itself takes some resources, which can be reduced by launching it with the -nojvm option, as suggested by richg; this will save you about 80-100 MB. Second, MATLAB requires that variables be stored in contiguous memory, so in practice, even if you had 2 GB of RAM, you'd have a tough time getting a 1.7 or 1.8 GB variable created.

Most of these constraints are not necessarily MATLAB-specific. You will encounter great difficulty utilizing a matrix with on the order of 10^8 elements on a PC. Certain software is written specifically to handle very large datasets, such as SAS, which accomplishes it by keeping large datasets on disk.

Depending on what you want to do with it, you can, however, achieve the same thing in MATLAB by reading in your data set, breaking it up into smaller submatrices, and storing them as .MAT files on disk. Then, for your matrix operations, just treat the files as the elements of a partitioned matrix. It is certainly a bit more tedious, but it can be made pretty straightforward with some creative use of higher data types (cell arrays, etc.) and overloaded operators.

Matt.
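The partitioned-matrix idea can be sketched as follows (illustrative Python, with .npy files standing in for .MAT files; the matrix and block sizes are made up and tiny so the example runs quickly):

```python
import os
import tempfile
import numpy as np

# Store row blocks of a "large" matrix on disk, then compute A @ v
# one block at a time, never holding the whole matrix in memory.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 6))   # tiny stand-in for the full matrix
v = rng.standard_normal(6)

tmpdir = tempfile.mkdtemp()
paths = []
for i in range(4):                 # 4 row blocks of 10 rows each
    path = os.path.join(tmpdir, f"block{i}.npy")
    np.save(path, A[i * 10:(i + 1) * 10])
    paths.append(path)

# Multiply block by block and stitch the pieces back together.
result = np.concatenate([np.load(p) @ v for p in paths])
print(np.allclose(result, A @ v))  # True
```

Row-blocking works for matrix-vector products and elementwise operations; operations like inversion need a proper blocked algorithm on the partitioned form.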
 
mib
Posts: 1
Joined: January 29th, 2002, 3:10 pm

Out of memory - Matlab

October 7th, 2003, 7:43 am

You should also be careful with what you are doing with this matrix. MATLAB is great for the simplicity of its matrix operations, but you pay for it by not always knowing that some operations copy the matrix first. Thus, for many operations, you will need at least double the memory your matrix takes.
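The hidden-copy point is easy to demonstrate (a sketch in Python/NumPy, where the in-place versus out-of-place distinction is explicit; the arrays are made up):

```python
import numpy as np

x = np.ones(5)

# Out-of-place: allocates a second full-size array alongside x,
# so peak memory is briefly twice the size of the data.
y = x * 2
print(y is x)           # False: a new array was created

# In-place: updates x's existing buffer, no second full-size copy.
before = id(x)
x *= 2
print(id(x) == before)  # True: same buffer, modified in place
```

With a matrix near the size of available memory, the out-of-place form is exactly what triggers an out-of-memory error even though the matrix itself fits.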