
MPI subarray problem

Posted: March 5th, 2010, 10:02 pm
by Dcole
Hi all. I am still working on parallelizing the red-black G.S. solver that I mentioned in another thread. I am having a problem with a little test program I am developing to help with integration into our main program. I want to be able to "extract" a sub-matrix from a larger 2D matrix, and do it as efficiently as possible w.r.t. the amount of MPI overhead.

Please see the attached code for a simple example. I am creating a 4x4 matrix, and I want to extract a 2x2 matrix from it on another processor. In this case, I want to extract the 2x2 that is in the top-left corner. This is giving me bogus results, possibly because I am declaring the datatype wrong. If you can get a version working, I'd much appreciate it.

Thanks in advance!

MPI subarray problem

Posted: March 5th, 2010, 10:03 pm
by Dcole
Please excuse any typos. I don't have a C compiler on this computer, so I had to type this in from memory!

MPI subarray problem

Posted: March 6th, 2010, 9:17 am
by richardlm
When using MPI datatypes you need to commit them before you use them (see MPI_Type_commit) and clean up afterwards (see MPI_Type_free).

Your receive structure is not correct:
1) One send should be paired with one receive.
2) You have not allocated memory correctly for your receive. You are using the same datatype for the send and the receive, so it writes to exactly the same memory positions (relative to the address specified in the first argument of MPI_Recv). [Note: you don't have to use the same datatype for the MPI_Recv; just consider what you are receiving to be a stream of doubles and put them where you want.]
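A rough sketch of what I mean, assuming a contiguous 4x4 array on rank 0, a plain 2x2 buffer on rank 1, and a send-side type built with MPI_Type_vector (the names and parameters here are just one possibility, not your attached code):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    if (rank == 0) {
        double big[4][4];              /* contiguous 4x4 source matrix */
        MPI_Datatype sub;

        for (int i = 0; i < 4; i++)
            for (int j = 0; j < 4; j++)
                big[i][j] = 10.0 * i + j;

        /* top-left 2x2: 2 blocks of 2 doubles, stride of 4 doubles */
        MPI_Type_vector(2, 2, 4, MPI_DOUBLE, &sub);
        MPI_Type_commit(&sub);         /* commit before use */

        MPI_Send(&big[0][0], 1, sub, 1, 0, MPI_COMM_WORLD);

        MPI_Type_free(&sub);           /* clean up afterwards */
    } else if (rank == 1) {
        double small[2][2];            /* 4 packed doubles is enough here */

        /* one matching receive; treat the incoming data as a stream
           of 4 doubles and place them wherever you want */
        MPI_Recv(&small[0][0], 4, MPI_DOUBLE, 0, 0,
                 MPI_COMM_WORLD, MPI_STATUS_IGNORE);

        printf("%g %g\n%g %g\n",
               small[0][0], small[0][1], small[1][0], small[1][1]);
    }

    MPI_Finalize();
    return 0;
}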

MPI subarray problem

Posted: March 6th, 2010, 9:19 am
by mblatt
Hi.

Why does your matrix MPI_Datatype have four blocks and not just two? Shouldn't this be MPI_Type_vector(2, 2, 4, MPI_DOUBLE, &matrix)?

At the receiving end your allocated memory is not sufficient. You only allocate a 2x2 matrix, but MPI expects a 4x4 matrix due to the defined custom datatype. Either provide a sufficiently big receive buffer or use a different MPI_Datatype for the receive.

Additionally, I would be careful about using MPI_Type_vector with an array of arrays, as it is only intended to be used with plain arrays. There might be additional padding between your matrix rows on some platforms.

Hint: You might want to get acquainted with memory debuggers like valgrind to find these problems yourself next time.
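To make the first alternative concrete (keeping the same datatype on the receive side, but with a big enough buffer), a sketch of the receive end could look like the following; the function name and buffer are just illustrative, and the buffer is assumed to be a contiguous 4x4 array:

#include <mpi.h>

/* Receive the 2x2 submatrix that was sent as one element of the
   2x2-out-of-4x4 vector type, using the same datatype here.  The data
   lands in the top-left corner of a full 4x4 buffer; the other 12
   entries of recvbuf are left untouched. */
void recv_into_full_buffer(double recvbuf[4][4], int src, int tag)
{
    MPI_Datatype sub;

    MPI_Type_vector(2, 2, 4, MPI_DOUBLE, &sub);
    MPI_Type_commit(&sub);

    MPI_Recv(&recvbuf[0][0], 1, sub, src, tag,
             MPI_COMM_WORLD, MPI_STATUS_IGNORE);

    MPI_Type_free(&sub);
}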

MPI subarray problem

Posted: March 6th, 2010, 9:53 am
by richardlm
Quote, originally posted by mblatt:
"[...] Additionally, I would be careful about using MPI_Type_vector with an array of arrays, as it is only intended to be used with plain arrays. There might be additional padding between your matrix rows on some platforms."

To avoid this you could create a struct type (MPI_Type_struct) of contiguous types (MPI_Type_contiguous).

Quote, originally posted by mblatt:
"[...] Hint: You might want to get acquainted with memory debuggers like valgrind to find these problems yourself next time."

Yep, valgrind is great.
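For example, if the matrix is really an array of separately allocated rows (a double **), something along these lines should work. MPI_Type_create_struct and MPI_Get_address are the newer names for MPI_Type_struct and MPI_Address, and the function name is just for illustration:

#include <mpi.h>

/* Build a datatype describing the first 2 entries of each of the first
   2 rows of an array-of-pointers matrix.  Each row becomes a contiguous
   block of 2 doubles; absolute addresses are used, so the type is meant
   to be used with MPI_BOTTOM as the buffer argument. */
MPI_Datatype topleft_2x2_type(double **a)
{
    MPI_Datatype row_t, sub_t, types[2];
    int          blocklens[2] = { 1, 1 };
    MPI_Aint     displs[2];

    MPI_Type_contiguous(2, MPI_DOUBLE, &row_t);
    types[0] = row_t;
    types[1] = row_t;

    MPI_Get_address(a[0], &displs[0]);   /* absolute address of each row */
    MPI_Get_address(a[1], &displs[1]);

    MPI_Type_create_struct(2, blocklens, displs, types, &sub_t);
    MPI_Type_commit(&sub_t);
    MPI_Type_free(&row_t);               /* sub_t keeps what it needs */

    return sub_t;
}

The send would then be MPI_Send(MPI_BOTTOM, 1, sub_t, dest, tag, comm), and the receiver can still treat the incoming data as four plain doubles. Remember to MPI_Type_free the returned type when you are done with it.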

MPI subarray problem

Posted: March 6th, 2010, 11:21 pm
by Dcole
I will check these suggestions out when I get to work on Monday. I don't have the MIPS compiler here that I was using to test this. Sounds like using a structure is going to be the way to go.

MPI subarray problem

Posted: March 7th, 2010, 6:30 am
by mblatt
Out of curiosity: Why do you need to extract a submatrix to send it to another processor when doing red-black Gauss-Seidel?

MPI subarray problem

Posted: March 7th, 2010, 3:08 pm
by Dcole
I am fitting this into a previously, and poorly, written G.S. solver. It actually works in a 3D domain, so I am red-blacking each "slice" for now, then stuffing it back into the corresponding slice of the full 3D domain. Is there perhaps a better way?
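(By "red-blacking a slice" I just mean something along these lines, written for a standard 5-point stencil; the slice size and all names here are hypothetical, not from the real solver.)

#define N 16   /* slice size, purely for illustration */

/* One Gauss-Seidel half-sweep over the interior of a single 2D slice.
   color = 0 updates the points with even i+j ("red"), color = 1 the
   points with odd i+j ("black"); u is the unknown, rhs the right-hand
   side of the Poisson problem, h the grid spacing. */
void rb_sweep(double u[N][N], const double rhs[N][N], double h, int color)
{
    for (int i = 1; i < N - 1; i++)
        for (int j = 1 + (i + color + 1) % 2; j < N - 1; j += 2)
            u[i][j] = 0.25 * (u[i-1][j] + u[i+1][j] +
                              u[i][j-1] + u[i][j+1] - h * h * rhs[i][j]);
}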