Error while installing G4mpi library

Hello, everyone.
I’m trying to run the MPI example exMPI01 on a cluster, so first I tried to install the G4mpi library on top of Geant4.10.03.p03.
When I install it following the instructions from http://www.apc.univ-paris7.fr/~franco/g4doxy4.10/html/md___volumes__work_geant_geant4_810_800_8p01_examples_extended_parallel__m_p_i__r_e_a_d_m_e.html, the configuration step completes fine, but the build fails at 71% of make with the errors shown in the attached picture and in the listing below:

[ 71%] Building CXX object CMakeFiles/G4mpi.dir/src/G4MPIscorerMerger.cc.o
/mnt/pool/4/rfibragimov/test/MPI/source/src/G4MPIscorerMerger.cc: In member function ‘void {anonymous}::MPIStatDouble::Pack(void*, int, int*, MPI::Intracomm&) const’:
/mnt/pool/4/rfibragimov/test/MPI/source/src/G4MPIscorerMerger.cc:48:65: error: invalid conversion from ‘const void*’ to ‘void*’ [-fpermissive]
MPI_Pack(&m_n,1,MPI::INT,buffer,bufferSize,position,comm);
^
/mnt/pool/4/rfibragimov/test/MPI/source/src/G4MPIscorerMerger.cc:50:69: error: invalid conversion from ‘const void*’ to ‘void*’ [-fpermissive]
MPI_Pack(&data,5,MPI::DOUBLE,buffer,bufferSize,position,comm);
^
/mnt/pool/4/rfibragimov/test/MPI/source/src/G4MPIscorerMerger.cc: In member function ‘void G4MPIscorerMerger::Pack(const G4VScoringMesh*)’:
/mnt/pool/4/rfibragimov/test/MPI/source/src/G4MPIscorerMerger.cc:342:88: error: invalid conversion from ‘const void*’ to ‘void*’ [-fpermissive]
MPI_Pack(nn,ss,MPI::CHAR,outputBuffer,outputBufferSize,&outputBufferPosition,comm);
^
make[2]: *** [CMakeFiles/G4mpi.dir/src/G4MPIscorerMerger.cc.o] Error 1
make[1]: *** [CMakeFiles/G4mpi.dir/all] Error 2
make: *** [all] Error 2

Do you have any idea what I’m doing wrong, or what I can do to fix it?
Thanks in advance.

Based on the output, this looks like OpenMPI 2.1, which unfortunately isn’t supported by the example yet. Per the README for the current production release, the supported MPI implementations are:

  • OpenMPI 1.8.1
  • MPICH 3.2
  • Intel MPI 5.0.1

so I would see if you can install one of these. If you’re on a cluster supporting MPI, it’s highly likely that one or more of these is available.
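
For what it’s worth, the diagnostics themselves point the same way: the compiler says a ‘const void*’ cannot be converted to ‘void*’ at the MPI_Pack() calls, i.e. pointers to const data (the first two calls sit in a const member function) are being passed where the MPI_Pack() prototype in the mpi.h being picked up expects a plain ‘void*’. Below is a minimal, simplified sketch of that mismatch; the struct and member names are placeholders rather than the real G4MPIscorerMerger code (which uses the C++ bindings MPI::INT and MPI::Intracomm&), and the const_cast shown is only an unverified local workaround, not a supported fix.

  // Simplified illustration with placeholder names; not the actual G4mpi source.
  #include <mpi.h>

  struct StatDouble {
    int    m_n     = 0;
    double data[5] = {0.0};

    // Inside a const member function, &m_n has type 'const int*' and
    // 'data' decays to 'const double*'.
    void Pack(char* buffer, int bufferSize, int* position, MPI_Comm comm) const {
      // If the mpi.h in use declares MPI_Pack's input buffer as plain 'void*'
      // (pre-MPI-3 style), a direct call such as
      //   MPI_Pack(&m_n, 1, MPI_INT, buffer, bufferSize, position, comm);
      // fails with exactly the "const void* to void*" error from the log.
      // Casting the const away compiles, but is only an unverified workaround:
      MPI_Pack(const_cast<int*>(&m_n), 1, MPI_INT,
               buffer, bufferSize, position, comm);
      MPI_Pack(const_cast<double*>(data), 5, MPI_DOUBLE,
               buffer, bufferSize, position, comm);
    }
  };

  int main(int argc, char** argv) {
    MPI_Init(&argc, &argv);
    char buffer[256];
    int  position = 0;
    StatDouble s;
    s.Pack(buffer, sizeof(buffer), &position, MPI_COMM_WORLD);
    MPI_Finalize();
    return 0;
  }

That said, switching to one of the MPI versions listed above is the cleaner route, since the example is only tested against those.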

The MPI example is in need of maintenance and, longer term, of integration into the core toolkit. Unfortunately we don’t have the manpower to do this right now.

Thank you for the advice. I will check it out when I get access to the cluster again and will report the results here.