Just some notes for myself, nothing to see here...

List of needed software

- OpenMPI (openmpi-1.4.1)
- BLAS (reference implementation)
- LAPACK (lapack-3.2.1)
- mpiblacs (BLACS)
- ScaLAPACK (scalapack-1.8.0)

Steps

To install scalapack from source you will need the above list of dependencies installed. We will assume that your system already has gcc, g++ and gfortran installed.
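
A quick check that the compilers are actually present and on your PATH (nothing ScaLAPACK-specific, just the usual version queries):

gcc --version
g++ --version
gfortran --version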

Configure, Compile and Install openmpi

tar zxvf src/openmpi-1.4.1.tar.gz
cd openmpi-1.4.1/
./configure --prefix=$HOME/local
make
make install
export PATH=$HOME/local/bin:$PATH
export LD_RUN_PATH=$HOME/local/lib
export LD_LIBRARY_PATH=$HOME/local/lib
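
As a quick sanity check that the freshly installed Open MPI is the one being picked up (assuming the build and install above completed without errors):

which mpicc mpif90
mpirun -np 2 hostname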

Configure, Compile BLAS

tar zxvf blas.tgz
cd BLAS
make all FORTRAN=gfortran LOADER=gfortran
cp blas_LINUX.a $HOME/local/lib/libblas.a

You may want to edit make.inc to customise the build further.
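
If you want to double check that the installed archive really contains the routines you expect, probing it with nm works; dgemm is a reasonable symbol to look for (the trailing underscore is the usual gfortran name mangling):

nm $HOME/local/lib/libblas.a | grep -i dgemm_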

Configure, Compile LAPACK

tar zxvf lapack.tar.gz
cd lapack-3.2.1/
cp make.inc.example make.inc

The default settings are pretty sensible and should work fine with gfortran. Copy the BLAS library that you compiled earlier into the LAPACK build directory (or edit make.inc to point at the correct BLAS library).

cp ../BLAS/blas_LINUX.a .

Then just type make in the lapack directory. It will compile the LAPACK library and also run some tests. You should go into the TESTING directory, look for failures in the *.out files and investigate any significant ones. You should be left with a lapack_LINUX.a in the build directory.
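
To scan the test output quickly, a simple grep over the summary files is usually enough, since the .out files report anything that failed in plain text:

grep -i fail TESTING/*.out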

Configure, Compile mpiblacs

tar zxvf mpiblacs.tgz
cd BLACS
cp BMAKES/Bmake.MPI-LINUX ./Bmake.inc

Before you can compile BLACS/MPIBLACS you should go into the INSTALL directory and type make help. You will need to run some of the tests there to work out the options BLACS needs on your system.

You will need to know the location of the build directory; in my case it was /home/jtang/develop/hpce2-scalapack/BLACS

So I did (spelling MPILIBDir out in full; if you write it as ${MPIdir}/lib the shell expands it to /lib before make ever sees it)

cd INSTALL
make xintface F77=gfortran BTOPdir=$HOME/develop/hpce2-scalapack/BLACS
make xsize F77=gfortran BTOPdir=$HOME/develop/hpce2-scalapack/BLACS
make xsyserrors F77=gfortran BTOPdir=$HOME/develop/hpce2-scalapack/BLACS MPIdir=$HOME/local MPILIBDir=$HOME/local/lib MPILIB="$HOME/local/lib/libmpi.so"

Run the above programs

./EXE/xintface
For this platform, set INTFACE = -DAdd_

./EXE/xsize 
ISIZE=4
SSIZE=4
DSIZE=8
CSIZE=8
ZSIZE=16


mpirun -np 4 ./EXE/xsyserrors 
libibverbs: Fatal: couldn't read uverbs ABI version.
--------------------------------------------------------------------------
[[34681,1],0]: A high-performance Open MPI point-to-point messaging module
was unable to find any relevant network interfaces:

Module: OpenFabrics (openib)
  Host: duo

Another transport will be used instead, although this may result in
lower performance.
--------------------------------------------------------------------------
If this routine does not complete, you should set SYSERRORS = -DZeroByteTypeBug.
Leave SYSERRORS blank for this system.

Take note of the output as you will need it to build BLACS. You should also build and run the following targets to make sure you get the right settings.

make xtc_CsameF77 F77=mpif90 BTOPdir=$HOME/develop/hpce2-scalapack/BLACS MPIdir=$HOME/local MPILIBDir=$HOME/local/lib MPILIB="$HOME/local/lib/libmpi.so"

mpirun -np 4 ./EXE/xtc_CsameF77
--------------------------------------------------------------------------

[[42923,1],0]: A high-performance Open MPI point-to-point messaging module
was unable to find any relevant network interfaces:

Module: OpenFabrics (openib)
  Host: duo

Another transport will be used instead, although this may result in
lower performance.
--------------------------------------------------------------------------
libibverbs: Fatal: couldn't read uverbs ABI version.
libibverbs: Fatal: couldn't read uverbs ABI version.
libibverbs: Fatal: couldn't read uverbs ABI version.
libibverbs: Fatal: couldn't read uverbs ABI version.
 If this routine does not complete successfully,
 Do _NOT_ set TRANSCOMM = -DCSameF77


[duo:27485] *** An error occurred in MPI_Comm_free
[duo:27485] *** on communicator MPI_COMM_WORLD
[duo:27485] *** MPI_ERR_COMM: invalid communicator
[duo:27485] *** MPI_ERRORS_ARE_FATAL (your MPI job will now abort)
 Do _NOT_ set TRANSCOMM = -DCSameF77
--------------------------------------------------------------------------
mpirun has exited due to process rank 3 with PID 27486 on
node duo exiting without calling "finalize". This may
have caused other processes in the application to be
terminated by signals sent by mpirun (as reported here).
--------------------------------------------------------------------------
[duo:27482] 3 more processes have sent help message help-mpi-btl-base.txt / btl:no-nics
[duo:27482] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
[duo:27482] 1 more process has sent help message help-mpi-errors.txt / mpi_errors_are_fatal

The next two are just sanity checks.

make xcmpi_sane F77=gfortran BTOPdir=$HOME/develop/hpce2-scalapack/BLACS MPIdir=$HOME/local MPILIBDir=$HOME/local/lib MPILIB="$HOME/local/lib/libmpi.so"
make xfmpi_sane F77=mpif90 BTOPdir=$HOME/develop/hpce2-scalapack/BLACS MPIdir=$HOME/local MPILIBDir=$HOME/local/lib MPILIB="$HOME/local/lib/libmpi.so"

mpirun -np 4 ./EXE/xcmpi_sane
mpirun -np 4 ./EXE/xfmpi_sane

With the above information I went back into the build directory (see also http://www.open-mpi.org/faq/?category=mpi-apps#blacs; with Open MPI the TRANSCOMM option needs to be set to -DUseMpi2).

make F77=mpif90 BTOPdir=$HOME/develop/hpce2-scalapack/BLACS MPIdir=$HOME/local MPILIBDir=$HOME/local/lib MPILIB="$HOME/local/lib/libmpi.so" TRANSCOMM=-DUseMpi2 INTFACE=-DAdd_  mpi what=clean
make F77=mpif90 BTOPdir=$HOME/develop/hpce2-scalapack/BLACS MPIdir=$HOME/local MPILIBDir=$HOME/local/lib MPILIB="$HOME/local/lib/libmpi.so" TRANSCOMM=-DUseMpi2 INTFACE=-DAdd_  mpi

Then cd into the TESTING directory

make F77=mpif90 BTOPdir=$HOME/develop/hpce2-scalapack/BLACS MPIdir=$HOME/local MPILIBDir=$HOME/local/lib MPILIB="$HOME/local/lib/libmpi.so" TRANSCOMM=-DUseMpi2 INTFACE=-DAdd_

Then, if you wish and have time, run the tests.
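
If the defaults in Bmake.MPI-LINUX are unchanged, the testers should come out named after the platform settings, something like xCbtest_MPI-LINUX-0 and xFbtest_MPI-LINUX-0 (check the TESTING and EXE directories for the exact names and location), and they run the same way as the INSTALL helpers:

mpirun -np 4 ./xCbtest_MPI-LINUX-0
mpirun -np 4 ./xFbtest_MPI-LINUX-0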

Going back to the root of the BLACS directory, you should see that there is now a LIB directory containing the libraries you have built.
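
With the Bmake.MPI-LINUX settings the library names carry the COMMLIB and PLAT values, so the directory should look something like this:

ls LIB/
blacs_MPI-LINUX-0.a  blacsCinit_MPI-LINUX-0.a  blacsF77init_MPI-LINUX-0.a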

Configure, Compile SCALAPACK

You will need to know the locations of the above compiled BLAS, LAPACK and BLACS libraries and some of the settings from the BLACS compilation.

tar zxvf scalapack.tgz
cd scalapack-1.8.0
cp SLmake.inc.example SLmake.inc
make home=$HOME/develop/hpce2-scalapack/scalapack-1.8.0 \
     BLASLIB=$HOME/develop/hpce2-scalapack/BLAS/blas_LINUX.a \
     LAPACKLIB=$HOME/develop/hpce2-scalapack/lapack-3.2.1/lapack_LINUX.a \
     BLACSFINIT=$HOME/develop/hpce2-scalapack/BLACS/LIB/blacsF77init_MPI-LINUX-0.a \
     BLACSCINIT=$HOME/develop/hpce2-scalapack/BLACS/LIB/blacsCinit_MPI-LINUX-0.a \
     BLACSLIB=$HOME/develop/hpce2-scalapack/BLACS/LIB/blacs_MPI-LINUX-0.a \
     CDEFS="-DAdd_ -DNO_IEEE -DUsingMpiBlacs" \
     SMPLIB=""
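
The build should leave a libscalapack.a at the top of the scalapack-1.8.0 directory (that is the default SCALAPACKLIB name in SLmake.inc). Linking a Fortran program of your own against everything built above then looks roughly like this; my_solver.f is just a hypothetical example, and note the BLACS F77 init library appears twice to satisfy the circular dependency:

mpif90 -o my_solver my_solver.f \
    $HOME/develop/hpce2-scalapack/scalapack-1.8.0/libscalapack.a \
    $HOME/develop/hpce2-scalapack/BLACS/LIB/blacsF77init_MPI-LINUX-0.a \
    $HOME/develop/hpce2-scalapack/BLACS/LIB/blacs_MPI-LINUX-0.a \
    $HOME/develop/hpce2-scalapack/BLACS/LIB/blacsF77init_MPI-LINUX-0.a \
    $HOME/develop/hpce2-scalapack/lapack-3.2.1/lapack_LINUX.a \
    $HOME/develop/hpce2-scalapack/BLAS/blas_LINUX.a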