MPI support for ABINIT 9.4.1 on a Cray XC40 cluster


Post by johanhellsvik » Wed Jun 02, 2021 3:13 pm

Hi,

We are installing ABINIT 9.4.1 on a Cray XC40 cluster (beskow.pdc.kth.se) and have encountered issues with the MPI support.

Configuration, compilation, and linking work out fine for a non-MPI build. The problem arises when we activate MPI. The settings that I have tried, building on a compute node, are

------
#!/bin/bash

#SBATCH -J abinit
#SBATCH -A <projname>
#SBATCH -t 02:00:00
#SBATCH --nodes=1
#SBATCH --ntasks-per-node=1
#SBATCH --cpus-per-task=32

#wget https://www.abinit.org/sites/default/fi ... 4.1.tar.gz
#tar xf abinit-9.4.1.tar.gz
#cd abinit-9.4.1

# Load the build environment
# PrgEnv-intel
# ----------------------

module load cdt/19.06
module load intel/18.0.0.128
module swap PrgEnv-cray PrgEnv-intel
module load cray-fftw/3.3.8.3
module load cray-netcdf/4.6.3.0
module load cray-hdf5/1.10.6.1
module load libxc/4.0.3
module load anaconda/2019.03/py37

export CRAYPE_LINK_TYPE=dynamic
export CC="cc -D_Float128=__float128"
export CXX=CC
export FC=ftn

mkdir build
cd build
srun -n 1 ../configure \
--with-libxc=/pdc/vol/libxc/4.0.3/ \
--with-netcdf=/opt/cray/pe/netcdf/4.6.3.0/INTEL/19.0/ \
--with-netcdf-fortran=/opt/cray/pe/netcdf/4.6.3.0/INTEL/19.0/
srun -n 1 make -j 32 > build_log.txt 2> build_error.txt
------

for which the configure script reports the error message

------
checking whether the MPI C compiler is set... no
checking whether the MPI C++ compiler is set... no
checking whether the MPI Fortran compiler is set... no
------

Adding the argument '--with_mpi_prefix=/opt/cray/pe/craype/2.6.1/bin/' to configure

gave the error message

------
configure: error: unrecognized option: `--with_mpi_prefix=/opt/cray/pe/craype/2.6.1/bin/
------

From what I can see, the ABINIT configure script uses the variables CC, CXX, and FC. I have set these to the standard compiler wrappers for the Cray system.

Any suggestions for additional environment variable(s) or flag(s) to pass to the configure script are most welcome.
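
For completeness, here is the sanity check I would try next (a sketch, assuming the same modules as in the script above): the Cray cc wrapper should itself compile and run MPI code, which separates a wrapper problem from a configure detection problem.

------
# Sanity check (sketch): the Cray cc wrapper should link MPI implicitly.
cat > hello_mpi.c <<'EOF'
#include <mpi.h>
#include <stdio.h>
int main(int argc, char **argv) {
    int rank, size;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);
    printf("rank %d of %d\n", rank, size);
    MPI_Finalize();
    return 0;
}
EOF
cc hello_mpi.c -o hello_mpi
srun -n 2 ./hello_mpi   # expect "rank 0 of 2" and "rank 1 of 2" (adjust -n to the allocation)
------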

Best regards,
Johan Hellsvik


Re: MPI support for ABINIT 9.4.1 on a Cray XC40 cluster

Post by jbeuken » Thu Jun 03, 2021 1:46 pm

Hi,
Regarding

------
configure: error: unrecognized option: `--with_mpi_prefix=/opt/cray/pe/craype/2.6.1/bin/
------

can you try with

------
--with-mpi=/opt/cray/pe/craype/2.6.1/
------

or

------
--with-mpi="yes"
export FC="mpifort"
export CC="mpicc"
export CXX="mpicxx"
------
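
Put together with the library paths from your first script, the second variant would look like this (an untested sketch):

------
export FC="mpifort"
export CC="mpicc"
export CXX="mpicxx"
../configure \
  --with-mpi="yes" \
  --with-libxc=/pdc/vol/libxc/4.0.3/ \
  --with-netcdf=/opt/cray/pe/netcdf/4.6.3.0/INTEL/19.0/ \
  --with-netcdf-fortran=/opt/cray/pe/netcdf/4.6.3.0/INTEL/19.0/
------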
jmb
------
Jean-Michel Beuken
Computer Scientist


Re: MPI support for ABINIT 9.4.1 on a Cray XC40 cluster

Post by johanhellsvik » Fri Jun 04, 2021 11:55 am

Hi,

Thank you for the advice. Unfortunately, the MPI compilers are still not detected.

------
--with-mpi=/opt/cray/pe/craype/2.6.1/
------

gave the error message

------
configure: error: invalid MPI settings
Please adjust --with-mpi and/or CC and re-run configure
------

------
--with-mpi="yes"
export FC="mpifort"
export CC="mpicc"
export CXX="mpicxx"
------

gave the error message

------
configure: error: could not run Fortran compiler "mpifort"
------

------
--with-mpi="yes"
export FC="mpiifort"  # (added an "i")
export CC="mpiicc"    # (added an "i")
export CXX="mpicxx"
------

gave the error message

------
checking whether the MPI C compiler is set... no
checking whether the MPI C++ compiler is set... no
checking whether the MPI Fortran compiler is set... no
------
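
For reference, when configure rejects a compiler like this, the actual failing test command and its output are recorded in config.log; a generic autoconf way to dig them out (a sketch):

------
# locate the failed MPI compiler test and show its context
grep -n -B 2 -A 10 "mpifort" config.log | less
# the end of config.log also summarizes the variables configure ended up with
tail -n 60 config.log
------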

Best regards,
Johan Hellsvik


Re: MPI support for ABINIT 9.4.1 on a Cray XC40 cluster

Post by jbeuken » Sun Jun 20, 2021 1:25 pm

Hi,

Sorry for the slow response...

We need to start from the beginning, i.e. to know the environment.

From your batch script, can you run the following commands to set up the environment:

------
module load cdt/19.06
module load intel/18.0.0.128
module swap PrgEnv-cray PrgEnv-intel
module load cray-fftw/3.3.8.3
module load cray-netcdf/4.6.3.0
module load cray-hdf5/1.10.6.1
module load libxc/4.0.3
module load anaconda/2019.03/py37
------

Now, can you send the output of:

------
which mpifort
mpifort -show
which mpiifort
mpiifort -show
which mpif90
mpif90 -show
which ifort
echo $FC
------
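
Note that on Cray XC systems the MPI-aware compiler wrappers are usually cc, CC, and ftn, and classic mpicc/mpifort wrappers may not exist at all, so some of the checks above may come back empty. In that case the flags the wrappers inject can be inspected with (a sketch; --cray-print-opts is the craype driver option, if available on your PE version):

------
which cc CC ftn              # Cray compiler wrappers (MPI-aware under PrgEnv-*)
cc --cray-print-opts=all     # include and link options the cc wrapper adds
ftn --cray-print-opts=libs   # libraries (cray-mpich among them) added at link time
------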
best,
------
Jean-Michel Beuken
Computer Scientist
