@alchem0x2A
Last active July 6, 2018 09:56
Install GPAW with Intel on Euler

Preparation

  • Modules: intel/15.0, open_mpi/1.10.0, libxc/3.0.0, fftw/3.3.4, scalapack
  • Source tarballs for numpy, scipy and matplotlib
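On Euler these dependencies are usually pulled in through the module system; a sketch of the corresponding commands (module names and versions assumed from the list above, check `module avail` for the exact spelling):

```shell
# Hypothetical module loads matching the dependency list above
module load intel/15.0 open_mpi/1.10.0
module load libxc/3.0.0 fftw/3.3.4 scalapack
```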

Compile python with intel

Download the Python tarball, untar it, and cd into the source root directory.

./configure --prefix="/abs/path/to/build" --with-icc CC=icc CXX=icpc
make && make install

Working on virtualenv

/path/to/intel/python -m venv ~/.virtualenvs/gpaw_intel
source ~/.virtualenvs/gpaw_intel/bin/activate

Now we are working in a clean virtualenv into which all packages will be installed.
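To confirm the activated interpreter really lives inside the virtualenv, a small check using only the standard library can be run:

```python
# Minimal venv sanity check (Python >= 3.3): inside a virtual environment
# created with `python -m venv`, sys.prefix differs from sys.base_prefix.
import sys

def in_virtualenv() -> bool:
    return sys.prefix != getattr(sys, "base_prefix", sys.prefix)

print("inside a virtualenv:", in_virtualenv())
```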

Compile numpy with MKL

Create site.cfg to provide the compiler and library information:

cd 'path/to/numpy'
cp site.cfg.example site.cfg

Uncomment and modify the [mkl] section:

[mkl]
library_dirs = /cluster/apps/intel/composer_xe_2015.0.090/composer_xe_2015.0.090/mkl/lib/intel64
include_dirs = /cluster/apps/intel/composer_xe_2015.0.090/composer_xe_2015.0.090/mkl/include
mkl_libs = mkl_rt
lapack_libs =
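As a quick sanity check before starting the (slow) numpy build, the section above can be parsed with the standard library's configparser; this validates only the INI syntax, not whether the MKL paths actually exist:

```python
# Hypothetical pre-flight check for the [mkl] section of site.cfg
import configparser

SITE_CFG = """\
[mkl]
library_dirs = /cluster/apps/intel/composer_xe_2015.0.090/composer_xe_2015.0.090/mkl/lib/intel64
include_dirs = /cluster/apps/intel/composer_xe_2015.0.090/composer_xe_2015.0.090/mkl/include
mkl_libs = mkl_rt
lapack_libs =
"""

parser = configparser.ConfigParser()
parser.read_string(SITE_CFG)

assert parser.has_section("mkl")
assert parser.get("mkl", "mkl_libs") == "mkl_rt"
print("site.cfg parses cleanly; [mkl] section found")
```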

Install numpy:

python setup.py build --compiler=intelem && python setup.py install

Check the numpy configuration:

python -c "import numpy; numpy.__config__.show()"

Check if blas_mkl_info and lapack_mkl_info are correct.

Compile scipy

More straightforward than numpy: the MKL environment is detected automatically if numpy was compiled correctly.

cd 'path/to/scipy'
python setup.py build --compiler=intelem
python setup.py install

Check the scipy configuration:

python -c "import scipy; scipy.__config__.show()"

Other packages

Install the remaining packages with pip inside the virtualenv:

pip install matplotlib ase

Install libvdwxc

Follow the instructions on the website and specify the install prefix. Passing the MPI compilers explicitly is compulsory in some cases, as the build system cannot always detect mpicc automatically:

./configure --with-mpi CC="mpicc" FC="mpif90" --prefix="/abs/path/to/build"
make && make install

Test installations

Requires nose and pytest

python -c "import numpy; numpy.test()"
python -c "import scipy; scipy.test()"
ase test

Install GPAW

Download the newest version of GPAW, e.g. https://gitlab.com/gpaw/gpaw/-/archive/1.4.0/gpaw-1.4.0.zip. Then modify customize.py, enabling the necessary parts:

  1. Edit the BLAS/LAPACK (MKL) part
  2. Edit the scalapack part
  3. Edit the libvdwxc part

Most probably you will need to add the path to libpython3.xm.so to LIBRARY_PATH (not LD_LIBRARY_PATH).
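For example (assuming the Intel-built Python from the earlier step was installed under the hypothetical prefix /abs/path/to/build; adjust to your actual lib directory):

```shell
# libpython3.xm.so lives in the lib/ directory of the Python build prefix
export LIBRARY_PATH="/abs/path/to/build/lib:$LIBRARY_PATH"
```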

A sample configuration file looks like:

compiler = 'icc'
mpicompiler = 'mpicc'  # use None if you don't want to build a gpaw-python                                                      
mpilinker = 'mpicc'
# platform_id = ''                                                                                                              
scalapack = True                                                                     

if scalapack:
    libraries = ['mkl_intel_lp64', 'mkl_sequential', 'mkl_core',
                 'mkl_lapack95_lp64',
                 'mkl_scalapack_lp64', 'mkl_blacs_openmpi_lp64',
                 'pthread']
    define_macros += [('GPAW_NO_UNDERSCORE_CBLACS', '1')]
    define_macros += [('GPAW_NO_UNDERSCORE_CSCALAPACK', '1')]
# - dynamic linking (requires rpath or setting LD_LIBRARY_PATH at runtime):                                                     
if 1:
    xc = '/cluster/apps/libxc/3.0.0/x86_64/intel_15.0.0/'
    include_dirs += [xc + 'include']
    library_dirs += [xc + 'lib']
    # You can use rpath to avoid changing LD_LIBRARY_PATH:                                                                      
    extra_link_args += ['-Wl,-rpath={xc}/lib'.format(xc=xc)]
    if 'xc' not in libraries:
        libraries.append('xc')
# libvdwxc:                                                                                                                     
if 1:
    libvdwxc = True
    path = '/cluster/home/ttian/.virtualenvs/gpaw_intel/dependency/libvdwxc'
    extra_link_args += ['-Wl,-rpath=%s/lib' % path]
    library_dirs += ['%s/lib' % path]
    include_dirs += ['%s/include' % path]
    libraries += ['vdwxc']

Install with normal instructions:

python setup.py build && python setup.py install

If the installation succeeded but errors occur afterwards, use the method below to delete the generated files (pip uninstall cannot handle them): https://gist.github.com/lovaulonze/4e31d83091a9d9543c0cfd297a1ded07

Test GPAW

Run the tests as described on the website. Download the GPAW datasets first:

gpaw install-data 'path/to/gpaw/data'

Currently the libvdwxc/openmpi combination did not succeed under the Intel compilation, so libvdwxc is disabled for now.

Use the gcc build for libvdwxc only.

If necessary, clean up the file ~/.gpaw/rc.py to remove obsolete dependencies, then perform the tests:

gpaw info  # check basic settings; here only the scalapack flag is "no"
gpaw test -j 8  # number of parallel jobs depends on the system
gpaw-python -m gpaw info  # now the scalapack flag should also be "yes"
mpiexec -n 24 gpaw-python -m gpaw test  # may need to be submitted to the job queue

Make the module file

mkdir -p ~/modules/gpaw/
emacs -nw ~/modules/gpaw/gcc_5.2.0

Then edit the contents in the "gcc_5.2.0" modulefile:

#%Module1.0#####################################################################
#
#STATUS
#stat:moddep:gcc/5.2.0 open_mpi/1.10.0 openblas/0.2.13_seq scalapack/2.0.2 fftw/3.3.4 libxc/3.0.0
#stat:modauto:
#END STATUS

proc ModulesHelp { } {
    global helpmsg
    puts stderr "\t$helpmsg\n"
}

set version gcc_5.2.0
set curmod  [module-info name]

set topdir  /cluster/home/ttian/.virtualenvs/gpaw/bin/

# check if requirements are met
set envguess  [exec /cluster/apps/scripts/guess_compiler.pl --path=compiler=__CC__\ open_mpi=__MPI__]
set need_cb   "gcc"
set need_cc   "5.2.0"
set need_ompi "1.10.0"


#   if { [string compare $envguess "compiler=${need_cb}_$need_cc open_mpi=$need_ompi"] != 0 } {
#     puts stderr "$curmod error: this module requires $need_cb/$need_cc and open_mpi/$need_ompi to be loaded."
#      puts stderr "hint: run 'module purge ; module load $need_cb/$need_cc open_mpi/$need_ompi $curmod' to resolve this problem."
#      break
#   }
if { [string compare $envguess "compiler=${need_cb}_$need_cc open_mpi=$need_ompi"] != 0 } {
        module load ${need_cb}/$need_cc
        module load open_mpi/$need_ompi
	module load openblas/0.2.13_seq
	module load fftw/3.3.4
	module load scalapack/2.0.2
	module load libxc/3.0.0
	module load python/3.6.0
}

# virtualenv
if { [module-info mode load] || [module-info mode switch2] } {
    puts stdout "source /cluster/home/ttian/.virtualenvs/gpaw/bin/activate;"
} elseif { [module-info mode remove] && ![module-info mode switch3] } {
    puts stdout "deactivate;"
}


# not reached on error

Finally add the following line in .bashrc or .bash_profile to enable local modules:

module use $HOME/modules
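After that, the modulefile created above can be loaded in a new shell (the module name follows the ~/modules/gpaw/gcc_5.2.0 layout used earlier):

```shell
module use $HOME/modules
module load gpaw/gcc_5.2.0
```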