General software and libraries (Hexagon)

From HPC documentation portal
<big><span style="color:#FF0000">'''This documentation page is outdated and will be updated soon.'''</span></big><br>
This page contains software-specific notes and is not a full list of the software available on Hexagon.
'''A full, up-to-date list is available at''' [http://www.notur.no/software/ NOTUR].
 
For a full list please run <code>module available</code> on a Hexagon login node.

== Applications ==
{|border="1" cellspacing="0" font-align="right"
! Software !! Version !! Module !! Notes
|-
| Gaussian || all || gaussian/g16 || 1. You will need to ask for a CCM-enabled account; please contact [[Support]].
2. [[Gaussian_on_Hexagon | Please use this example script to learn how to get started]]
|-
| Matlab || all || matlab || Cannot be run on compute nodes, so use it only to preview data, not for massive calculations.
|-
| NAMD || all || namd || Recommended environment variables (loaded with the module): <pre>export MPICH_PTL_SEND_CREDITS=-1
export MPICH_MAX_SHORT_MSG_SIZE=8000
export MPICH_PTL_UNEX_EVENTS=100000
export MPICH_UNEX_BUFFER_SIZE=100M</pre>
 aprun -n $PBS_NNODES -cc cpu namd2
Use [[Support]] for support.
|-
| Python || all || python || MPI and some basic packages are compiled into the Python modules; users are encouraged to use pip and their local folders to install extra Python packages.
|-
| Python || miniconda || python/miniconda3-4.4.10 ||
|-
| R || all || R || To run R in batch (on compute nodes), adjust your job script as follows: <br><pre>#SBATCH --ntasks=32
#
# fix for temporary filesystem
export TMP=/work/users/$USER/
export TMPDIR=$TMP
#
aprun -n1 -N1 -m32000M R --no-save <./myscript.R
</pre>
|-
| Trilinos || all || trilinos || [http://trilinos.sandia.gov/index.html Homepage]
|-
| WRF || all || WRF || If you are using spectral nudging you will need 8000 MB of memory per core. Otherwise, despite the message that spectral nudging was enabled, the output NetCDF file will not be produced properly (see case #21869).
|}
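The NAMD row above combines module-provided environment variables with an <code>aprun</code> launch. A minimal PBS job-script sketch tying the pieces together (the resource values and the input file name <code>myjob.namd</code> are placeholders, not site defaults):

```shell
#!/bin/bash
#PBS -l mppwidth=32          # placeholder core count; adjust to your job
#PBS -l walltime=01:00:00    # placeholder wall-clock limit

cd $PBS_O_WORKDIR            # run from the directory the job was submitted in
module load namd             # also exports the MPICH_* variables listed above

# one rank per core with CPU binding, as in the table's example
aprun -n $PBS_NNODES -cc cpu namd2 myjob.namd > myjob.log
```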


== Performance libraries ==
{|border="1" cellspacing="0" font-align="right"
! Software !! Module !! Notes
|-
| BLAS<br>LAPACK<br>ScaLAPACK<br>BLACS<br>IRT<br>SuperLU sparse solver routines<br>CRAFFT || cray-libsci || BLAS and LAPACK include routines from the 64-bit libGoto library from the University of Texas (GotoBLAS). <br> [http://docs.cray.com/cgi-bin/craydoc.cgi?mode=Show;q=S-2529;f=/books/S-2529-116/html-S-2529-116//chapter-ck3x6qu8-brbethke.html Cray Programming Environment User's Guide]
|-
| PETSc:<br>MUMPS<br>SuperLU<br>ParMETIS<br>HYPRE || petsc || [http://docs.cray.com/cgi-bin/craydoc.cgi?mode=Show;q=S-2529;f=/books/S-2529-116/html-S-2529-116//chapter-ck3x6qu8-brbethke.html Cray Programming Environment User's Guide]
|-
| ACML || acml || [http://docs.cray.com/cgi-bin/craydoc.cgi?mode=Show;q=S-2529;f=/books/S-2529-116/html-S-2529-116//chapter-ck3x6qu8-brbethke.html Cray Programming Environment User's Guide]
|-
| FFTW || fftw || [http://docs.cray.com/cgi-bin/craydoc.cgi?mode=Show;q=S-2529;f=/books/S-2529-116/html-S-2529-116//chapter-ck3x6qu8-brbethke.html Cray Programming Environment User's Guide]
|-
| Fast_mv || libfast || [http://docs.cray.com/cgi-bin/craydoc.cgi?mode=Show;q=S-2529;f=/books/S-2529-116/html-S-2529-116//chapter-ck3x6qu8-brbethke.html Cray Programming Environment User's Guide]
|}
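On Cray systems the compiler wrappers (<code>cc</code>, <code>CC</code>, <code>ftn</code>) link cray-libsci automatically once the module is loaded, so no explicit <code>-l</code> flags are needed for BLAS/LAPACK. A minimal sketch, where <code>solver.f90</code> is a hypothetical source file:

```shell
module load cray-libsci      # usually loaded by default in the Cray programming environment
ftn -o solver solver.f90     # the ftn wrapper adds the libsci link line itself
```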


== Scientific data libraries ==
{|border="1" cellspacing="0" font-align="right"
! Software !! Module !! Notes
|-
| NetCDF 3 || netcdf ||
|-
| NetCDF 4 || cray-netcdf || Note that the NetCDF 4 modules carry the cray- prefix, as do the HDF5 modules.
|-
| NetCDF 4 parallel || cray-netcdf-hdf5parallel ||
|-
| HDF5 || cray-hdf5 || Loaded automatically with the module cray-netcdf.
|-
| HDF5 parallel || cray-hdf5-parallel || Loaded automatically with the module cray-netcdf-hdf5parallel.
|}
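Because cray-hdf5 is pulled in automatically, building against NetCDF 4 only needs the one module. A sketch assuming the Cray <code>cc</code> wrapper and a hypothetical source file <code>read_nc.c</code>:

```shell
module load cray-netcdf      # loads cray-hdf5 as a dependency
cc -o read_nc read_nc.c      # the wrapper supplies the NetCDF/HDF5 include and link flags
```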
[[Category:Hexagon]]

''Latest revision as of 14:26, 19 April 2018''
