General software and libraries (Hexagon)
This page contains software-specific notes and is not a full list of the software available on Hexagon.
For a full list, please run <code>module avail</code> on a Hexagon login node.
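For example, a typical lookup-and-load sequence on the login node (the netcdf name is just one of the modules listed in the tables below):
<pre>module avail             # list all installed software
module avail netcdf      # narrow the listing to one package
module load netcdf       # make the default version available in your session</pre>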
== Applications ==
{| border="1" cellspacing="0"
! Software !! Version !! Module !! Notes
|-
| Matlab || all || matlab || Cannot be run on compute nodes; use it only to preview data, not for massive calculations.
|-
| NAMD || all || namd || Recommended environment variables (loaded with the module): <pre>export MPICH_PTL_SEND_CREDITS=-1
export MPICH_MAX_SHORT_MSG_SIZE=8000
export MPICH_PTL_UNEX_EVENTS=100000
export MPICH_UNEX_BUFFER_SIZE=100M</pre>
Launch with: <pre>aprun -n $PBS_NNODES -cc cpu namd2</pre>
|-
| Python || all || python || Only performance-dependent packages (MPI and some basic packages) are compiled into the Python modules; users are encouraged to use pip and their local folders to install extra Python packages (see the sketch after this table).
|-
| R || all || R || To run R in batch (on compute nodes), adjust your job script as follows: <br><pre>#PBS -l mppwidth=1,mppnppn=1,mppmem=32000mb
#
# fix for temporary filesystem
export TMP=/work/$USER/
export TMPDIR=$TMP
#
aprun -B R --no-save <./myscript.R
</pre>
|-
| Trilinos || all || trilinos || http://trilinos.sandia.gov/index.html
|-
| WRF || all || WRF || If you are using spectral nudging, you will need 8000 MB of memory per core; otherwise, despite the message that spectral nudging was enabled, the output NetCDF file will not be produced properly (see case #21869). A job-script sketch follows after this table.
|}
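The pip workflow mentioned in the Python row, as a minimal sketch (the package name is only an example, not something guaranteed to be installed on Hexagon):
<pre># load the site Python build first
module load python
# install an extra package under your home directory instead of the system prefix
pip install --user pyyaml</pre>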
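The WRF memory requirement above translates into a job request roughly like the following sketch, assuming 32 GB compute nodes as in the R example (node and core counts are placeholders):
<pre># 4 PEs per node x 8000 MB each = 32000 MB, so each core gets the required memory
#PBS -l mppwidth=16,mppnppn=4,mppmem=8000mb
aprun -n 16 -N 4 wrf.exe</pre>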
== Performance libraries ==
{| border="1" cellspacing="0"
! Software !! Module !! Notes
|-
| BLAS<br>LAPACK<br>ScaLAPACK<br>BLACS<br>IRT<br>SuperLU sparse solver routines<br>CRAFFT || cray-libsci || BLAS and LAPACK include routines from the 64-bit libGoto library from the University of Texas (GotoBLAS). <br> [http://docs.cray.com/cgi-bin/craydoc.cgi?mode=Show;q=S-2529;f=/books/S-2529-116/html-S-2529-116//chapter-ck3x6qu8-brbethke.html Cray Programming Environment User's Guide]
|-
| PETSc:<br>MUMPS<br>SuperLU<br>ParMETIS<br>HYPRE || petsc || [http://docs.cray.com/cgi-bin/craydoc.cgi?mode=Show;q=S-2529;f=/books/S-2529-116/html-S-2529-116//chapter-ck3x6qu8-brbethke.html Cray Programming Environment User's Guide]
|-
| ACML || acml || [http://docs.cray.com/cgi-bin/craydoc.cgi?mode=Show;q=S-2529;f=/books/S-2529-116/html-S-2529-116//chapter-ck3x6qu8-brbethke.html Cray Programming Environment User's Guide]
|-
| FFTW || fftw || [http://docs.cray.com/cgi-bin/craydoc.cgi?mode=Show;q=S-2529;f=/books/S-2529-116/html-S-2529-116//chapter-ck3x6qu8-brbethke.html Cray Programming Environment User's Guide]
|-
| Fast_mv || libfast || [http://docs.cray.com/cgi-bin/craydoc.cgi?mode=Show;q=S-2529;f=/books/S-2529-116/html-S-2529-116//chapter-ck3x6qu8-brbethke.html Cray Programming Environment User's Guide]
|}
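As a usage note for the table above: on Cray systems the compiler wrappers pick up whatever library modules are loaded, so explicit link flags are normally not needed. A minimal sketch (the source file name is a placeholder):
<pre>module load cray-libsci
ftn -o mysolver mysolver.f90   # the ftn wrapper links BLAS/LAPACK from cray-libsci automatically</pre>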
== Scientific data libraries ==
{| border="1" cellspacing="0"
! Software !! Module !! Notes
|-
| NetCDF 3 || netcdf ||
|-
| NetCDF 4 || cray-netcdf || Note that the NetCDF 4 modules carry the cray- prefix, as do the HDF5 modules.
|-
| NetCDF 4 parallel || cray-netcdf-hdf5parallel ||
|-
| HDF5 || cray-hdf5 || Will be loaded automatically with the module cray-netcdf
|-
| HDF5 parallel || cray-hdf5-parallel || Will be loaded automatically with the module cray-netcdf-hdf5parallel
|}
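A short sketch of the automatic loading described above (the grep is only an illustration of how to confirm it):
<pre>module load cray-netcdf        # pulls in cray-hdf5 as well
module list 2>&1 | grep hdf5   # confirm the HDF5 module is loaded</pre>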