Commit d5f4ff00 authored by Cedric Jourdain

Update README_ACC.md

Clone the repository in a location of your choice, let's say $HOME.
```shell
cd $HOME
git clone https://github.com/geodynamics/specfem3d_globe.git
```
Then use a fixed and stable version of specfem3D_globe (the one of October 31, 2017,
for example; see https://github.com/geodynamics/specfem3d_globe/commits/master):
```shell
cd $HOME/specfem3d_globe
git checkout b1d6ba966496f269611eff8c2cf1f22bcdac2bd9
```
If you have not done so already, clone the ueabs repository.
```shell
cd $HOME
git clone https://repository.prace-ri.eu/git/UEABS/ueabs.git
```
In the specfem3D folder of this repository you will find the test cases in the test_cases folder,
as well as environment and submission script templates for several machines.
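As a quick sanity check, a small sketch like the following can confirm that both clones are in place (the directory names are assumptions based on the clone commands above):

```shell
# Sanity-check sketch: verify both repositories were cloned under $HOME.
# The directory names are assumptions based on the clone commands above.
for d in "$HOME/specfem3d_globe" "$HOME/ueabs"; do
  if [ -d "$d/.git" ]; then
    echo "found: $d"
  else
    echo "missing: $d"
  fi
done
```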
## Load the environment
```shell
export MPICC=`which mpicc`
export CUDA_LIB="$CUDAROOT/lib64"
export CUDA_INC="$CUDAROOT/include"
```
Once again, you will find in the specfem3D folder of this repository a folder named env,
containing files named env_x that give examples of the environment used on several supercomputers
during the last benchmark campaign.
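Sourcing the matching file before configuring is typically all that is needed; a guarded sketch (the `env/env_x` path is a placeholder for the machine-specific file described above):

```shell
# Sketch: source the machine-specific environment file if it exists.
# "env/env_x" is a placeholder path; pick the file matching your machine.
ENV_FILE="env/env_x"
if [ -f "$ENV_FILE" ]; then
  . "$ENV_FILE"
  echo "loaded: $ENV_FILE"
else
  echo "no such file: $ENV_FILE (edit ENV_FILE for your machine)"
fi
```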
## Compile specfem
As arrays are statically declared, you will need to compile specfem once for each
```shell
make clean
make all
```
**-> You will find in the specfem folder of the ueabs repository the file "compile.sh", which is a compilation script template for several machines (different architectures: KNL, SKL, Haswell and GPU)**

## Launch specfem

You can use or be inspired by the submission script templates in the job_script folder, using the appropriate job submission command:
- qsub for a PBS job,
- sbatch for a Slurm job,
- ccc_msub for an Irene job (wrapper),
- llsubmit for a LoadLeveler job.
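To pick the right command for the machine at hand, a small detection sketch (it only probes for the four submit commands listed above; nothing else is assumed):

```shell
# Sketch: detect which batch-scheduler submit command is available on this
# machine, trying the four commands listed above in order.
submit_cmd=""
for cmd in sbatch qsub ccc_msub llsubmit; do
  if command -v "$cmd" >/dev/null 2>&1; then
    submit_cmd="$cmd"
    break
  fi
done
echo "submit command: ${submit_cmd:-none found}"
```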
## Gather results
The relevant metric for this benchmark is the time for the solver. Using Slurm, it is
easy to gather, as each `mpirun` or `srun` is interpreted as a step which is already
timed. So the command line `sacct -j <job_id>` allows you to catch the metric.
You can also find more precise timing information at the end of the output file
`specfem3d_globe/OUTPUT_FILES/output_solver.txt`.
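Slurm reports elapsed time as `[DD-]HH:MM:SS`; when comparing solver times across runs it can help to convert that to seconds. A small bash helper sketch (the function name is ours, not part of the benchmark):

```shell
# Bash helper sketch: convert a Slurm Elapsed value ([DD-]HH:MM:SS or MM:SS)
# to seconds. 10# forces base-10 so values like "08" are not read as octal.
elapsed_to_seconds() {
  local t=$1 days=0
  case $t in *-*) days=${t%%-*}; t=${t#*-} ;; esac
  local IFS=:
  set -- $t
  if [ $# -eq 3 ]; then
    echo $(( 10#$days * 86400 + 10#$1 * 3600 + 10#$2 * 60 + 10#$3 ))
  else
    echo $(( 10#$1 * 60 + 10#$2 ))
  fi
}
elapsed_to_seconds 01:08:30   # 1 h 8 min 30 s -> 4110
```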