Running the QCD Benchmarks contained in ./part_cpu in the JuBE Framework
==========================================================================

The folder ./part_cpu contains the following subfolders:

     PABS/
     applications/
     bench/
     doc/
     platform/
     skel/
     LICENCE

The applications/ subdirectory contains the QCD benchmark applications.  
The bench/ subdirectory contains the benchmark environment scripts. 
The doc/ subdirectory contains the overall documentation of the framework and a tutorial. 
The platform/ subdirectory holds the platform definitions as well as job submission script templates for each defined platform. 
The skel/ subdirectory contains templates of analysis patterns for the text output of different measurement tools.

Configuration
=============

Definition files are already prepared for many platforms. If you are running on one of these platforms, you can proceed directly; otherwise, please have a look at QCD_Build_README.txt.
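
To check whether your platform is among the predefined ones, you can list the platform definitions and job script templates shipped with the suite. This is only an illustrative command; the exact file names depend on the installed version. From the directory containing ./part_cpu:

>> ls ./part_cpu/platform/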

Execution
=========

Assuming the Benchmark Suite is installed in a directory that can be used during execution, a typical run of a benchmark application consists of two steps:

1. Compiling and submitting the benchmark to the system scheduler.
2. Verifying, analysing and reporting the performance data.

Compiling and submitting
------------------------

If configured correctly, the application benchmark can be compiled and submitted on the system (e.g. the IBM BlueGene/Q at Jülich) with the commands:  

>> cd PABS/applications/QCD
>> perl ../../bench/jube prace-scaling-juqueen.xml
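
Other predefined platforms are driven in the same way, just with the corresponding benchmark definition file from the same directory. As an illustrative sketch (the second file name below is a placeholder; list the directory to see which definition files are actually provided):

>> ls *.xml
>> perl ../../bench/jube prace-scaling-<platform>.xml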

The benchmarking environment will then compile the binary for all defined node/task/thread combinations, if those parameters need to be compiled into the binary. It creates a so-called sandbox subdirectory for each job, ensuring conflict-free operation of the individual applications at runtime. If any input files are needed, they are prepared automatically as defined. 
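
After submission, the freshly created sandbox subdirectories can be inspected from the application directory with a plain directory listing (illustrative only; the naming of the run directories is determined by the benchmark definition):

>> ls -ltd */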

Each active benchmark in the application’s top-level configuration file will receive an ID, which is used as a reference by JUBE later on. 

Verifying, analysing and reporting
----------------------------------

After the benchmark jobs have run, an additional call to jube will gather the performance data. For this, the options -update and -result are used.  

>> cd PABS/applications/QCD
>> perl ../../bench/jube -update -result <ID>

The ID is the reference number the benchmarking environment has assigned to this run. The performance data will then be output to stdout, and can be post-processed from there.
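
Because the report goes to stdout, a simple way to keep it for later post-processing is to redirect it into a file (illustrative only; the output file name is arbitrary):

>> perl ../../bench/jube -update -result <ID> > result_<ID>.txt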