Experiment Data

The Hydra code is divided into several independent libraries: different parts of the code (modules) live in different dynamically linked libraries. A library with the common code (base) is loaded into ROOT by default; all other libraries are loaded only on user request. This way, code that is not needed at a given moment does not consume system memory. To make this possible, the reconstruction program is designed in a modular fashion.
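Hydra itself loads these shared libraries through ROOT; the on-demand pattern, however, is language-neutral. A minimal sketch in Python, using importlib as a stand-in for ROOT's library loader (the module handling below is purely illustrative, not Hydra code):

```python
import importlib
import sys

def load_on_demand(name):
    """Return an already-loaded module, or load it now on first use.

    Mirrors the Hydra idea: only 'base' is resident by default; every
    other library is pulled in when a user actually asks for it.
    """
    if name in sys.modules:                   # already resident, reuse it
        return sys.modules[name]
    return importlib.import_module(name)      # loaded only when needed

# The stdlib 'json' module stands in for an optional detector library.
sys.modules.pop("json", None)                 # start out with it unloaded
assert "json" not in sys.modules
mod = load_on_demand("json")
assert "json" in sys.modules and mod is sys.modules["json"]
```

The point of the pattern is that a second request returns the cached module instead of loading it again, just as ROOT keeps a shared library resident after the first load.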

The event reconstruction proceeds in several steps, in which the data, originally taken from the Data Acquisition System (or from the simulation code), are gradually refined. Each reconstruction step is carried out by an algorithm or modular procedure and delivers one data level.
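This stepwise scheme can be sketched as a chain of independent algorithms, each consuming the previous data level and delivering exactly one new one. The sketch below is a schematic analogy in Python, not Hydra code; the level names after "raw" are illustrative placeholders:

```python
# Placeholder algorithms: each appends the name of the data level it delivers.
def unpacker(levels):   return levels + ["raw"]
def calibrater(levels): return levels + ["cal"]
def hit_finder(levels): return levels + ["hit"]

# One entry per reconstruction step: (data level delivered, algorithm).
STEPS = [("raw", unpacker), ("cal", calibrater), ("hit", hit_finder)]

def reconstruct(event):
    """Run every step in order; each step delivers one data level."""
    for _level, algorithm in STEPS:
        event = algorithm(event)
    return event

assert reconstruct([]) == ["raw", "cal", "hit"]
```

Because every step only depends on the previous data level, individual algorithms can be exchanged or rerun without touching the rest of the chain, which is what the modular library layout above supports.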

The data obtained from the data acquisition are unpacked in order to produce the first data level of the detector: the Raw data. Raw data are stored in binary files with the extension .hld (e.g. xx01114232156.hld or be02340031210.hld).

The unpacking takes the information from the readout electronics, which is organized in channels, and reorders it into data classified in terms of detectors, modules, etc. In a second stage, each detector's data are calibrated in one or several steps. As a result of the calibration we obtain the physical quantities of the detected particles: position, energy loss, time of flight, etc.
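The two stages can be sketched as follows. This is an illustrative Python analogy, not Hydra code: the channel map, the detector names and the linear calibration constants are all hypothetical, chosen only to show the reordering and calibration idea:

```python
# Hypothetical channel map: readout channel number -> (detector, module).
CHANNEL_MAP = {0: ("MDC", 1), 1: ("MDC", 2), 2: ("TOF", 1)}

# Hypothetical per-detector linear calibration: value -> slope * value + offset.
CALIBRATION = {"MDC": (0.1, 0.0), "TOF": (0.25, 5.0)}

def unpack(readout):
    """Reorder channel-indexed readout into data classified by detector and module."""
    by_detector = {}
    for channel, value in readout.items():
        detector, module = CHANNEL_MAP[channel]
        by_detector.setdefault(detector, {})[module] = value
    return by_detector

def calibrate(by_detector):
    """Apply each detector's calibration to obtain physical quantities."""
    out = {}
    for detector, modules in by_detector.items():
        slope, offset = CALIBRATION[detector]
        out[detector] = {m: slope * v + offset for m, v in modules.items()}
    return out

raw = {0: 100, 1: 200, 2: 40}        # electronics view: channel -> raw value
cal = calibrate(unpack(raw))         # physics view: detector -> module -> value
assert cal == {"MDC": {1: 10.0, 2: 20.0}, "TOF": {1: 15.0}}
```

The unpacking step changes only the organization of the data (electronics channels become detector/module entries); the calibration step changes their meaning (raw counts become physical quantities).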

How to run DST

First set up the environment by sourcing the corresponding script, e.g.

. hydv6_15-gcc296

This script sets the environment variables for ROOT 3.03-09, Hydra v6_15 and Oracle 9i (9.0.1).

Copy the files rootlogon.C, analysisDST_gen2.cc, analysisDST.h and analysisDST.make to your working directory.

Before producing the DSTs, compile analysisDST_gen2.cc using the following command:

make -f analysisDST.make

Then run the DST script. The following script is available for the nov02 beam time:

hydra_batch_gen2.sh