Offline Software for the ATLAS Combined Test Beam


Page 1: Offline Software for the ATLAS Combined Test Beam

Computing in High Energy Physics – Interlaken – September 2004 – Ada Farilla

Offline Software for the ATLAS Combined Test Beam

Ada Farilla – I.N.F.N. Roma3

On behalf of the ATLAS Combined Test Beam Offline Team:

M. Biglietti, M. Costa, B. Di Girolamo, M. Gallas, R. Hawkings, R. McPherson, C. Padilla, R. Petti, S. Rosati, A. Solodkov

Page 2: Offline Software for the ATLAS Combined Test Beam


ATLAS Combined Test Beam (CTB)

A full slice of the ATLAS detector (barrel) has been installed on the H8 beam line of the SPS in the CERN North Area: beams of electrons, pions, muons and photons will be delivered in the range 1-350 GeV

Data taking started on May 17th, 2004 and will continue until mid-November 2004: ATLAS is one of the main users of the SPS beam in 2004

The setup spans more than 50 meters and is enabling the ATLAS team to test the detector's characteristics long before the LHC starts

Among the most important goals for this project:

1. Study the detector performance of an ATLAS barrel slice
2. Calibrate the calorimeters over a wide range of energies
3. Gain experience with the latest offline reconstruction and simulation packages
4. Collect data for a detailed data-Monte Carlo comparison
5. Gain experience with the latest Trigger and Data Acquisition packages
6. Test the ATLAS Computing Model with real data
7. Study commissioning aspects (i.e. integration of many sub-detectors, testing the online and offline software with real data, management of conditions data)

Page 3: Offline Software for the ATLAS Combined Test Beam


ATLAS CTB: Layout

The setup includes elements of all the ATLAS sub-detectors:

1. Inner Detector: Pixel, Semi-Conductor Tracker (SCT), Transition Radiation Tracker (TRT)
2. Electromagnetic Liquid Argon Calorimeter
3. Hadronic Tile Calorimeter
4. Muon System: Monitored Drift Tubes (MDT), Cathode Strip Chambers (CSC), Resistive Plate Chambers (RPC), Thin Gap Chambers (TGC)

The DAQ includes elements of the Muon and Calorimeter Trigger (Level 1 and Level 2 Triggers) as well as Level 3 farms

A dedicated internal Fast Ethernet network handles controls. A dedicated internal Gigabit Ethernet network handles the data, which are stored on the CERN Advanced Storage Manager (CASTOR)

About 5000 runs (50 million events) have already been collected so far, with >1 TB of data stored

Page 4: Offline Software for the ATLAS Combined Test Beam


ATLAS CTB: Layout

[Figure: CTB layout as seen by the G4 simulation]

About two hundred people are involved in the effort (physicists, engineers and technicians), with about 40 people in four support groups ensuring the smooth running of the detectors and DAQ

[Figures: the ATLAS detector; partial view of the setup in the H8 area; incoming beam direction]

Page 5: Offline Software for the ATLAS Combined Test Beam


ATLAS CTB: Offline Software

The CTB Offline Software is based on the standard ATLAS software, with adaptations due to the different geometry of the CTB setup w.r.t. the ATLAS geometry. Very little CTB-specific code was developed, mostly for monitoring and sub-detectors' code integration.

About 900 packages are now in the ATLAS offline software suite, residing in a single CVS repository at CERN. Management of package dependencies and of library and executable building is performed with CMT.

The ATLAS software is based on the Athena/Gaudi framework, which embodies a separation between data and algorithms. Data are handled through an Event Store for event information and a Detector Store for conditions information. Algorithms are driven through a flexible event loop. Common utilities (i.e. the magnetic field map) are provided through services. Python JobOptions scripts are used to specify algorithm and service parameters and sequencing, as in the sketch below.
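A minimal sketch of such a JobOptions fragment, assuming the old-style Athena configuration environment in which theApp, Algorithm and MessageSvc are predefined; the package, algorithm and property names below are illustrative placeholders, not actual CTB packages.

    # Illustrative JobOptions fragment (old-style Athena configuration).
    # Package, algorithm and property names are placeholders.
    theApp.Dlls   += [ "MyMonitoring" ]          # load the component library
    theApp.TopAlg += [ "TBMonitorAlg/TBMon1" ]   # schedule the algorithm in the event loop

    TBMon1 = Algorithm( "TBMon1" )               # handle used to set the instance's properties
    TBMon1.NtupleFile = "tbmon.root"             # algorithm property (placeholder)
    TBMon1.ScintillatorThreshold = 0.5           # another illustrative property

    MessageSvc.OutputLevel = INFO                # configure a framework service
    theApp.EvtMax = 1000                         # number of events to process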

Page 6: Offline Software for the ATLAS Combined Test Beam


ATLAS CTB: Software Release Strategy

New ATLAS releases are built approximately every three weeks, following a predefined plan for software development.

New CTB releases are built every week, following a CVS branch of the standard release.

During test beam operation, a concern has been finding a compromise between two conflicting needs: the need for new functionality and the need for stability. The strategy to achieve this was discussed during daily/weekly meetings: a good level of communication between all the involved parties helped in solving problems and making good decisions.

Page 7: Offline Software for the ATLAS Combined Test Beam


ATLAS CTB: Simulation

Simulation of the CTB layout is performed in a flexible way by a package (CTB_G4Sim) similar to the ATLAS G4 simulation package (same code as for the full ATLAS detector). Python JobOptions scripts and macro files allow selecting among different configurations (i.e. with and without magnetic field, different positions of the sub-detectors on the beam line, different rotations of the calorimeters w.r.t. the y axis, etc.), as sketched below. We are in pre-production mode: about 500,000 events have already been produced for validation studies.
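A hypothetical JobOptions-style sketch of such a configuration choice; the options shown (magnetic field, calorimeter rotation, sub-detector position, macro file) follow the text above, but the flag names themselves are placeholders rather than the real CTB_G4Sim options.

    # Hypothetical configuration fragment for the test-beam simulation.
    # The flag names are placeholders; the real CTB_G4Sim options differ.
    SimConfig = Algorithm( "CTB_G4Sim" )        # handle to the simulation configuration
    SimConfig.MagneticFieldOn = True            # run with the bending magnet field on
    SimConfig.CaloRotationY   = 0.0             # calorimeter rotation w.r.t. the y axis (deg)
    SimConfig.InnerDetectorZ  = 0.0             # sub-detector position on the beam line (mm)
    SimConfig.MacroFile       = "ctb_beam.mac"  # Geant4 macro with the beam definition (placeholder)
    theApp.EvtMax = 1000                        # number of events to generate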

[Figure: CTB layout at H8 (2004), with the Inner Detector, Calorimeter and Muon regions laid out along the beam line; x, y, z axes indicated]

The detailed data-Monte Carlo comparison is expected to be one of the major outcomes of the physics analysis

Page 8: Offline Software for the ATLAS Combined Test Beam


ATLAS CTB: Reconstruction

Full reconstruction is performed in a package (RecExTB) that integrates all the sub-detectors’ reconstruction algorithms, through the following steps:

1. Conversion of raw data from the online format to their object representation
2. Access to conditions data
3. Access to ancillary information (i.e. scintillators, beam chambers, trigger patterns)
4. Execution of sub-detector reconstruction
5. Combined reconstruction across sub-detectors
6. Production of Event Summary Data and ROOT ntuples
7. Production of XML files for event visualization using the Atlantis event display
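As an illustration of how such a chain is typically steered, a minimal JobOptions-style sketch with one switch per step; the flag and file names are assumptions in the spirit of RecExTB, not necessarily its exact options.

    # Illustrative steering fragment in the spirit of RecExTB; flag names are assumptions.
    doInDet    = True    # Pixel + SCT + TRT reconstruction
    doLAr      = True    # electromagnetic LAr calorimeter
    doTile     = True    # hadronic Tile calorimeter
    doMuons    = True    # MDT/CSC/RPC/TGC muon reconstruction
    doCombined = True    # combined reconstruction across sub-detectors
    doESD      = True    # write Event Summary Data
    doNtuple   = True    # write the ROOT ntuple
    doAtlantis = True    # write XML files for the Atlantis event display

    include( "RecExTB_CombinedSetup.py" )   # placeholder include pulling in the full chain
    theApp.EvtMax = -1                      # process all events in the input run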

From the full reconstruction of real data we are learning a lot about calibration and alignment procedures and about the performance of new reconstruction algorithms.

Moreover, we expect to gain experience in exercising the available tools for Physics Analysis, aiming at their widespread use in a community much larger than that of the software developers: a non-negligible by-product from an educational point of view

Page 9: Offline Software for the ATLAS Combined Test Beam


ATLAS CTB: (very) preliminary results

[Plots: energy in the Tile Calorimeter, E(TileCal), for events with and without a track in the Muon system; Z0 parameter correlation for tracks in the Inner Detector (Pixels+TRT) versus the Muon system, ID Z0 (mm) vs Muon Z0 (mm); energy in the e.m. LAr Calorimeter for 250 GeV electrons]

Page 10: Offline Software for the ATLAS Combined Test Beam


ATLAS CTB: Detector Description

The detailed description of the detectors and of the experimental set-up is a key issue for the CTB software, along with the possibility to handle different experimental layouts in a coherent way (versioning).

The NOVA MySQL database is presently used for storing detector description information, and in a few weeks this information will also be fully available in ORACLE.

The new ATLAS detector description package (GeoModel) has also been used extensively, both for simulation and reconstruction.

Page 11: Offline Software for the ATLAS Combined Test Beam


ATLAS CTB: Conditions Database

Three components:
• Lisbon Interval-Of-Validity Database (IoVDB, MySQL implementation): data stored internally as blobs (XML strings) or tables, or as external references to the POOL or NOVA databases
• POOL for large calibration objects
• NOVA for Liquid Argon information

The management of large volumes of conditions data (i.e. slow control data, calibration and alignment parameters, run parameters) is a key issue for the CTB. We are now using an infrastructure that will help in defining the long-term ATLAS solution for the conditions database; the interval-of-validity idea is sketched below.
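To make the interval-of-validity idea concrete, a conceptual Python sketch of the lookup the conditions database performs; this is not the Lisbon IoVDB API, just an illustration of its data model (payloads stored as XML blobs or as external POOL/NOVA references, each tagged with a validity interval).

    # Conceptual sketch of an interval-of-validity lookup (not the Lisbon IoVDB API).
    from dataclasses import dataclass

    @dataclass
    class CondRecord:
        since: int       # start of validity (run number or timestamp)
        until: int       # end of validity (exclusive)
        payload: str     # XML blob, or an external reference to POOL/NOVA

    def find_valid(records, event_time):
        """Return the payload whose validity interval contains event_time."""
        for rec in records:
            if rec.since <= event_time < rec.until:
                return rec.payload
        return None      # no conditions stored for this time

    # Example folder with two calibration versions covering different run ranges.
    folder = [ CondRecord(1000, 2000, "<larCalib version='1'/>"),
               CondRecord(2000, 3000, "pool:LArCalib_v2") ]
    print( find_valid(folder, 2450) )   # -> "pool:LArCalib_v2"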

Page 12: Offline Software for the ATLAS Combined Test Beam


ATLAS CTB: Conditions Database

[Diagram: conditions database replication. The online server (atlobk01) and the offline server (atlobk02) each host Offline Bookkeeping (OBK) databases, CondDB CTB databases, NOVA and a POOL catalogue (plus test DBs on the online server). Data acquisition programs write online, while browsing tools and Athena applications read the servers. Database replication between the two uses the MySQL binary log and is unidirectional, online → offline only.]
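A small monitoring sketch, assuming the offline replica can be queried with the standard MySQLdb Python module; only the host name comes from the diagram above, while the user, password and exact check are placeholder assumptions.

    # Illustrative check that binary-log replication to the offline server is healthy.
    # Credentials are placeholders; only the host name is taken from the diagram above.
    import MySQLdb, MySQLdb.cursors

    conn = MySQLdb.connect( host="atlobk02", user="monitor", passwd="***",
                            cursorclass=MySQLdb.cursors.DictCursor )
    cur = conn.cursor()
    cur.execute( "SHOW SLAVE STATUS" )       # replica-side view of the replication threads
    status = cur.fetchone()
    if status is None:
        print( "replication is not configured on this server" )
    else:
        running = ( status["Slave_IO_Running"] == "Yes" and
                    status["Slave_SQL_Running"] == "Yes" )
        print( "replication running:", running,
               "- seconds behind master:", status["Seconds_Behind_Master"] )
    conn.close()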

Page 13: Offline Software for the ATLAS Combined Test Beam


ATLAS CTB: Production Infrastructures

GOAL: we have to deal with a large data set (more than 5000 runs, 10⁷ events) and be able to perform many reprocessings in the coming months. A production infrastructure is in place to achieve this quickly and in a centralized way.

A sub-set of the information from the Conditions Database is copied to the bookkeeping database AMI (ATLAS Metadata Interface): run number, file name, beam energy, etc. Other run information is entered by the shifter.

AMI (a Java application) provides a generic read-only web interface for searching. It is directly interfaced to AtCom (ATLAS Commander), a tool for automated job definition, submission and monitoring, developed for the CERN LSF batch system and widely used during "ATLAS Data Challenge 1" in 2003.
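As a rough stand-in for what automated job definition and submission amounts to (this is not AtCom itself; the run list, job options file, queue name and bsub options are placeholder assumptions):

    # Illustrative stand-in for automated job definition and LSF submission (not AtCom).
    # Run list, job options file and queue name are placeholders.
    import subprocess

    runs = [2102, 2103, 2107]    # runs selected from the bookkeeping DB, e.g. by beam energy
    for run in runs:
        job = "athena.py -c 'RunNumber=%d' RecExTB_jobOptions.py" % run
        # bsub is the LSF submission command; queue and log file names are assumptions.
        subprocess.run( ["bsub", "-q", "atlasctb", "-o", "rec_%d.log" % run, job],
                        check=True )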

Page 14: Offline Software for the ATLAS Combined Test Beam


ATLAS CTB: Production Infrastructures

AMI, AtCom and a set of scripts represent the "production infrastructure" presently used both for real-data reconstruction and for the Monte Carlo pre-production done at CERN with Data Challenge 1 tools.

The “bulk” of Monte Carlo production will be performed on the Grid (“continuous production”), using the infrastructure presently used for running “ATLAS Data Challenge 2”.

Both production infrastructures will be used in the coming months, operated by two different production groups.

Page 15: Offline Software for the ATLAS Combined Test Beam


ATLAS CTB: High Level Trigger (HLT)

HLT = combination of Level2 and Level3 Triggers

HLT at the CTB as a service: it provides the possibility to run “high-level monitoring” algorithms executed in the Athena processes running in the Level-3 farms. It is the first place in the data flow where correlations can be established between objects reconstructed in the separate sub-detectors.

HLT at the CTB as a client: complex selection algorithms are under test, with the aim of having two full slices (one slice for electrons and photons, one slice for muons) running LVL1 -> LVL2 -> LVL3 and producing separate outputs in the Sub-Farm Output for the different selected particles.

Page 16: Offline Software for the ATLAS Combined Test Beam


Conclusions

The Combined Test Beam is the first opportunity to exercise the “semi-final” ATLAS software with real data in a complex setup

All the software has been integrated and is running successfully, including the HLT: centralized reconstruction has already started on limited data samples and preliminary results are already available, showing good quality

Calibration, alignment and reconstruction algorithms will continue to improve and to be tested with the largest data sample ever collected in a test beam environment (O(10⁷) events)

It has been a first, important step towards the integration of different sub-detectors and of people from different communities (software developers and sub-detector experts)