TRANSCRIPT
Computing for LHC
Dr. Wolfgang von Rüden, CERN, Geneva
ISEF students visit CERN, 28th June - 1st July 2009
LHC Computing
The LHC Computing Challenge
• Signal/Noise: 10⁻⁹
• Data volume: high rate × large number of channels × 4 experiments → 15 PetaBytes of new data each year
• Compute power: event complexity × number of events × thousands of users → 100k of (today's) fastest CPUs, 45 PB of disk storage
• Worldwide analysis & funding: computing funded locally in major regions & countries; efficient analysis everywhere → GRID technology
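The figures above can be sanity-checked with simple arithmetic. A back-of-envelope sketch (the 15 PB/year, 45 PB disk, and 100k CPU numbers come from the slide; the derived rates are mine):

```python
# Back-of-envelope check of the LHC computing scale quoted above.
SECONDS_PER_YEAR = 365 * 24 * 3600   # ~3.15e7 s

new_data_bytes_per_year = 15 * 10**15    # 15 PB/year (from the slide)
avg_rate_mb_s = new_data_bytes_per_year / SECONDS_PER_YEAR / 10**6
print(f"Average recording rate: {avg_rate_mb_s:.0f} MB/s")  # ≈ 476 MB/s

disk_bytes = 45 * 10**15                 # 45 PB of disk (from the slide)
cpus = 100_000                           # 100k CPUs (from the slide)
print(f"Disk per CPU: {disk_bytes / cpus / 10**9:.0f} GB")  # 450 GB per CPU
```

The averaged recording rate is close to the 650 MB/s peak export rate quoted later in the talk, which is consistent with data-taking not running continuously all year.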
Wolfgang von Rüden, CERN 4
Particle collisions in the centre of a detector
June 2009
Massive Online Data Reduction
Tier 0 at CERN: Acquisition, First-pass processing, Storage & Distribution
Tier 0 – Tier 1 – Tier 2
Tier-0 (CERN):
• Data recording
• Initial data reconstruction
• Data distribution

Tier-1 (11 centres):
• Permanent storage
• Re-processing
• Analysis

Tier-2 (~130 centres):
• Simulation
• End-user analysis
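As an illustration only (the table structure and names are mine, not WLCG software), the tier model above can be captured in a small lookup:

```python
# Hypothetical sketch of the WLCG tier model described above; not actual WLCG code.
TIER_ROLES = {
    "Tier-0": ["data recording", "initial reconstruction", "data distribution"],
    "Tier-1": ["permanent storage", "re-processing", "analysis"],
    "Tier-2": ["simulation", "end-user analysis"],
}
TIER_COUNT = {"Tier-0": 1, "Tier-1": 11, "Tier-2": 130}  # Tier-2 count is approximate

for tier, roles in TIER_ROLES.items():
    print(f"{tier} ({TIER_COUNT[tier]} centre(s)): {', '.join(roles)}")
```

The point of the hierarchy is separation of concerns: raw data flows down from the single Tier-0, while simulation and user analysis are pushed out to the many smaller Tier-2s.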
Recent grid activity
These workloads are at the level anticipated for 2009 data. In readiness testing, WLCG ran more than 10 million jobs/month (1 job is ~8 hours' use of a single processor), i.e. about 350k jobs/day.
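The two figures above are consistent with each other. A quick check (the 10 million jobs/month and 8 CPU-hours/job are from the slide; the 30-day month is my assumption):

```python
jobs_per_month = 10_000_000      # from the slide
days_per_month = 30              # assumption for the conversion
jobs_per_day = jobs_per_month / days_per_month
print(f"{jobs_per_day:,.0f} jobs/day")  # ≈ 333,333, i.e. roughly the quoted 350k/day

# 1 job ~ 8 CPU-hours, so the sustained load in continuously busy processors:
cpu_hours_per_month = jobs_per_month * 8
cpus_busy = cpu_hours_per_month / (days_per_month * 24)
print(f"≈ {cpus_busy:,.0f} processors kept busy around the clock")
```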
Data transfer out of Tier0
• Full experiment rate needed is 650 MB/s
• Desired capability: sustain twice that, to allow a Tier-1 site to shut down and recover
• Far in excess of that has been demonstrated:
• All experiments exceeded the required rates for extended periods, and simultaneously
• All Tier-1s achieved (or exceeded) their target acceptance rates
Tier-2s and Tier-1s are inter-connected by the general-purpose research networks. Any Tier-2 may access data at any Tier-1.
[Diagram: the 11 Tier-1 centres — IN2P3, TRIUMF, ASCC, FNAL, BNL, Nordic, CNAF, SARA, PIC, RAL, GridKa — each connected to multiple Tier-2 sites]
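The 650 MB/s requirement translates into daily volumes as follows (the rates are from the slide; the conversion is mine):

```python
required_mb_s = 650                  # full experiment rate (from the slide)
target_mb_s = 2 * required_mb_s      # 2x headroom for Tier-1 recovery (from the slide)

SECONDS_PER_DAY = 24 * 3600
tb_per_day = required_mb_s * SECONDS_PER_DAY / 10**6
print(f"Required: {tb_per_day:.1f} TB/day sustained out of Tier-0")      # ≈ 56.2 TB/day
print(f"Target capability: {target_mb_s * SECONDS_PER_DAY / 10**6:.1f} TB/day")
```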
WLCG depends on two major science grid infrastructures:
• EGEE – Enabling Grids for E-sciencE
• OSG – US Open Science Grid
… as well as many national grid projects.

Interoperability & interoperation are vital, and significant effort has gone into building the procedures to support them.
Enabling Grids for E-sciencE
EGEE-II INFSO-RI-031688
240 sites, 45 countries, 45,000 CPUs, 12 PetaBytes, >5,000 users, >100 VOs, >100,000 jobs/day
Archeology, Astronomy, Astrophysics, Civil Protection, Comp. Chemistry, Earth Sciences, Finance, Fusion, Geophysics, High Energy Physics, Life Sciences, Multimedia, Material Sciences, …
[Charts: No. CPU (rising to ~45,000) and No. Sites (rising to ~240), Apr-04 to Jul-07]
Grid infrastructure project co-funded by the European Commission, now in its 3rd phase with over 100 partners.