1 Introduction

While the Standard Model of elementary particle physics has withstood all experimental tests since its inception, there is ample reason to believe that it is an effective theory that arises as a special case of a more fundamental description of nature. It is theoretically unwieldy, with a large number of free parameters. Glaringly, it fails to provide an explanation for gravitation, and cosmological observations suggest that large parts of the energy of the universe are not accounted for by any form of matter known in the Standard Model. While many theories exist to extend the Standard Model in theoretically more appealing directions, the final arbiter of their value remains the comparison with experimental data. To that end, the Large Hadron Collider (LHC), currently being commissioned at CERN, is designed to explore the landscape of particle physics at energies an order of magnitude higher than previous machines. Chapter 2 reviews some of the expected features of this landscape from an experimental point of view, both for the Standard Model and some candidate theories beyond the Standard Model. The LHC machine and the Compact Muon Solenoid (CMS) experiment, one of its two large general-purpose detectors, form the experimental context of this work. They are described in Ch. 3.

A major technological challenge in the analysis of particle collisions is event selection. As early as 1955, the ability of experiments to generate events outstripped the ability of humans to analyze them, causing Luis Alvarez to complain that "one day of bubble chamber operation could keep a group of 'cloud chamber analysts' busy for a year" [1]. This launched particle physics experiments on a trajectory of ever greater automation and computerization, culminating in the current situation at the LHC. When the design luminosity of L = 1.0 × 10³⁴ cm⁻² s⁻¹ is reached, the experiments will observe an event (usually containing multiple proton-proton interactions) for every 25 ns interval. With the measurement capabilities of the CMS experiment, this amounts to a data volume of about 40 TB/s, far beyond any storage capabilities currently feasible. In addition, much of this data deluge consists of well-known processes and is thus of little interest for the physics program. Since actual storage capability is limited to about 100 events per second, a rejection factor of almost 10⁻⁶ has to be achieved in the event selection.

The CMS experiment's approach to this problem relies on two stages. First, all detector data is preserved in buffer memories while a dedicated electronics system makes a preliminary decision to accept or reject the event, based on coarse measurements. This decision has to be rendered within 3.2 µs due to limited buffer space and is expected to reduce the event rate by a factor of about 10⁻³. This system is the CMS Level-1 Trigger, described in detail in Ch. 4. In a second stage, the events accepted by the Level-1 Trigger are examined by the High-Level Trigger (HLT), implemented in software running on general-purpose CPUs, which achieves another 10⁻³ rejection.
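
The rate and rejection figures quoted above can be reproduced with a short back-of-the-envelope calculation, sketched below. The sketch is illustrative rather than part of the original argument; in particular, the event size of roughly 1 MB is an assumption inferred from the quoted 40 TB/s, not a number stated in the text.

    # Back-of-the-envelope check of the trigger figures quoted above.
    # ASSUMPTION: ~1 MB/event, inferred from the quoted 40 TB/s. All other
    # inputs (25 ns spacing, 100 Hz storage, ~10^-3 per stage) come from the text.
    bunch_spacing = 25e-9                    # s between events at design luminosity
    event_rate = 1.0 / bunch_spacing         # 4.0e7 events/s (40 MHz)
    event_size = 1.0e6                       # bytes/event (assumed)
    data_rate = event_rate * event_size      # 4.0e13 B/s, i.e. 40 TB/s
    storage_rate = 100.0                     # events/s that can be written out
    acceptance = storage_rate / event_rate   # 2.5e-6, i.e. "almost 10^-6"
    l1 = 1e-3                                # Level-1 reduction factor from the text
    hlt = acceptance / l1                    # 2.5e-3, consistent with "another 10^-3"
    print(f"rate = {event_rate:.1e} Hz, data volume = {data_rate / 1e12:.0f} TB/s")
    print(f"overall acceptance = {acceptance:.1e}, HLT share = {hlt:.1e}")

The two quoted stage factors multiply to the required overall selectivity: a Level-1 reduction of about 10⁻³ followed by an HLT reduction of about 10⁻³ yields the overall acceptance of almost 10⁻⁶.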
