Computing for HEP projects
Master work in Particle Physics - Computing for HEP projects
Modern Particle Physics is also appropriately called High Energy Physics, because it relies upon accelerators delivering high-energy particle collisions in order to generate data. Gone are the days when particle physicists climbed mountains with photographic plates, hoping to register rare particles coming from collisions of cosmic rays with elements in the Earth's atmosphere. Even though such collisions can have very high energies, they are rare and quite unpredictable. Accelerators allow us to recreate such collisions in a predictable and abundant manner. The Large Hadron Collider at CERN is the latest in the line of large particle colliders, delivering high-quality research data.
But this amazing technological achievement comes with a new challenge: the data need to be collected and processed. While photographic plates were processed at a slow pace and analysed with the naked eye, modern particle detectors are like huge digital three-dimensional photo cameras, with resolution reaching microns, producing thousands of "frames" per second. Handling this immense load of data in such a manner that scientists can obtain results rapidly requires very specialised computing, storage and software infrastructures.
Our group contributes to the solution of this data problem in several ways. We develop specialised software that brings together data centres from around the world, helping to create a computing infrastructure that is distributed worldwide. The software is called ARC - the Advanced Resource Connector - and the infrastructure is called the WLCG - the Worldwide LHC Computing Grid. We also contribute to the operation of the Nordic Tier-1 data centre of the WLCG, which can be seen as a part of CERN in Sweden. And of course, Lund researchers develop software that helps process data from the LHC, as well as model the collisions and detectors.
Diploma projects in computing are typically done in conjunction with one of the experiments, such as ATLAS or LDMX, or may concern generic software that is common to all experiments. As the projects often involve code development, students are expected to be familiar with Linux and at least one programming language. Knowledge of C++ is most beneficial, though Python, C, Java and other languages are also relevant.
Examples of projects are:
- Studies of optimal software algorithms and tools for data analysis, including artificial neural networks, programming for new processor architectures, etc.
- Optimisation of existing software, such as the GEANT4 detector simulation toolkit
- Development of analysis or simulation workflows for distributed computing infrastructures
- Development of tools to support distributed data processing or storage