
The issue of storing large quantities of data is already challenging for research institutions, and it will become increasingly difficult over the next decade as more data are generated on top of the data already requiring storage. CERN, as the largest particle physics laboratory in the world, faces a particularly critical problem in this regard; with specific reference to the ATLAS experiment, this is further complicated by the upcoming High-Luminosity Large Hadron Collider upgrade in 2027. Addressing this issue requires novel models for the optimal utilisation of existing storage systems as well as the integration of further storage solutions. However, progress is hindered by the absence of test beds adequate to the scale of the problem, and by an overall lack of research into alternative storage models at the exabyte scale.

The primary aim of this thesis was to extend the research in this area towards new models and commercial cloud storage. The thesis therefore explored the combination of the Data Carousel model, which is currently being evaluated at CERN, and the Hot/Cold Storage model, which is intended to mitigate certain disadvantages of the Data Carousel model. The GACS simulation tool was developed as part of this thesis and then used to evaluate the new model combination. The validation of GACS showed a difference between real-world and simulated data of at most 3.3%. Together, the new model and GACS provide a foundation for further investigation of cost-effective and efficient data storage methods at the exabyte scale by current R&D programmes at CERN.