CMS data archival at MIT

In 2024, CMS collected around 50 PB of raw data and generated more than 75 PB of simulated data. Together with prompt processing, this amounts to more than 150 PB of new data. Using such an enormous amount of data…
David is our new SubMIT project leader, following Josh Bendavid’s departure for his permanent CERN position. Today he passed his first very important challenge: he installed his first ever Linux server, with Pietro as his main helper,…
The SubMIT project team recently completed a significant migration from a Gluster-based to a Ceph-based file system. This move marks a critical step in ensuring that the SubMIT system can handle the increasing demands of research and…
In 2017, the Large Hadron Collider (LHC) achieved a groundbreaking milestone in instantaneous luminosity, delivering data at twice the rate it was originally designed for. This presented a new challenge for the CMS detector: efficiently selecting and retaining significant events…
The Annual High Throughput Computing workshop took place in Madison, Wisconsin, from July 8 to July 12. The conference included a session that gave the US CMS Computing and Software organization an opportunity to hold its annual all-hands…
At the end of March (18-22), CMS held a major internal workshop (Computing and Offline Week) dedicated to data processing, distribution, and analysis. This workshop occurs twice a year and brings together experts from various laboratories, computing…