The Large Hadron Collider (LHC) near Geneva, Switzerland, is the largest and most energetic particle collider in the world. Since it was activated in 2008, it has collided nearly a quadrillion protons. When particles collide, they shatter, annihilate, and completely reorganize into a firework of new particles flying out in all directions, producing hundreds of millions of gigabytes of data. Physicists have used this data to confirm predictions and to test new theories.
The sheer amount of data output from the LHC allows for many research opportunities. The dataset is so large that even a rarely produced new particle would likely have shown up in it by now; the catch is that data on this scale is extremely hard to analyze exhaustively. Almost all of the analysis of LHC data has been done by the experimentalists who collect it, but this past week a team of five theorists from Harvard, MIT, and CERN performed a new style of analysis that could be used to search for as-yet-undiscovered particles. The analysis was conducted on Open Data, data taken at the LHC that has been made freely available to the public. Calculations suggest that hypothetical new particles would register with a higher energy or momentum than known, standard particles. The theorists showed that by requiring particles recorded in the data to be above a certain momentum, they could more easily pick out such new particles. While they did not find any new particles, experimentalists can adopt their techniques in future particle searches.
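The momentum cut described above can be sketched in a few lines of code. This is a minimal, hypothetical illustration, not the team's actual analysis: the particle records and the 50 GeV threshold are invented for the example.

```python
# Hypothetical sketch of a momentum cut: keep only particles whose
# momentum exceeds a chosen threshold. The records below and the
# 50 GeV cut value are illustrative, not taken from the Open Data study.

# Each record is (particle label, momentum in GeV).
particles = [
    ("pion", 12.0),
    ("muon", 55.0),
    ("jet", 210.0),
    ("photon", 8.5),
]

MOMENTUM_CUT = 50.0  # GeV; illustrative threshold


def apply_momentum_cut(records, threshold):
    """Return only the records whose momentum is above the threshold."""
    return [r for r in records if r[1] > threshold]


high_momentum = apply_momentum_cut(particles, MOMENTUM_CUT)
print(high_momentum)  # only the muon and the jet survive the cut
```

Filtering out low-momentum particles this way shrinks the dataset to the events most likely to contain something new, which is what makes an otherwise unmanageable volume of data searchable.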
This analysis was important for two reasons: it provided a proof of concept for a new style of analysis, and it showed that old data can be used to derive new results. In an era when the next big collider will cost billions of dollars and take decades to build, the ability to reuse old data is invaluable. It also sets an optimistic precedent for future collaborations between theorists and experimentalists.
Managing Correspondent: Cari Cesarotti
Image Courtesy of CERN