CERN is nothing like today's agentic AI jockeys, who mostly rely on pre-set weights and generic TPUs and GPUs to generate their slop. CERN burns custom nanosecond-speed AI into the silicon itself just to eliminate excess data.

Like the major league pitcher who shows up for his kid's take-your-parent-to-school day, CERN's Thea Aarrestad gave a presentation at the virtual Monster Scale Summit earlier this month about meeting a set of ultra-stringent requirements that few of her peers will ever experience.

Aarrestad is an assistant professor of particle physics at ETH Zurich. At CERN (the European Organization for Nuclear Research), she uses machine learning to optimize data collection from the Large Hadron Collider (LHC). Her specialty is anomaly detection, a core component of any proper observability system.

Each year, the LHC produces 40,000 exabytes of unfiltered sensor data alone, or about a fourth of the size of the entire internet, Aarrestad estimated. CERN can't store all that data. As a result, "We have to reduce that data in real time to something we can afford to keep."

By "real time," she means extreme real time. The LHC detector systems process data at speeds of up to hundreds of terabytes per second, far more than Google or Netflix handle, and those companies' latency requirements are far easier to hit as well.

"Algorithms processing this data must be extremely fast," Aarrestad said. So fast that decisions must be burned into the chip design itself.

Smash burgers

Contained in a 27-kilometer ring located a hundred meters underground beneath the border of Switzerland and France, the LHC smashes subatomic particles together at near-light speeds. The resulting collisions are expected to produce new types of matter that fill out our understanding of the Standard Model of particle physics, the operating system of the universe.

At any given time, there are about 2,800 bunches of protons whizzing around the ring at nearly the speed of light, separated by 25-nanosecond intervals.
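The idea of reducing a torrent of data to the few events worth keeping can be illustrated with a toy trigger-style filter. This is a minimal sketch only, assuming a made-up anomaly score over invented event records; the names, values, and threshold here are hypothetical and bear no relation to CERN's actual trigger algorithms, which run in hardware at nanosecond speeds.

```python
import random

# Toy illustration of trigger-style data reduction: score every event in
# a stream and keep only the rare outliers, discarding the ordinary bulk.
# All names and numbers below are invented for this sketch.

random.seed(42)

def anomaly_score(event):
    """Toy score: how far the event's energy sits from the typical value."""
    typical_energy = 100.0
    return abs(event["energy"] - typical_energy)

def trigger_filter(events, threshold=50.0):
    """Keep only events whose anomaly score exceeds the threshold."""
    return [e for e in events if anomaly_score(e) > threshold]

# A stream of 10,000 mostly ordinary events, plus two planted anomalies.
events = [{"energy": random.gauss(100.0, 10.0)} for _ in range(10_000)]
events += [{"energy": 500.0}, {"energy": 700.0}]

kept = trigger_filter(events)
print(f"kept {len(kept)} of {len(events)} events")
```

With the threshold set five standard deviations out, essentially none of the ordinary events survive and only the planted anomalies are kept, which is the shape of the problem: throw away almost everything, as fast as the data arrives.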
Just befor...