Can you weigh an elephant before and after it picks up a one-rupee coin, and tell the difference? You can if the measurement is precise enough. To detect a 4-gram coin on a 4-tonne creature, the scales must have a resolution of at least 1 part per million (ppm). Precise measurements like these routinely lead to startling discoveries in fundamental physics. This is what the physicist Albert Michelson meant by "The future truths of physical science are to be looked for in the sixth place of decimals".
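The arithmetic behind that resolution figure is worth making explicit; a minimal sketch, using only the 4-gram and 4-tonne figures from the paragraph above:

```python
# Resolution needed to detect a 4-gram coin on a 4-tonne elephant.
coin_g = 4.0              # mass of the coin, in grams
elephant_g = 4.0e6        # 4 tonnes = 4,000,000 grams
resolution_ppm = coin_g / elephant_g * 1e6
print(resolution_ppm)     # 1.0, i.e. 1 part per million
```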
This June, the Muon g-2 (pronounced "gee minus two") experiment at Fermilab in the US presented its highly anticipated final results. With data collected over three years and the involvement of more than 170 physicists, the collaboration measured a peculiar property of a subatomic particle called the muon with an unprecedented precision of 0.127 ppm, outdoing its stated goal of 0.140 ppm.

Usually, physicists test such measurements against the prediction of the Standard Model, the theory of subatomic particles that predicts their properties. If they don't match, the measured value would hint at the presence of unseen forces. But on this occasion, there are two ways to make theoretical predictions about this property of the muon. One of them is consistent with the Fermilab experiment and the other is far off. Nobody knows which is really correct, and an intriguing drama has been unfolding over the past few years with no clear resolution in sight.
g minus 2
The muon is an elementary particle that mimics the electron in every trait except for being 207 times heavier. Discovered in 1936 in cosmic rays, its place in the pattern of the Standard Model was, and still is, something of a mystery, prompting the physicist Isidor Rabi to remark: "Who ordered that?"
The muon carries non-zero quantum spin, which means it behaves like a tiny magnet. The strength of this magnet, called the magnetic moment, is captured by a quantity called the g-factor. In a high-school calculation, g would be exactly 2, but in the advanced theory its value drifts a tiny bit from 2 due to quantum field effects. It is this so-called anomalous magnetic moment that the Muon g-2 experiment painstakingly ascertained.
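In symbols, the quantity the experiment targets is the anomaly, the fractional deviation of g from 2. Its largest quantum correction, first computed by Julian Schwinger in 1948 and standard in any quantum field theory textbook, already shows why g exceeds 2 only slightly:

\[
a_\mu \;=\; \frac{g-2}{2}
\;\approx\; \frac{\alpha}{2\pi} \;\approx\; 0.00116,
\]

where \(\alpha \approx 1/137\) is the fine-structure constant. Subsequent corrections, smaller still, are what theorists have been refining for decades.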
Measurements of the g-2 of the muon were first made at CERN in Europe, and the results were published in 1961 with a precision of 4,000 ppm. Over the following 20 years at CERN, the precision was improved to 7 ppm. Matters took an interesting trans-Atlantic turn when the Muon E821 experiment at the Brookhaven National Laboratory in the US took data between 1997 and 2001 and achieved a precision of 0.540 ppm, which was similar to the uncertainty in the theoretical prediction. In other words, the two numbers, the theoretical calculation and the observed value, could be meaningfully compared.
And lo and behold, they considerably disagreed.
Clash with theory
Nothing appeared to be amiss in the experiment, so many physicists wagered that the disagreement was a hint of 'new physics'. Numerous explanations involving theories beyond the Standard Model poured into the literature over the following 20 years. At the same time, theoretical physicists got down to refining the Standard Model prediction itself, which was no mean task.
So stood affairs until Fermilab started measuring g-2 in 2017. When it had collected just 6% of the total intended data by 2021, it had already reached a precision of 0.460 ppm, comparable to E821's. This first result was in such excellent agreement with the E821 data that when the two results were combined, the discrepancy with theory deepened to worrying levels.
But the spectacle didn't end there. On the very day that Fermilab announced this result to much fanfare, there quietly appeared in Nature a new paper in which a group of physicists, called the Budapest-Marseille-Wuppertal (BMW) lattice collaboration, argued that there may be no gap between the theoretical and experimental values after all.
Theorists compute the muon g-2 using either (i) Feynman diagrams, a tool that has served calculations in quantum field theory for three-quarters of a century, or (ii) the so-called lattice, a supercomputer simulation of spacetime as a discretised grid that represents quantum fields. Both approaches are technically very challenging in this context.
The final results from Fermilab in June are less complicated. They are consistent with its earlier announcements; it is their difference with theory that remains unsettled.

An old friend
The experimental setup itself was ingenious. A beam of anti-muons is injected into a 15-m-wide ring with a uniform magnetic field. There the antiparticles make circular orbits with a characteristic frequency. Meanwhile, the antiparticles' spin vectors, a fundamental property of theirs, rotate in the magnetic field like spinning tops, with a certain spin frequency. The central trick is to measure the difference between the frequency of the circular orbits and the spin frequency. This difference carries direct information about the muon's g-2 value.
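A rough back-of-the-envelope version of this frequency difference can be sketched with textbook values. The formula below is the standard non-relativistic expression for the anomalous precession frequency (relativistic corrections cancel at the experiment's so-called magic momentum, and electric-field terms are ignored); the 1.45-tesla field strength is Fermilab's published value for the storage ring:

```python
import math

# Anomalous precession: the muon's spin turns faster than its momentum
# by omega_a = a_mu * e * B / m_mu. This excess rotation is what the
# experiment actually clocks.
a_mu = 1.166e-3          # anomalous magnetic moment, (g-2)/2
e = 1.602176634e-19      # elementary charge, in coulombs
B = 1.45                 # storage-ring magnetic field, in teslas
m_mu = 1.883531627e-28   # muon mass, in kilograms

omega_a = a_mu * e * B / m_mu    # angular frequency, rad/s
f_a = omega_a / (2 * math.pi)    # ordinary frequency, Hz
print(f"{f_a / 1e3:.0f} kHz")    # roughly 229 kHz
```

That the spin pulls ahead of the orbit a few hundred thousand times a second, in a predictable way, is what lets a sub-ppm measurement of g-2 accumulate from three years of data.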
Both E821 and Fermilab operated on this principle. This is not inconsequential: Fermilab reused part of the E821 equipment, and thus some unknown defects may have made their presence felt in the Fermilab data as well. This is why it is essential to have a completely independent measurement by a different experimental technique. An upcoming effort at the Japan Proton Accelerator Research Complex will do just this.
Uncertainty is an old friend of fundamental physics. It has always borne the promise of an imminent disclosure of a deep secret of nature. We wait now for the next word from the theorists and hope that the jury will soon be in.
Nirmal Raj is an assistant professor of theoretical physics at the Centre for High Energy Physics in the Indian Institute of Science, Bengaluru.



