Graduate Student Tames Quantum Interference in LHC Data

Introduction
The Large Hadron Collider (LHC) at CERN is the world’s highest-energy particle accelerator, producing petabytes of collision data every year. Yet one of the most profound obstacles in interpreting these data is quantum interference, an intrinsic phenomenon of quantum mechanics that causes overlapping particle-interaction histories to amplify or cancel one another. Conventional statistical methods, even those augmented by machine-learning classifiers, have struggled to extract the full physics content of these measurements. In 2025, the ATLAS collaboration published two landmark papers introducing Neural Simulation-Based Inference (NSBI), a deep-learning-driven framework that estimates likelihood ratios directly, without discrete signal/background classification. The approach, pioneered by graduate student Aishik Ghosh and colleagues, has already rewritten ATLAS’s projections for Higgs-boson precision measurements and will shape the High-Luminosity LHC (HL-LHC) era.
Quantum Interference: The Core Challenge
Quantum interference arises when multiple amplitudes, or “histories,” lead from the same initial state to the same final state. In the classic double-slit experiment, electrons traversing two slits produce an interference pattern on a screen. At the LHC, interference occurs between Feynman diagrams with and without intermediate Higgs production. For instance, when two incoming protons each emit a W boson, those bosons can either fuse into a Higgs boson that subsequently decays into two Z bosons, or they can convert directly into Z bosons without the Higgs mediator. These amplitudes interfere in the cross section σ, with the total rate given by
σ_total = |A_signal + A_background|² = |A_signal|² + |A_background|² + 2Re(A_signal A*_background).
Standard methods slice events into “signal” and “background” histograms, but the interference term 2Re(A_signal A*_background) induces cancellations and enhancements that these slices cannot fully capture.
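To make the size of that cross term concrete, here is a minimal numeric sketch; the amplitude magnitudes and phases are hypothetical, chosen purely for illustration and not taken from any ATLAS measurement:

```python
import numpy as np

# Toy complex amplitudes; values are hypothetical, for illustration only.
A_signal = 1.0 * np.exp(1j * 0.0)         # Higgs-mediated amplitude
A_background = 0.8 * np.exp(1j * np.pi)   # continuum amplitude, opposite phase

incoherent = abs(A_signal) ** 2 + abs(A_background) ** 2      # 1.64
interference = 2 * np.real(A_signal * np.conj(A_background))  # -1.60
coherent_total = abs(A_signal + A_background) ** 2            # 0.04

print(f"incoherent sum:    {incoherent:.2f}")
print(f"interference term: {interference:.2f}")
print(f"coherent total:    {coherent_total:.2f}")
```

With these toy values the interference term wipes out nearly the entire incoherent rate, which is exactly the regime where histogram slicing loses information.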
From Classification to Direct Inference
ATLAS physicists traditionally employed supervised classifiers—boosted decision trees (BDTs) or deep neural networks—to separate signal‐like from background‐like events. However, in the presence of interference, the concept of a pure “signal event” vanishes: a Higgs‐mediated decay may be completely canceled by its non‐Higgs counterpart in a particular kinematic region. Training a classifier on mixed samples leads to biased likelihood estimates and inflated uncertainties.
Aishik Ghosh’s insight was to bypass classification altogether. Instead, he adopted NSBI, where a neural network is trained on simulated datasets parameterized by physics parameters θ (e.g., Higgs decay width Γ_H). Using architectures based on normalizing flows—invertible networks that map complex distributions to simple latent spaces—NSBI directly approximates the likelihood ratio
r(x; θ₀, θ₁) = p(x|θ₀) / p(x|θ₁),
where x denotes observed kinematic variables (four-lepton invariant mass, angular distributions, etc.). By learning r(x; θ₀, θ₁) across parameter values, one can perform likelihood-based inference on real data via techniques such as profile likelihood or Bayesian posterior sampling, fully incorporating interference effects.
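A common way to realize this in practice is the “likelihood-ratio trick”: train a classifier to distinguish events simulated at θ₀ from events simulated at θ₁, then convert its output into an estimate of r. The sketch below, written against TensorFlow’s Keras API with toy stand-in inputs, illustrates the trick; it is not the architecture used in the ATLAS papers:

```python
import numpy as np
import tensorflow as tf

# Stand-in simulated kinematic features (e.g., m_4l, decay angles);
# a real analysis would read these from Monte Carlo samples at θ0 and θ1.
x_theta0 = np.random.normal(0.0, 1.0, size=(50_000, 8)).astype("float32")
x_theta1 = np.random.normal(0.3, 1.1, size=(50_000, 8)).astype("float32")

x = np.concatenate([x_theta0, x_theta1])
y = np.concatenate([np.ones(len(x_theta0)), np.zeros(len(x_theta1))])

model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # s(x) in (0, 1)
])
model.compile(optimizer="adam", loss="binary_crossentropy")
model.fit(x, y, batch_size=512, epochs=10, verbose=0)

# The Bayes-optimal classifier gives s(x) = p(x|θ0) / (p(x|θ0) + p(x|θ1)),
# so the likelihood ratio is recovered as s / (1 - s).
s = model.predict(x_theta0, verbose=0).ravel()
r_hat = s / (1.0 - s)
```

Repeating this across a grid of θ values yields the parameterized ratio surface that downstream profile-likelihood or Bayesian fits consume.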
Technical Implementation and Computational Infrastructure
The NSBI workflow leverages state-of-the-art computing resources on the Worldwide LHC Computing Grid (WLCG) and CERN OpenLab partnerships. Key technical specifications include:
- Training clusters with NVIDIA A100 GPUs orchestrated by Kubernetes and Kubeflow Pipelines.
- Normalizing-flow models (e.g., Masked Autoregressive Flows) implemented in TensorFlow 2.12, optimized with automatic mixed precision (AMP); see the sketch below.
- Parameter scans over Γ_H ∈ [1, 10] MeV and anomalous coupling coefficients using distributed OpenMPI ensemble jobs.
- CI/CD via GitLab runners and automated validation suites to ensure statistical robustness (covering toy MC tests, coverage studies, and stress tests in rare corners of phase space).
During development, the team processed >10⁸ events in parallel, achieving end-to-end model training in under 48 hours, three times faster than earlier CPU-only implementations.
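For orientation, here is a minimal density-estimation sketch with a Masked Autoregressive Flow in TensorFlow Probability, assuming toy eight-dimensional inputs; it illustrates the flow family named above, not the collaboration’s production code:

```python
import tensorflow as tf
import tensorflow_probability as tfp

tfd, tfb = tfp.distributions, tfp.bijectors

DIM = 8  # number of kinematic features (illustrative)

# The autoregressive network outputs a shift and log-scale per feature.
made = tfb.AutoregressiveNetwork(params=2, hidden_units=[64, 64],
                                 event_shape=[DIM])
flow = tfd.TransformedDistribution(
    distribution=tfd.Sample(tfd.Normal(0.0, 1.0), sample_shape=[DIM]),
    bijector=tfb.MaskedAutoregressiveFlow(made),
)

x = tf.random.normal([4096, DIM])  # stand-in for simulated events
_ = flow.log_prob(x[:2])           # build the network's variables
optimizer = tf.keras.optimizers.Adam(1e-3)

for step in range(100):  # fit by maximizing the average log-likelihood
    with tf.GradientTape() as tape:
        loss = -tf.reduce_mean(flow.log_prob(x))
    grads = tape.gradient(loss, made.trainable_variables)
    optimizer.apply_gradients(zip(grads, made.trainable_variables))
```

Because the flow is invertible with a tractable Jacobian, the same trained object provides both exact log-densities and fast sampling, which is what makes it attractive for likelihood-ratio estimation.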
Reanalysis and Validation on Run 2 Data
In December 2025, ATLAS released two companion papers. The first detailed the NSBI methodology, providing theoretical proofs of asymptotic convergence and uncertainty calibration using the evidence lower bound (ELBO). The second paper presented a full reanalysis of the Run 2 four-lepton channel (H → ZZ* → 4ℓ), comparing NSBI against the legacy BDT‐based approach. Key results included:
- 30% reduction in the statistical uncertainty on Γ_H.
- 20% improvement in sensitivity to potential anomalous Higgs couplings.
- Robust performance in kinematic regions previously deemed infeasible due to destructive interference.
Internal review panels, comprising experts in detector calibration, statistical methods, and theoretical modeling, validated these findings, paving the way for NSBI’s inclusion in future ATLAS analysis frameworks.
Expert Insights
“The leap from classification to direct likelihood estimation is paradigm‐shifting,” said Daniel Whiteson (UC Irvine). “It’s as if you unlocked the full quantum tapestry of these events.”
“We always knew interference held hidden information,” noted David Rousseau (IJCLab). “NSBI finally lets us extract it cleanly.”
Future Implications for High-Luminosity LHC
Looking ahead to the HL-LHC upgrade (projected to start physics runs in 2028 with instantaneous luminosities up to 7.5×10³⁴ cm⁻² s⁻¹), NSBI will become central to ATLAS’s physics program, including:
- Precision measurements of rare processes, e.g., Higgs pair production (HH → 4b, 2b2γ).
- Searches for beyond-Standard-Model effects in vector-boson scattering and dark sector portals.
- Real-time inference workflows integrated into the trigger system using FPGA‐accelerated normalizing flows.
ATLAS has already begun upgrading its Tier-0 data processing farm to include GPU nodes, ensuring that NSBI can be deployed on both recorded and streamed data.
Comparison with Traditional Statistical Methods
While profile-likelihood scans and Bayesian MCMC have long been staples of LHC data analysis, they rely on handcrafted probability-density estimates or binned histograms that cannot capture interference at sub-bin resolution. NSBI’s differentiable likelihood estimators supplant kernel density estimates and Gaussian approximations, providing:
- Continuous, unbiased likelihood surfaces across high-dimensional phase spaces.
- Automatic uncertainty quantification via ensemble networks and calibration metrics such as probability-integral-transform (PIT) histograms; a sketch follows this list.
- Scalability to ten or more physics parameters, critical for EFT (Effective Field Theory) fits.
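As a concrete calibration diagnostic, the probability integral transform checks whether a model’s CDF maps held-out data onto a uniform distribution: flat PIT histograms indicate a well-calibrated density. In the minimal sketch below, a Gaussian CDF stands in for the trained flow, and all names and numbers are illustrative:

```python
import numpy as np
from scipy.stats import kstest, norm

rng = np.random.default_rng(0)
x_holdout = rng.normal(0.0, 1.0, size=10_000)  # held-out "events"

# PIT values: the model's CDF evaluated on held-out data. A Gaussian
# CDF stands in here for the learned flow's CDF.
u = norm.cdf(x_holdout, loc=0.0, scale=1.0)
counts, _ = np.histogram(u, bins=20, range=(0.0, 1.0))

# Flat bin counts and a large KS p-value indicate good calibration.
stat, pvalue = kstest(u, "uniform")
print(counts)
print(f"KS p-value = {pvalue:.3f}")
```

If the PIT histogram piles up at the edges or the center, the model is respectively under- or over-dispersed, and the ensemble or training recipe needs revisiting before the likelihood surface can be trusted.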
Conclusion
Aishik Ghosh’s six-year journey, supported by IJCLab, UC Irvine, and a global ATLAS team, has demonstrated how advanced AI techniques can resolve long-standing quantum challenges in collider physics. By integrating NSBI into its core analysis pipeline, ATLAS is poised to redefine precision in the HL-LHC era, unlocking new discoveries at the intersection of quantum mechanics, machine learning, and high-performance computing.