In 1935, Albert Einstein co-authored a paper designed to expose a fatal flaw in quantum mechanics. The flaw he identified, quantum entanglement, turned out to be real. Einstein spent his remaining two decades trying to explain it away, calling it “spukhafte Fernwirkung,” or spooky action at a distance[s]. He failed. In 2025, physicists at Caltech trapped 6,100 atoms in a laser grid and kept them in superposition for about 13 seconds, building hardware toward computers that could exploit the very phenomenon Einstein refused to accept[s].
What Quantum Entanglement Actually Is
Quantum entanglement occurs when two particles become linked so that measuring one instantly determines what you will find when measuring the other, regardless of the distance between them. If two electrons are entangled and you measure one as spinning “up,” you immediately know the other will measure as spinning “down.” This correlation persists whether the electrons are a millimeter apart or across the galaxy.
The key insight from the physics of subatomic particles is that before measurement, neither particle has a definite state. They exist in a superposition of possibilities. The measurement of one particle does not simply reveal a pre-existing property; it appears to determine the property of both particles simultaneously. This is what disturbed Einstein. How could information travel instantaneously between particles separated by vast distances when his own theory of relativity forbids anything from traveling faster than light?
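These predicted correlations follow directly from the two-particle formalism. A minimal NumPy sketch (the helper names `spin_up` and `p_both_up` are ours, purely for illustration) computes the probability that both observers find "up" when they measure a singlet pair along chosen axes:

```python
import numpy as np

def spin_up(theta):
    """The 'spin up' eigenstate for a measurement axis tilted by theta
    in the x-z plane: cos(theta/2)|0> + sin(theta/2)|1>."""
    return np.array([np.cos(theta / 2), np.sin(theta / 2)])

# Singlet state (|up,down> - |down,up>)/sqrt(2) as a 4-component vector
singlet = (np.kron([1, 0], [0, 1]) - np.kron([0, 1], [1, 0])) / np.sqrt(2)

def p_both_up(theta_a, theta_b):
    """Probability that BOTH sides measure 'up' along their chosen axes."""
    amp = np.kron(spin_up(theta_a), spin_up(theta_b)) @ singlet
    return amp ** 2

print(p_both_up(0.0, 0.0))      # 0.0  -- same axis: outcomes never agree
print(p_both_up(0.0, np.pi))    # ~0.5 -- opposite axes: outcomes always agree
```

The point of the sketch is the first line of output: along a common axis, "up-up" has probability exactly zero, which is the perfect anticorrelation described above.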
The EPR Thought Experiment
On May 15, 1935, the Physical Review published a paper by Einstein, Boris Podolsky, and Nathan Rosen entitled “Can Quantum Mechanical Description of Physical Reality Be Considered Complete?”[s] The paper proposed a thought experiment, now known as the EPR paradox, that sought to prove quantum mechanics was incomplete.
The EPR argument went like this: two particles with intertwined but indeterminate properties are separated, and then the property of one is fixed by measurement[s]. If measuring particle A instantly determines the property of distant particle B, one of two things must be true. Either the particles communicated faster than light, which violates relativity, or both particles always had definite properties and we simply did not know them until we measured. Einstein favored the second option. Quantum mechanics, he argued, was incomplete; a fuller theory would explain the correlations without abandoning locality. Entanglement, in this view, was not mysterious but a leftover trace of a shared origin, like two gloves shipped to different cities.

Einstein was dissatisfied with how the paper turned out. Upon seeing the published version, he complained that it obscured his central concerns: “Rather, the essential thing was, so to speak, smothered by formalism”[s].
Bell’s Theorem: The Test Einstein Never Saw
For nearly three decades after EPR, the debate remained philosophical. Then, in 1964, physicist John Bell proved that no theory of nature that obeys locality and realism can reproduce all the predictions of quantum theory[s].
Bell’s theorem belongs to a class of mathematical impossibility theorems that constrain what any physical theory can predict. If particles have predetermined states as Einstein believed, certain statistical correlations between measurements must satisfy mathematical limits called Bell inequalities. If quantum mechanics is correct, these inequalities can be violated[s].
Bell transformed a philosophical dispute into an experimental question. The universe itself would have to answer.
The Experiments That Proved Einstein Wrong
In 1982, Alain Aspect’s experiments in France provided strong evidence that quantum mechanics was correct and local hidden-variable explanations were ruled out[s]. But critics noted loopholes. Perhaps the detectors missed some particles. Perhaps there was time for signals to pass between measurement stations.
The definitive answer came in 2015, when researchers at Delft University of Technology performed the first loophole-free Bell test. They entangled electron spins in diamond crystals separated by 1.3 kilometers. Efficient spin read-out avoided the detection loophole, while fast random-basis selection and the spatial separation ensured locality conditions[s]. Across 245 trials, they found a Bell inequality violation of S = 2.42 ± 0.20, where classical physics predicts S ≤ 2. The probability that a local-realist model could produce such results was at most 3.9%[s].
The scientific consensus is now firm: quantum entanglement is real, and Bell-type nonlocality is a feature of our universe, though it does not permit faster-than-light signaling[s].
Why Entanglement Does Not Allow Faster-Than-Light Communication
The measurement outcomes are random. When you measure one entangled particle, you get a random result. Your partner measuring the distant particle also gets a random result. Only when the results are compared, at light speed or slower, does the correlation become apparent[s].
This is the no-communication theorem. Quantum entanglement cannot transmit information. The correlations are real, but they are useless for signaling until classical communication occurs. Relativity remains intact.
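The no-communication theorem can be illustrated numerically: whatever axis Bob chooses, Alice's local statistics are unchanged. A small NumPy sketch (helper names `basis` and `alice_marginal` are ours) sums Alice's "up" probability over Bob's two possible outcomes:

```python
import numpy as np

def basis(theta):
    """Orthonormal measurement basis (up, down) for an axis at angle theta."""
    up = np.array([np.cos(theta / 2), np.sin(theta / 2)])
    down = np.array([-np.sin(theta / 2), np.cos(theta / 2)])
    return up, down

# Singlet state (|up,down> - |down,up>)/sqrt(2)
singlet = (np.kron([1, 0], [0, 1]) - np.kron([0, 1], [1, 0])) / np.sqrt(2)

def alice_marginal(theta_a, theta_b):
    """P(Alice measures 'up'), summed over Bob's two outcomes,
    when Bob measures along theta_b."""
    alice_up = basis(theta_a)[0]
    return sum((np.kron(alice_up, bob) @ singlet) ** 2 for bob in basis(theta_b))

# Bob's choice of axis leaves Alice's statistics untouched -- no signal:
print(alice_marginal(0.3, 0.0))   # ~0.5
print(alice_marginal(0.3, 2.1))   # ~0.5 again
```

Alice sees a 50/50 coin flip no matter what Bob does; only the later, classical comparison of records reveals the correlation.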
From Paradox to Technology
What Einstein considered a flaw is now the foundation of quantum computing. In September 2025, Caltech physicists created the largest qubit array ever assembled: 6,100 neutral-atom qubits trapped in a grid by lasers. They achieved 99.98 percent single-qubit accuracy and maintained superposition for about 13 seconds, nearly 10 times longer than previous arrays[s].
In November 2025, IBM Quantum prepared the largest GHZ state reported to date: 120 superconducting qubits in a Greenberger-Horne-Zeilinger state, achieving a fidelity of 0.56 ± 0.03, surpassing the 0.5 threshold required to confirm genuine multipartite quantum entanglement across all qubits[s].
The same year, physicists at BESIII extended entanglement tests into high-energy physics, using 10 billion J/ψ events to test Bell inequalities with entangled hyperon pairs, achieving greater than 5.2σ violation of local hidden variable theory[s]. Unlike photons, massive particles have rarely been used in entanglement tests[s].
What Entanglement Means for Reality
The implications remain contested. Some physicists argue that quantum entanglement reveals something fundamental about the nature of reality itself. The nonlocal connection between measurement outcomes cannot be removed using hidden variables; this is the ultimate nonlocality of quantum systems[s].
Einstein was right about one thing: if you accept standard quantum mechanics, you accept action at a distance. What he got wrong was that this was a problem. The universe, it turns out, is stranger than he was willing to allow.
In May 1935, Einstein, Podolsky, and Rosen published “Can Quantum Mechanical Description of Physical Reality Be Considered Complete?” in Physical Review[s]. The EPR paper rested on a criterion of reality: if measuring one system allows certain prediction of a property of a distant system without disturbing it, that property corresponds to an element of reality. Applied to entangled states, this criterion generates a contradiction: either quantum mechanics is incomplete, or locality fails. Einstein believed the former; he dismissed the alternative as “spukhafte Fernwirkung,” spooky action at a distance[s].
The EPR State and Hidden Variables
The original EPR argument used position-momentum entanglement. Bohm later reformulated it using spin-½ particles in a singlet state. For the singlet |Ψ⁻⟩ = (1/√2)(|↑↓⟩ − |↓↑⟩), measuring spin along any axis on particle A yields the opposite result for particle B along the same axis. EPR’s criterion of reality implies both particles possess definite spin values prior to measurement, contradicting the quantum formalism where the state is a superposition until measured.
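The perfect anticorrelation along any common axis can be checked directly from the singlet vector. A short NumPy verification (the helper name `correlator` is ours) evaluates ⟨Ψ⁻| (σ·n ⊗ σ·n) |Ψ⁻⟩ for several unit vectors n:

```python
import numpy as np

# Pauli matrices
sx = np.array([[0, 1], [1, 0]])
sy = np.array([[0, -1j], [1j, 0]])
sz = np.array([[1, 0], [0, -1]])

# |Psi-> = (|up,down> - |down,up>)/sqrt(2)
singlet = np.array([0, 1, -1, 0]) / np.sqrt(2)

def correlator(n):
    """<Psi-| (sigma.n (x) sigma.n) |Psi-> for a unit vector n."""
    s = n[0] * sx + n[1] * sy + n[2] * sz
    return float(np.real(singlet.conj() @ np.kron(s, s) @ singlet))

# Anticorrelation holds along *any* common axis, not just z:
for n in ([0, 0, 1], [1, 0, 0], [0.6, 0, 0.8]):
    print(correlator(np.array(n)))   # ~ -1.0 each time
```

The rotational invariance of the singlet is exactly what makes EPR's criterion bite: the "element of reality" argument can be run for every axis at once.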
Hidden variable theories restore determinism: unmeasured parameters λ predetermine outcomes. The physics of subatomic particles, in this view, would be completed by specifying λ.
Bell Inequalities and Experimental Tests
In 1964, Bell proved that no theory of nature that obeys locality and realism can reproduce all the predictions of quantum theory[s]. For the CHSH formulation, any local hidden variable theory satisfies S ≤ 2, where S = |E(a,b) − E(a,b′) + E(a′,b) + E(a′,b′)|. Quantum mechanics predicts S_max = 2√2 ≈ 2.828 for optimal measurement angles.
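Both numbers can be reproduced from the singlet state. A NumPy sketch (the helper name `E` is ours) evaluates the quantum correlators at the optimal CHSH settings:

```python
import numpy as np

sx = np.array([[0, 1], [1, 0]])
sz = np.array([[1, 0], [0, -1]])
singlet = np.array([0, 1, -1, 0]) / np.sqrt(2)

def E(theta_a, theta_b):
    """Quantum correlator E(a,b) for singlet measurements along axes
    theta_a, theta_b in the x-z plane (equals -cos(theta_a - theta_b))."""
    A = np.cos(theta_a) * sz + np.sin(theta_a) * sx
    B = np.cos(theta_b) * sz + np.sin(theta_b) * sx
    return singlet @ np.kron(A, B) @ singlet

# Optimal settings: a = 0, a' = pi/2, b = pi/4, b' = 3*pi/4
a, ap, b, bp = 0.0, np.pi / 2, np.pi / 4, 3 * np.pi / 4
S = abs(E(a, b) - E(a, bp) + E(ap, b) + E(ap, bp))
print(S)   # ~2.828 = 2*sqrt(2), the quantum maximum
```

Each correlator equals −cos of the angle between the two settings, and at these angles the four terms add up to 2√2 rather than 2.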
Bell’s theorem belongs to a class of mathematical impossibility theorems that constrain what any physical theory can predict. If particles have predetermined states, measurement correlations must satisfy Bell inequalities. Violation implies that quantum entanglement correlations exceed any classical bound[s].
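A toy local hidden-variable model makes the classical bound concrete: let the hidden variable λ be a random direction, with each side deterministically outputting the sign of cos(setting − λ). This Monte Carlo sketch is our own textbook-style construction, not any specific historical proposal; even at the settings where quantum mechanics reaches 2√2, it cannot exceed 2:

```python
import numpy as np

rng = np.random.default_rng(42)

def lhv_E(a, b, n=200_000):
    """Correlator for a toy local hidden-variable model: the hidden
    variable lam is a random direction; each side deterministically
    outputs the sign of cos(setting - lam)."""
    lam = rng.uniform(0.0, 2.0 * np.pi, n)
    A = np.sign(np.cos(a - lam))
    B = -np.sign(np.cos(b - lam))
    return float(np.mean(A * B))

# Same settings that give quantum mechanics S = 2*sqrt(2):
a, ap, b, bp = 0.0, np.pi / 2, np.pi / 4, 3 * np.pi / 4
S = abs(lhv_E(a, b) - lhv_E(a, bp) + lhv_E(ap, b) + lhv_E(ap, bp))
print(S)   # ~2: predetermined outcomes stay at or below the Bell bound
```

This particular model sits exactly at S = 2 for these angles; Bell's theorem guarantees that no choice of λ-distribution or response functions can do better.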
Early tests (Freedman-Clauser 1972, Aspect 1982[s]) violated Bell inequalities but required assumptions. The detection loophole assumed fair sampling of detected photons. The locality loophole allowed subluminal communication between measurement stations.
Loophole-Free Tests
The 2015 Hensen et al. experiment at TU Delft closed all major loopholes simultaneously. They used NV centers in diamond as electron-spin qubits, entangled via single-photon interference, separated by 1.3 km. An event-ready scheme generated robust entanglement with estimated state fidelity 0.92 ± 0.03. Efficient spin read-out avoided fair-sampling assumptions, while fast random-basis selection using remote quantum random number generators ensured space-like separation of measurement choices[s].
Results: 245 trials yielded S = 2.42 ± 0.20. The probability that local-realist models could produce such violation, even allowing memory effects, was P ≤ 0.039[s].
Subsequent loophole-free tests confirmed the consensus: Bell-type nonlocality is a feature of nature, though superluminal signaling remains forbidden[s].
Quantum Entanglement in High-Energy Physics
Most Bell tests use photons. In 2025, BESIII collaboration tested Bell inequalities with entangled hyperon pairs (Λ-Λ̄) produced in J/ψ → γηc → Λ(pπ⁻)Λ̄(p̄π⁺) decays. Using 10.087 × 10⁹ J/ψ events, they achieved >5.2σ exclusion of local hidden variable theory[s].
Massive-particle entanglement tests are rare[s]. The BESIII result extends Bell violation to strange baryons decaying through weak interactions, probing entanglement in a qualitatively different regime.
Scaling Entanglement: 2025 Benchmarks
Quantum computing requires scaling entangled qubit counts while maintaining coherence. Two 2025 milestones:
Caltech (September 2025): 6,100 neutral-atom qubits (cesium) in optical-tweezer arrays. Coherence time ~13 seconds (10× improvement over previous arrays). Single-qubit gate fidelity 99.98%[s].
IBM Quantum (November 2025): 120-qubit GHZ state on superconducting processor. Fidelity 0.56 ± 0.03, exceeding the 0.5 threshold for genuine multipartite quantum entanglement certification. Techniques: adaptive compilation to avoid noisy regions, low-overhead parity checks, temporary uncomputation to reduce idle-time decoherence[s].
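IBM's experiment ran on superconducting hardware with error-mitigation techniques far beyond any classical toy, but the objects involved are easy to state. As a minimal illustration (our own 3-qubit statevector sketch, with hypothetical helper names), the following builds a GHZ state with a Hadamard and a CNOT chain and evaluates the fidelity that the 0.5 threshold refers to:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

def apply_1q(psi, gate, q, n):
    """Apply a single-qubit gate to qubit q of an n-qubit statevector."""
    psi = psi.reshape([2] * n)
    psi = np.moveaxis(np.tensordot(gate, psi, axes=([1], [q])), 0, q)
    return psi.reshape(-1)

def apply_cnot(psi, control, target, n):
    """Apply CNOT(control -> target) by permuting basis states.
    Qubit 0 is the most significant bit of the state index."""
    out = psi.copy()
    for i in range(2 ** n):
        if (i >> (n - 1 - control)) & 1:
            out[i] = psi[i ^ (1 << (n - 1 - target))]
    return out

# Build a 3-qubit GHZ state: H on qubit 0, then a CNOT chain
n = 3
psi = np.zeros(2 ** n)
psi[0] = 1.0
psi = apply_1q(psi, H, 0, n)
psi = apply_cnot(psi, 0, 1, n)
psi = apply_cnot(psi, 1, 2, n)

# Fidelity against the ideal GHZ state (|000> + |111>)/sqrt(2)
ghz = np.zeros(2 ** n)
ghz[0] = ghz[-1] = 1 / np.sqrt(2)
fidelity = (ghz @ psi) ** 2
print(fidelity)   # ~1.0 for this noiseless sketch; in the lab,
                  # F > 0.5 certifies genuine multipartite entanglement
```

The noiseless simulation gives fidelity 1; the experimental achievement is keeping F above 0.5 on 120 physical qubits despite decoherence.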
Interpretive Questions
Loophole-free Bell violations establish that nature exhibits nonlocal correlations. The interpretation remains contested. All single-world interpretations require action at a distance[s]. Many-worlds interpretations avoid action at a distance but introduce branching: measurement outcomes exist in all branches, with correlations emerging only upon comparison.
What remains unambiguous is the operational content: the nonlocal connection between measurement outcomes, shown to be unremovable using local hidden variables, is the ultimate nonlocality of quantum systems[s]. Questions about the nature of reality, or whether quantum mechanics itself approximates a deeper deterministic theory, remain open.
Einstein complained that in the EPR paper, “the essential thing was, so to speak, smothered by formalism”[s]. Ninety years later, the formalism won.