Experiment puts 80-year entanglement debate to rest

Monday, November 23, 2015

Researchers in Canada, the United States and Europe, led by the National Institute of Standards and Technology (NIST) in Boulder, Colorado, and Institute for Quantum Computing (IQC) alumnus Krister Shalm, have ruled out classical theories of correlation with remarkably high precision. A group including IQC members Evan Meyer-Scott, Yanbao Zhang, Thomas Jennewein, and alumnus Deny Hamel built and performed an experiment that shows the world is not governed by local realism.

Local realism states that the world is predictable and only influenced by its immediate surroundings. It assumes that all objects have a pre-existing value for any measurement and that any influence on the outcome of the measurement cannot travel faster than the speed of light. This is in contrast to the inherent probabilistic nature of quantum mechanics, which led Einstein and his colleagues Boris Podolsky and Nathan Rosen to claim that the wave function at the heart of quantum mechanics does not provide a complete description of physical reality.

In particular, because quantum entanglement appeared to allow for the measurement outcomes of two physically separated particles to be perfectly correlated, even though the individual outcomes are random, it seemed that the measurement of one particle instantaneously influences the outcome of the other. Einstein, Podolsky, and Rosen believed there must be a deeper local realistic theory to explain the correlations of entanglement.

Almost 30 years after this thought experiment, John Bell showed that there are strict limits to measurement correlations under local pre-existing conditions. Because of the seemingly non-local nature of quantum entanglement, entangled particles could show correlations beyond those limits. Bell’s work allowed scientists to experimentally test the hypothesis that nature is governed by local realism, by measuring entangled particles to find the strength of their correlations.
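Bell's limit is often expressed through the CHSH inequality: for any local realistic theory, a particular combination of correlations, S, can be at most 2, while quantum mechanics predicts up to 2√2 for entangled photons. The sketch below computes the quantum prediction at textbook-optimal analyzer angles; these settings illustrate the idea and are not necessarily the ones used in this experiment.

```python
import math

def E(a, b):
    """Quantum-predicted correlation between polarization analyzers at
    angles a and b (radians) for a maximally entangled photon pair."""
    return math.cos(2 * (a - b))

# Textbook-optimal CHSH settings (illustrative, not the experiment's values)
a1, a2 = 0.0, math.pi / 4              # Alice's two analyzer angles
b1, b2 = math.pi / 8, 3 * math.pi / 8  # Bob's two analyzer angles

# CHSH combination: local realism requires |S| <= 2
S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(S)  # ~2.828, i.e. 2*sqrt(2), beyond the local realistic limit of 2
```

Measuring S significantly above 2, with all loopholes closed, is exactly the kind of result that rules out local realism.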

“We did a really careful, definitive test to show that local realism is unlikely to be true,” said Meyer-Scott. “It’s a nice idea to be able to predict what’s going to happen and that things only happen where they’re supposed to, and in most cases that’s still true – but at a fundamental level it’s an untenable view to hold now.”

Researchers have been performing increasingly conclusive Bell tests since the 1970s. Due to technological limitations, however, they had to make additional assumptions to show that local realism was incompatible with their experimental results, which opened three possible loopholes: locality, freedom-of-choice and fair-sampling.

Jennewein has been involved in experiments addressing some of these loopholes over the past twenty years, but with new detectors built by NIST and a new high-performance photon source, researchers now have the technology needed to perform a Bell test that closes all three loopholes simultaneously. “I am convinced that 2015 will be remembered as the year of the loophole-free Bell test,” Jennewein said. “I am glad our new experiment finally provides some closure to this question.”

In the experiment, entangled photon pairs were created so that the photons’ polarizations were highly correlated. After separating the pairs, the photons travelled through fibre optic cables to detectors, Alice and Bob, that were in different, widely separated rooms in a NIST laboratory.

A diagram of the loophole experiment setup

This long distance, along with fast measurement decisions, was important in closing the no-communication, or locality, loophole. Alice completes her measurement quickly enough that no information about Bob’s measurement setting and outcome, travelling at the speed of light, could possibly have reached her, and vice versa.
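The locality condition boils down to a comparison of two times: the measurement must finish before light could cross the separation between the stations. A minimal sketch, using hypothetical numbers rather than the experiment's actual separation and timing:

```python
# Speed of light in vacuum, m/s
c = 299_792_458.0

# Hypothetical values, chosen only to illustrate the condition
separation_m = 130.0          # assumed distance between Alice and Bob
measurement_time_s = 200e-9   # assumed time to pick a setting and record a click

# Time light would need to carry information between the stations
light_travel_time_s = separation_m / c   # ~434 ns for 130 m

# The locality loophole is closed only if the measurement is strictly faster
locality_closed = measurement_time_s < light_travel_time_s
print(locality_closed)
```

Increasing the separation or speeding up the setting choice and detection both widen this margin, which is why fast switching and widely separated rooms were central to the design.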

A random number generator picked one of two settings for each polarization analyzer. The random number generators were placed so that they could not influence or be influenced by the photon source to ensure randomness and independence. This closed the freedom-of-choice loophole.

The fair-sampling loophole arises if not all photons are detected but one assumes that the detected photons are a fair sample of the whole. It can be closed by detecting a large enough fraction of the created photons. In this experiment, the 90% efficiency of NIST’s single-photon detectors and rigorous optimization of all optical components made it possible to close the loophole by detecting 75% of the emitted photons.
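The efficiency budget behind those figures is simple arithmetic. The sketch below infers the optical transmission implied by the two reported numbers and checks the total against the roughly two-thirds efficiency threshold from Eberhard's analysis for non-maximally entangled states; the transmission value is a deduction, not a reported figure.

```python
# Reported figures from the experiment
detector_efficiency = 0.90   # NIST single-photon detector efficiency
overall_efficiency = 0.75    # fraction of emitted photons actually detected

# Optical transmission implied by the two reported figures (inferred)
optical_transmission = overall_efficiency / detector_efficiency
print(round(optical_transmission, 3))  # ~0.833

# Eberhard's analysis: with non-maximally entangled states, a loophole-free
# violation requires overall efficiency above roughly 2/3.
eberhard_threshold = 2 / 3
print(overall_efficiency > eberhard_threshold)  # True: fair sampling closed
```

This shows why both pieces mattered: even 90% detectors fail to clear the threshold if optical losses eat too much of the remaining margin.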

The paper, A strong loophole-free test of local realism, has been published on the arXiv. Similar results were achieved by a team at the University of Vienna in Austria, and the two papers have jointly been submitted to Physical Review Letters. These photonic experiments complement an earlier closure of the loopholes using entangled electron spins in diamond by a group at the Delft University of Technology. The results could have great significance for device-independent quantum communication: if Alice and Bob violate a Bell inequality with all the loopholes closed each time they communicate, the system is completely secure – hackers cannot get the information no matter how the communication devices actually work.

Meyer-Scott in the lab building the measurement setup
In the present experiment, Meyer-Scott, a PhD student in the Department of Physics and Astronomy at the University of Waterloo, constructed the measurement setup and helped build the photon source. Zhang, a postdoctoral fellow at IQC, analyzed the experimental data, ensuring the loopholes were closed. Jennewein, a faculty member in the Department of Physics and Astronomy, provided guidance on the fibre channels and fast switching. Additionally, the experiment used time taggers from Jennewein’s startup, Universal Quantum Devices.