Truly Random Numbers — But Not by Chance


Schematic diagram of loophole-free Bell test, with time on the vertical axis and space on the horizontal axis. Two entangled photons are generated by a source. Each proceeds on a separate path to different detector destinations (Alice and Bob respectively) placed several hundred meters apart. At some time between each photon's generation and its arrival at the detector, a random choice of measurement basis (e.g., polarization) is made. That choice affects the result when each photon hits its detector. The blue lines represent light cones: the space-time volume within which one object can interact with another. The position of the lines indicates that it is impossible for information about the basis choice made on Alice's photon — even if it is traveling at the speed of light — to get to Bob's photon before it is detected, and vice versa. This arrangement closes the "locality" or "light-cone" loophole.

Life can seem haphazard and chaotic, but true randomness is fundamentally mysterious, elusive, and remarkably difficult to observe. If it can be realized and put to use, it offers enormous benefits to a digital society in which strings of nominally random numbers are used hundreds of billions of times every day to encrypt information in virtually every secure network transaction.

Today's encryption schemes use random-number generators, typically software algorithms or physical devices, to produce strings of bits that can pass many statistical tests for randomness. But none of those sequences is truly random: No device that relies on classical physical principles or operations can certify that its output is absolutely unpredictable. That is a troubling vulnerability in a nation in which an estimated $50 billion or more is lost every year to identity theft alone, and a number of severe security breaches have resulted from random-number generation schemes that have been cracked by outsiders.
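
To see why passing statistical tests cannot certify unpredictability, consider a minimal Python sketch (illustrative only, not any particular deployed system): a seeded software generator produces bits that may look random, yet anyone who learns the seed or internal state can reproduce the entire sequence.

```python
import random

# A deterministic software generator: every "random" bit is completely
# determined by the seed (the generator's internal state).
def pseudorandom_bits(seed, n_bits=32):
    rng = random.Random(seed)
    return [rng.randint(0, 1) for _ in range(n_bits)]

alice = pseudorandom_bits(seed=20121001)
eavesdropper = pseudorandom_bits(seed=20121001)

# The sequences match bit for bit: whoever knows (or guesses) the seed
# can predict every bit in advance, however "random" the output looks.
print(alice == eavesdropper)  # True
```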

To address that weakness, collaborators from PML and NIST's Information Technology Laboratory (ITL) have embarked on a daunting project with dual goals. One is to generate a sequence of truly random numbers that is guaranteed by the laws of physics to be unknowable in advance of its generation and uncorrelated with anything else in the universe.

"That's simply not possible in any classical physical context," says Joshua Bienfang of PML's Quantum Measurement Division. "But quantum mechanics has the potential to allow this, and within five years we intend to devise and demonstrate a practical random-number generator based on verifiably random quantum events." Doing so will require extensive work from ITL team members in quantum state characterization, quantum information theory, statistics, security analysis, cryptographic protocols, and more.

The other goal is a major modification of ITL's prototype random-number "beacon" to provide a continuous stream of the quantum-generated random numbers, freely available to anyone over the World Wide Web. In the process, the ITL members will study new applications for the beacon. If successful, the work could lead to unprecedented levels of network security for confidential digital applications, from multiparty contract bidding to tamper-proof voting, and could establish a trusted common standard.

"One way to think about it," says Yi-Kai Liu of ITL, "is that the beacon serves the same role as a referee in a game. You and another person are trying to exchange some information or conduct some sort of transaction, and you need a third party to do something for you that both of you can trust. That way, you don't have to trust one another or reveal more information than you want to."

The randomness project was one of three proposals selected in August 2012 for NIST's highly competitive Innovations in Measurement Science Awards. The program provides multiyear funding to explore high-risk, leading-edge research concepts that anticipate future measurement and standards needs of industry and science. Rene Peralta and two colleagues from ITL will handle the design of the new random number beacon and its applications. Bienfang and Alan Migdall of PML's Quantum Measurement Division, Sae Woo Nam of the Quantum Electronics and Photonics Division, Xiao Tang of ITL's Quantum Communications Group, and four theorists from ITL will undertake the physics challenges.

They are formidable. The effort relies on the fact that quantum measurements have inherently random components. Prior to observation or measurement, a particle exists in a "superposition" of possible states. The act of measurement forces it to take on specific properties, the recorded values of which are purely probabilistic. Such measurements can therefore, in theory, be a completely trustworthy source of random numbers, provided it can be proved that the phenomenon generating the numbers is entirely quantum mechanical, with no possible classical influences.
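
As a rough illustration of why measurement outcomes are probabilistic, the sketch below applies the Born rule to an equal superposition of two states: each outcome's probability is the squared magnitude of its amplitude, here one half each. (This is a classical simulation for illustration only; by definition it cannot itself produce certified quantum randomness.)

```python
import numpy as np

# A qubit in the equal superposition (|0> + |1>) / sqrt(2).
amplitudes = np.array([1, 1]) / np.sqrt(2)

# Born rule: the probability of each outcome is |amplitude|^2.
probabilities = np.abs(amplitudes) ** 2   # -> [0.5, 0.5]

# Classically *simulate* repeated measurements. A real quantum
# measurement would produce these outcomes irreducibly at random.
outcomes = np.random.choice([0, 1], size=10, p=probabilities)
print(probabilities, outcomes)
```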

There is a way to confirm that condition. It is a test devised in 1964 by Northern Irish physicist John Bell in response to Albert Einstein's rejection of the quantum-mechanical explanation of a condition called entanglement, whereby two particles can become intrinsically correlated no matter how far separated they become in space. Bell showed that if there were any pre-existing or hidden classical connections between the particles, correlated measurements taken on both particles – after each passes through a separate device that randomly determines the basis of measurement – must lie within a certain range of values. Quantum mechanics predicts correlation values outside of those bounds, and their appearance is evidence of indisputably and exclusively quantum-mechanical phenomena. Such out-of-bounds correlations have been consistently observed in labs around the globe for more than three decades.
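
The article does not specify which inequality the team will use; the CHSH form, the variant most commonly used in photonic experiments, gives the flavor of the bound:

```latex
% CHSH form of the Bell inequality. E(a,b) is the correlation of outcomes
% when Alice measures in basis a and Bob measures in basis b.
\[
  S = E(a,b) - E(a,b') + E(a',b) + E(a',b')
\]
\[
  |S| \le 2 \quad \text{(any local hidden-variable model)}, \qquad
  |S| \le 2\sqrt{2} \approx 2.83 \quad \text{(quantum mechanics)}
\]
```

Measured values of S above 2 are the signature that no classical, pre-existing explanation can account for the correlations.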

However, such Bell tests – as they have been implemented so far – have "loopholes" that might allow the numbers to result from hidden variables or other classical influences. So the PML team will have to conduct demonstrably loophole-free Bell tests, something that has never been done.

"We will conduct the tests on photonic states," Bienfang says. "NIST is already a world leader in producing and manipulating pairs of entangled photons with quantum optics techniques such as parametric down-conversion or four-wave mixing. There are significant engineering challenges associated with closing the loopholes, and we believe that with the proper choice of source we can do it." Among other things, that means that the photon detectors be sited so far apart (several hundred meters) that the time lapse between the measurement basis choice and the photon's detection is shorter than the time it would take for information to travel from one particle to another at the speed of light. This closes the "light cone" loophole.

At the same time, the scientists must close the "detection" loophole. An ideal Bell test detects 100 percent of the photons so that the statistics are not skewed. "Fortunately," says Migdall, "we're starting out with an advantage. We already have superconducting nanowire detectors made by Sae Woo Nam's team at NIST/Boulder that have demonstrated efficiencies above 92 percent as well as inherently high speed. That design, which is being improved, is likely to fully satisfy the needs of our project. But we've got to get the total efficiency of the whole system, including all the components, to somewhere around 85% to make this work."
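
The quoted numbers combine multiplicatively. A hedged sketch (the detector efficiency and the roughly 85 percent system target come from the article; the other factors are made-up placeholders) shows how quickly component losses eat into the budget:

```python
# Hypothetical loss budget. Only the detector efficiency (>92%) and the
# ~85% system target are from the article; the rest are illustrative.
detector_efficiency = 0.92
fiber_coupling = 0.95        # assumed placeholder
optical_transmission = 0.97  # assumed placeholder

system_efficiency = detector_efficiency * fiber_coupling * optical_transmission
print(f"{system_efficiency:.3f}")  # ~0.848, near the ~85% the team needs
```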

Bienfang is optimistic. "We're bringing together world-class expertise in the metrology of optical loss, device characterization, and detection efficiency measurements. We also make use of expertise in quantum information theory and the measurement of quantum states."

If all goes well, the researchers' initial target is to publish a fresh 512-bit string on the beacon every minute, with plans for faster generation in the future. As with the current prototype beacon signal, all output would be time-stamped by the NIST Time Service. The beacon would also have direct importance to NIST's National Strategy for Trusted Identities in Cyberspace (NSTIC) program. That program's goal is to improve upon the current on-line password system so that "all participating service providers will have agreed to consistent standards for identification, authentication, security, and privacy."

ITL theorist Stephen Jordan says that "there have been a lot of papers written since the 1980s about all sorts of protocols that you could do if you had some sort of shared, trusted beacon of random numbers. You could conduct secure auctions, or certify randomized audits of data. One of the most intriguing benefits is that a trusted source would allow for selective disclosure of information. Suppose, for example, that you have some kind of ID card like a driver's license, and you want to reveal only that you're over 21, and not other stuff like your name and birthdate. Or you want to allow someone to have access to only a part of your medical records, but not the whole thing. That would become possible. Moreover, there is a great deal of interest in 'device independence' for secure communications. In an ideal arrangement, people should be able to check the source of the random numbers, rather than own it and secure it themselves."
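
One way such a beacon could support, say, a randomized audit is sketched below. This is a hypothetical illustration, not NIST's beacon protocol: two parties who both see the same published random value can derive the same audit sample deterministically, so neither can bias or predict the selection.

```python
import hashlib
import random

# Hypothetical published beacon output for some time slot
# (placeholder string, not a real beacon record).
beacon_value = "9f2c4a-placeholder"

records = [f"transaction-{i}" for i in range(1000)]

# Both parties hash the public beacon value into a seed and draw the
# same sample; neither party controls the outcome.
seed = int.from_bytes(hashlib.sha256(beacon_value.encode()).digest(), "big")
sample = random.Random(seed).sample(records, k=10)
print(sample)
```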

In addition, pursuing a loophole-free Bell test and determining how to put its output to use raise difficult questions in science and philosophy. "Randomness is a profoundly fundamental thing, and truly understanding it is as deep as any question in physics," says Migdall. "It's mysterious, and it's incredibly important."

Released October 1, 2012, Updated March 26, 2019