Suppose someone asked you to devise the most powerful computer possible. Alan Turing, whose reputation as a central figure in computer science and artificial intelligence has only grown since his untimely death in 1954, applied his genius to problems such as this one in an age before computers as we know them existed. His theoretical work on this problem and others remains a foundation of computing, AI and modern cryptographic standards, including those NIST recommends.
The road from devising the most powerful computer possible to cryptographic standards has a few twists and turns, as does Turing’s brief life.
In Turing’s time, mathematicians debated whether it was possible to build a single, all-purpose machine that could solve all problems that are computable. For example, we can compute a car’s most energy-efficient route to a destination, and (in principle) the most likely way in which a string of amino acids will fold into a three-dimensional protein. Another example of a computable problem, important to modern encryption, is whether a given number can be expressed as the product of two smaller numbers. For example, 6 can be expressed as the product of 2 and 3, but 7 cannot be factored into smaller integers and is therefore a prime number.
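For small numbers this computation is easy to carry out. As an illustration (a sketch, not how cryptographic software actually factors numbers), here is a short Python function that checks whether a number can be written as a product of two smaller numbers:

```python
def smaller_factor_pair(n):
    """Return (a, b) with a * b == n and 1 < a <= b < n, or None if no
    such pair exists (i.e., n is prime). Only divisors up to sqrt(n)
    need to be tried, since any factor pair has one member below it."""
    for a in range(2, int(n ** 0.5) + 1):
        if n % a == 0:
            return a, n // a
    return None

smaller_factor_pair(6)  # -> (2, 3)
smaller_factor_pair(7)  # -> None, so 7 is prime
```

The function name and structure here are illustrative; the point is simply that the question is computable, since a machine can test every candidate divisor in turn.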
Some prominent mathematicians proposed elaborate designs for universal computers that would operate by following very complicated mathematical rules. It seemed overwhelmingly difficult to build such machines. It took the genius of Turing to show that a very simple machine could in fact compute all that is computable.
His hypothetical device is now known as a “Turing machine.” The centerpiece of the machine is a strip of tape, divided into individual boxes. Each box contains a symbol (such as A, C, T, G for the letters of genetic code) or a blank space. The strip of tape is analogous to today’s hard drives that store bits of data. Initially, the string of symbols on the tape corresponds to the input, containing the data for the problem to be solved. The string also serves as the memory of the computer. The Turing machine writes onto the tape data that it needs to access later in the computation.
The device reads an individual symbol on the tape and follows instructions on whether to change the symbol or leave it alone before moving to another symbol. The instructions depend on the current “state” of the machine. For example, if the machine needs to decide whether the tape contains the text string “TC,” it can scan the tape in the forward direction while switching between the states “previous letter was T” and “previous letter was not T.” If while in state “previous letter was T” it reads a “C,” it goes to a state “found it” and halts. If it encounters the blank symbol at the end of the input, it goes to the state “did not find it” and halts. Nowadays we would recognize the set of instructions as the machine’s program.
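To make this concrete, here is a small Python sketch (an illustration, not Turing’s original formalism) that simulates such a machine scanning a tape for the string “TC”:

```python
def run_tm(tape):
    """Simulate a tiny Turing machine that searches the tape for 'TC'.
    Working states: 'start' (previous letter was not T) and 'seen_T'
    (previous letter was T); halting states: 'found it' and
    'did not find it'."""
    tape = list(tape) + ['_']          # '_' is the blank marking end of input
    state, head = 'start', 0
    while state not in ('found it', 'did not find it'):
        symbol = tape[head]
        if symbol == '_':              # ran off the end of the input
            state = 'did not find it'
        elif state == 'start':
            state = 'seen_T' if symbol == 'T' else 'start'
            head += 1                  # move forward along the tape
        elif state == 'seen_T':
            if symbol == 'C':          # previous was T, current is C: match
                state = 'found it'
            else:
                state = 'seen_T' if symbol == 'T' else 'start'
                head += 1
    return state

run_tm("GATTCA")  # -> 'found it'
run_tm("GGGA")    # -> 'did not find it'
```

The state names and the `run_tm` helper are made up for this example, but the ingredients are exactly those of the text: a tape, a head that reads one symbol at a time, and a small table of state-dependent instructions.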
It took some time, but eventually it became clear to everyone that Turing was right: The Turing machine could indeed compute all that seemed computable. No number of additions or extensions to this machine could extend its computing capability.
To understand what can be computed, it is helpful to identify what cannot be computed. In a previous life as a university professor I had to teach programming a few times. Students often encounter the following problem: “My program has been running for a long time; is it stuck?” This question, whether a given program will ever halt, is known as the Halting Problem, and students often wondered why we couldn’t simply detect infinite loops without actually getting stuck in them. It turns out that such a program is impossible to write. Turing showed that there does not exist a machine that detects whether or not another machine halts. From this seminal result followed many other impossibility results. For example, logicians and philosophers had to abandon the dream of an automated way of deciding whether a mathematical assertion (such as “there are infinitely many prime numbers”) is true or false, as that is uncomputable. If you could do this, then you could solve the Halting Problem simply by asking whether the statement “this machine halts” is true or false.
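Turing’s argument can be sketched in a few lines of Python. Suppose, for contradiction, that a halting detector existed; then we could build a program that does the opposite of whatever the detector predicts about it. (The names `halts`, `paradox` and `g` below are illustrative, not real library functions.)

```python
def paradox(halts):
    """Given a claimed halting detector `halts(program) -> bool`,
    construct a program g that the detector must misjudge."""
    def g():
        if halts(g):
            while True:      # detector said "g halts" -> loop forever
                pass
        # detector said "g loops forever" -> halt immediately
    return g

# Whatever answer a concrete detector gives about g, it is wrong.
# Here the detector claims g runs forever...
g = paradox(lambda program: False)
g()  # ...but g returns immediately, i.e., it halts.
```

If the detector had instead answered “halts,” `g` would loop forever, so the detector is wrong either way. No possible implementation of `halts` escapes this trap, which is the heart of Turing’s proof.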
Turing went on to make fundamental contributions to AI, theoretical biology and cryptography. His involvement with this last subject brought him honor and fame during World War II, when he played a very important role in adapting and extending cryptanalytic techniques invented by Polish mathematicians. This work broke the German Enigma machine encryption, making a significant contribution to the war effort.
Turing was gay. After the war, in 1952, the British government convicted him for having sex with a man. He stayed out of jail only by submitting to what is now called “chemical castration.” He died in 1954 at age 41 by cyanide poisoning, which was initially ruled a suicide but may have been an accident according to subsequent analysis. More than 50 years would pass before the British government apologized and “pardoned” him (after years of campaigning by scientists around the world). Today, the highest honor in computer science is called the Turing Award.
Turing’s computability work provided the foundation for modern complexity theory. This theory tries to answer the question “Among those problems that can be solved by a computer, which ones can be solved efficiently?” Here, “efficiently” means not in billions of years but in milliseconds, seconds, hours or days, depending on the computational problem.
For example, much of the cryptography that currently safeguards our data and communications relies on the belief that certain problems, such as decomposing an integer into its prime factors, cannot be solved before the Sun turns into a red giant and consumes the Earth (currently forecast for 4 billion to 5 billion years). NIST is responsible for cryptographic standards that are used throughout the world. We could not do this work without complexity theory.
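To get a feel for why factoring is believed to be hard, here is an illustrative Python sketch of the most naive method, trial division. The number of divisions grows roughly with the size of the smallest prime factor, which becomes astronomical for the hundreds-of-digits numbers used in practice. (Real cryptanalysis uses far more sophisticated algorithms, but those too are believed to take infeasibly long on such numbers.)

```python
def trial_division_steps(n):
    """Factor n by trial division.
    Return (smallest factor found, number of divisions attempted)."""
    steps, d = 0, 2
    while d * d <= n:
        steps += 1
        if n % d == 0:
            return d, steps
        d += 1
    return n, steps          # no divisor found: n itself is prime

# For a product of two similar-sized primes, the work tracks the
# smaller prime, i.e., roughly the square root of n:
trial_division_steps(101 * 103)    # -> (101, 100)
trial_division_steps(1009 * 1013)  # -> (1009, 1008)
```

Each extra digit in the primes multiplies the work by about 10, so a 300-digit product of two 150-digit primes is hopelessly out of reach for this method; the belief underlying current cryptography is that every method hits a similar wall.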
Technology sometimes throws us a curve, such as the discovery that if a sufficiently big and reliable quantum computer is built it would be able to factor integers, thus breaking some of our cryptography. In this situation, NIST scientists must rely on the world’s experts (many of them in-house) in order to update our standards. There are deep reasons to believe that quantum computers will not be able to break the cryptography that NIST is about to roll out. Among these reasons is that Turing’s machine can simulate quantum computers. This implies that complexity theory gives us limits on what a powerful quantum computer can do.
But that is a topic for another day. For now, we can celebrate how Turing provided the keys to much of today’s computing technology and even gave us hints on how to solve looming technological problems.
I am eternally grateful to Alan Turing. I grew up next door to Arthur C. Clarke (https://en.wikipedia.org/wiki/Arthur_C._Clarke) in Sri Lanka and I learned about Alan Turing at an early age thanks to Mr. Clarke.
Small correction to this one - "His involvement with this last subject brought him honor and fame during World War II, when he played a very important role in adapting and extending cryptanalytic techniques invented by Polish mathematicians. This work broke the German Enigma machine encryption, making a significant contribution to the war effort."
Polish mathematicians broke the Enigma machine. They also developed the “bomba” machine. The work was then taken over by Turing. Turing’s contribution was significant, as he extended the bomba concept and kept pace with further developments of the Enigma itself.
The "Polish mathematicians" mentioned in the article had their names: Marian Rejewski, Jerzy Różycki and Henryk Zygalski.
These guys actually broke Enigma.
The contribution of Turing was "industrializing" the technique, as we would say today.
Wasn't "an automated way of detecting whether an assertion ... is true or false" already proved impossible by Kurt Gödel's incompleteness theorem in 1931 or was Turing's result somehow distinct from that finding?
Mr Touring's deduction that the way to solve mathematical problems accurately and consistently was to use a machine became the foundation of computer science. He also worked with Tommy Flowers (Post Office engineer) to create the Colossus I and Colossus II. The latter was the first digital computer and helped break the encrypted messages of the German Geheimschreiber that used many rotors instead of four. We might also credit Ada Lovelace who in the 1800s came up with the concept of binary (digital) computation.