"The drb sustains a level of commentary on Irish and international matters that no other journal in Ireland and few elsewhere can reach. It deserves all the support that can be given it." X
Space to Think, a new book celebrating ten years of the Dublin Review of Books More Information 

Alan Turing: The Enigma

The Book That Inspired the Film The Imitation Game
Andrew Hodges
Publisher: Vintage
Price: £8.99
ISBN: 9781784700089




Preface

On 25 May 2011, the President of the United States, Barack Obama, speaking to the parliament of the United Kingdom, singled out Newton, Darwin and Alan Turing as British contributors to science. Celebrity is an imperfect measure of significance, and politicians do not confer scientific status, but Obama's choice signalled that public recognition of Alan Turing had attained a level very much higher than in 1983, when this book first appeared.

Born in London on 23 June 1912, Alan Turing might just have lived to hear these words, had he not taken his own life on 7 June 1954. He perished in a very different world, and his name had gone unmentioned in its legislative forums. Yet even then, in its secret circles, over which Eisenhower and Churchill still reigned, and in which the names of NSA and GCHQ were spoken in whispers, Alan Turing had a unique place. He had been the chief backroom boy when American power overtook British in 1942, with a scientific role whose climax came on 6 June 1944, just ten years before that early death.

Alan Turing played a central part in world history. Yet it would be misleading to portray his drama as a power play, or as framed by the conventional political issues of the twentieth century. He was not political as defined by contemporary intellectuals, revolving as they did round alignment or non-alignment with the Communist party. Some of his friends and colleagues were indeed party members, but that was not his issue. (Incidentally, it is equally hard to find money-motivated 'free enterprise', idolised since the 1980s, playing any role in his story.) Rather, it was his individual freedom of mind, including his sexuality, that mattered - a question taken much more seriously in the post-1968, and even more in the post-1989, era. But beyond this, the global impact of pure science rises above all national boundaries, and the sheer timelessness of pure mathematics transcends the limitations of his twentieth-century span. When Turing returned to the prime numbers in 1950 they were unchanged from when he left them in 1939, wars and superpowers notwithstanding. As G. H. Hardy famously said, they are so. Such is mathematical culture, and such was his life, presenting a real difficulty to minds set in literary, artistic or political templates.

Yet it is not easy to separate transcendence from emergency: it is striking how leading scientific intellects were recruited to meet the existential threat Britain faced in 1939. The struggle with Nazi Germany called not just for scientific knowledge but for the cutting edge of abstract thought, and so Turing's quiet logical preparations in 1936-38 for the war of codes and ciphers made him the most effective anti-Fascist amongst his many anti-Fascist contemporaries. The historical parallel with physics, with Turing as a figure roughly analogous to Robert Oppenheimer, is a close one. This legacy of 1939 is still unresolved, in the way that secret state purposes are seamlessly woven into intellectual and scientific establishments today, a fact that is seldom remarked upon.

The same timelessness lies behind the central element of Alan Turing's story: the universal machine of 1936, which became the general-purpose digital computer in 1945. The universal machine is the focal, revolutionary idea of Turing's life, but it did not stand alone; it flowed from his having given a new and precise formulation of the old concept of algorithm, or mechanical process. He could then say with confidence that all algorithms, all possible mechanical processes, could be implemented on a universal machine. His formulation became known immediately as 'the Turing machine' but now it is impossible not to see Turing machines as computer programs, or software.
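The point can be made concrete in a few lines of present-day code. The sketch below is purely illustrative and is not taken from the book: it treats a Turing machine's transition table as ordinary data handed to one general-purpose interpreter, which is the kernel of the universal-machine idea and of what we now call software. The function name run and the table flip_bits are invented for the example.

# Illustrative sketch only: a transition table ('the program') is plain data,
# executed by a single general-purpose interpreter that knows nothing about
# the particular task. This toy program flips every bit on the tape, then halts.

def run(table, tape, state="start", blank="_", max_steps=1000):
    cells = dict(enumerate(tape))        # the tape, sparsely stored
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        write, move, state = table[(state, symbol)]   # look up the instruction
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells))

flip_bits = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run(flip_bits, "10110"))           # -> 01001_

Handing the same run function a different table changes what the machine does without changing the interpreter at all, which is the sense in which one machine can perform every mechanical process.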

Nowadays it is perhaps taken rather for granted that computers can replace other machines, whether for record-keeping, photography, graphic design, printing, mail, telephony, or music, by virtue of appropriate software being written and executed. No one seems surprised that industrialised China can use just the same computers as does America. Yet that such universality is possible is far from obvious, and it was obvious to no one in the 1930s. That the technology is digital is not enough: to be all-purpose computers must allow for the storage and decoding of a program. That needs a certain irreducible degree of logical complexity, which can only be made to be of practical value if implemented in very fast and reliable electronics. That logic, first worked out by Alan Turing in 1936, implemented electronically in the 1940s, and nowadays embodied in microchips, is the mathematical idea of the universal machine.
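Again as an invented illustration rather than anything from the text, that irreducible degree of logical complexity can be sketched as a toy stored-program machine: instructions and data sit in the same memory, and one fetch-decode-execute loop runs whatever program happens to be loaded. The names execute and program_and_data are made up for the sketch.

# Illustrative toy only: a stored-program machine in miniature. Instructions
# and data share a single memory list; the processor merely fetches and
# decodes what is there, so changing the software changes the machine's purpose.

def execute(memory):
    pc = 0                               # program counter
    while True:
        op, a, b = memory[pc]            # fetch and decode one instruction
        pc += 1
        if op == "HALT":
            return memory
        if op == "ADD":                  # memory[a] <- memory[a] + memory[b]
            memory[a] += memory[b]

program_and_data = [
    ("ADD", 4, 5),                       # the program: add the two data cells
    ("HALT", 0, 0),
    0, 0,                                # spare cells
    2,                                   # data: first operand (receives the result)
    3,                                   # data: second operand
]
print(execute(program_and_data)[4])      # -> 5

Everything beyond this (speed, reliability, scale) came from fast electronics and later from microchips; the decoding loop itself is the logical core described above.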

In the 1930s only a very small club of mathematical logicians could appreciate Turing's ideas. But amongst these, only Turing himself had the practical urge as well, and was capable of turning his hand from the 1936 purity of definition to the software engineering of 1946: 'every known process has got to be translated into instruction table form...' (p. 409). Donald Davies, one of Turing's 1946 colleagues, later developed such instruction tables (as Turing called programs) for 'packet switching', and these grew into the Internet protocols. Giants of the computer industry did not see the Internet coming, but they were saved by Turing's universality: the computers of the 1980s did not need to be reinvented to handle these new tasks. They needed new software and peripheral devices, they needed greater speed and storage, but the fundamental principle remained. That principle might be described as the law of information technology: all mechanical processes, however ridiculous, evil, petty, wasteful or pointless, can be put on a computer. As such, it goes back to Alan Turing in 1936.

That Alan Turing's name has not from the start been consistently associated with praise or blame for this technological revolution is due partly to his lack of effective publication in the 1940s. Science absorbs and overtakes individuals, especially in mathematics, and Alan Turing swam in this anonymising culture, never trying to make his name, although frustrated at not being taken seriously. In fact, his competitive spirit went instead into marathon running at near-Olympic level. He omitted to write that monograph on 'the theory and practice of computing', which would have stamped his name on the emergent post-war computer world. In 2000 the leading mathematical logician Martin Davis, whose work since 1949 had greatly developed Turing's theory of computability, published a book¹ which was in essence just what Turing could have written in 1948, explaining the origin of the universal machine of 1936, showing how it became the stored-program computer of 1945, and making it clear that John von Neumann must have learnt from Turing's 1936 work in formulating his better-known plan. Turing's very last publication, the Science News article of 1954 on computability, demonstrates how ably he could have written such an analysis. But even there, on terrain that was incontestably his own discovery, he omitted to mention his own leading part.