Jim’s Reading Corner is a reading list to stimulate debate in which our Secretary-General Jim Cloos analyses and reviews books of interest to Europe. From the unique perspective of a lifetime EU practitioner, Jim gives his comment on books, articles, long-reads, and more – and tackles the leading issues of the day. Today’s book is “The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution”, by Walter Isaacson.
Recommended to me by Tony Gardner. Fascinating account of one of the great adventures of the twentieth century, the invention of the computer and everything that goes with it. Sadly, this is an almost exclusively American adventure; it is an object lesson in how to foster and promote innovation. This is an area where Europe has a lot to learn from the USA; we have the brains, the ability and the technology, but we seem to be lacking the drive, the dynamism, the cross-fertilisation, the risk capital, and the courage to take risks that characterise American society. We also lack a unified military industry that pushes innovation. Time to take a fresh look at how to develop a coherent policy and a framework allowing innovation to happen and thrive. A nice challenge for the upcoming Tallinn summit on digital Europe.
Practically all of the major steps are associated not with one name but with pairs of names and often teams. Good ideas arise from discussion, from comparing ideas, from bringing together people from different horizons. At the heart of the digital era, as happened with the Industrial Revolution, were innovators who combined imagination and passion with a profound knowledge of technology. “The sparks come from ideas rubbing against each other rather than as bolts out of the blue sky.”
One of the horizontal questions that arose early on concerned the issue of intellectual property and patents. Isaacson phrases it in the following way: “Should intellectual property be shared freely and placed whenever possible into the public domain and open-source commons? That course, largely followed by the developers of the Internet and the Web, can spur innovation through the rapid dissemination and crowdsourced improvement of ideas. Or should intellectual property rights be protected and inventors allowed to profit from their proprietary ideas and innovations? That path, largely followed in the computer hardware, electronics, and semiconductor industries, can provide the financial incentives and capital investment that encourages innovation and rewards risks.”
The origins
The origins of the computer idea date back to the first half of the nineteenth century, with the work of Charles Babbage (1791-1871), who invented the Difference Engine. He drew on the work of the French mathematician Gaspard de Prony who, in the 1790s, came up with the idea, in order to create logarithm and trigonometry tables, of breaking down the operations into very simple steps that involved only addition and subtraction. In fact, he created an “assembly line” on the model of the great industrial-age innovation that was memorably analysed by Adam Smith. After a trip to Paris, Babbage wrote: “I conceived all of a sudden the idea of applying the same method to the immense work with which I had been burdened and to manufacture logarithms as one manufactures pins”. The Difference Engine helped to mechanise this process.
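To make the trick concrete, here is a minimal sketch in Python (my own illustration, not from the book): once the first value of a polynomial table and its finite differences are known, every further entry can be produced by additions alone, which is exactly the process the Difference Engine mechanised.

```python
# De Prony's and Babbage's trick: tabulate a polynomial using only additions.

def difference_engine(f, degree, start, count):
    """Tabulate f (a polynomial of the given degree) by repeated addition."""
    # Seed the engine with the initial value and its finite differences.
    column = [f(start + i) for i in range(degree + 1)]
    diffs = []
    while column:
        diffs.append(column[0])
        column = [b - a for a, b in zip(column, column[1:])]
    # diffs[0] = f(start), diffs[1] = first difference, ..., diffs[degree] = constant

    values = []
    for _ in range(count):
        values.append(diffs[0])
        # One pass of additions produces the next table entry.
        for i in range(degree):
            diffs[i] += diffs[i + 1]
    return values

# The polynomial x^2 + x + 41, often cited as Babbage's demonstration example:
print(difference_engine(lambda x: x * x + x + 41, degree=2, start=0, count=5))
# -> [41, 43, 47, 53, 61], each new value obtained by additions only
```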
But there was another and more unexpected key figure who played a major part in what was later to become the era of the computer: Ada, Countess of Lovelace, the daughter of Lord Byron. She combined a literary mind with strong mathematical skills, and she was a visionary. One of the key insights she explained in her writings was to become the core concept of the digital age: any piece of content, data, or information (music, text, pictures, numbers, symbols, sounds, video) could be expressed in digital form and manipulated by machines. Even Babbage failed to see this fully; he focused on numbers. Ada is also famous for having set out what Alan Turing named “Lady Lovelace’s Objection”, i.e. that the analytical engine had no pretensions whatever to originate anything and that machines could become partners of humans, but not replace them. In 1950, Alan Turing tried to prove her wrong by providing an operational definition of a thinking machine: that a person submitting questions could not distinguish the machine from the human. More than sixty years later, Ada’s objection still holds; no machine has cleared her higher bar of being able to “originate” any thoughts of its own. (I just bought the latest book by Garry Kasparov, who talks about the man-machine relationship!)
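As a trivial illustration of Ada’s point (my own example, not Isaacson’s): letters, colours and sound levels are all just numbers, and numbers are just patterns of binary digits that one and the same machine can manipulate.

```python
# Letters, colours and sound levels all reduce to numbers, and numbers to bits.
text = "Ada"
pixel = (155, 89, 182)       # one colour of a picture, as red/green/blue values
sample = 0.7071              # one amplitude sample of a sound wave, between 0 and 1

def as_bits(n, width=8):
    return format(n, f"0{width}b")

print([as_bits(b) for b in text.encode("utf-8")])   # text  -> bytes -> bits
print([as_bits(v) for v in pixel])                  # image -> numbers -> bits
print(as_bits(int(sample * 255)))                   # sound -> number -> bits
```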
1937, annus mirabilis
But then not much happened for about a hundred years. It was only in the 1930s that history accelerated, with a number of revolutionary ideas and inventions that heralded the triumph of four properties, somewhat interrelated, that would define modern computing: digital, binary, electronic, general purpose:
- The creation of IBM in 1924 and the first major use of electrical circuits to process information
- Vannevar Bush’s creation of the Differential Analyzer (1931), an analogue electromechanical computer.
- The use by Tommy Flowers of vacuum tubes as on/off switches in electric circuits (1935)
- The publication in 1937 by the British scientist Alan Turing of “On Computable Numbers”, in which he set out the idea of a universal computer.
- The intuition by Claude Shannon that circuits could do Boolean algebra (see the short sketch after this list)
- The development by G. Stibitz at Bell Labs of a calculator using electronic circuits
- The proposal by Howard Aiken of a large digital computer (what would become the Harvard Mark I)
- The founding by Hewlett and Packard of their company in Palo Alto.
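Shannon’s insight deserves a small sketch of its own (my illustration, in modern Python rather than relay circuits): treat each switch as a Boolean variable, and a circuit of switches becomes Boolean algebra; wire a few such gates together and the circuit does arithmetic.

```python
# Switches as Boolean variables: on = 1, off = 0.
def AND(a, b): return a & b      # two switches in series
def OR(a, b):  return a | b      # two switches in parallel
def XOR(a, b): return a ^ b      # "one or the other, but not both"

def full_adder(a, b, carry_in):
    """Add three bits using nothing but gate operations."""
    partial = XOR(a, b)
    total = XOR(partial, carry_in)
    carry_out = OR(AND(a, b), AND(partial, carry_in))
    return total, carry_out

# 1 + 1 + carry 1 = binary 11 (sum bit 1, carry bit 1)
print(full_adder(1, 1, 1))   # -> (1, 1)
```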
With the war looming, the military became more and more interested in these ideas. Many of the people mentioned above started working directly for the military, sometimes on different issues; thus Turing went to Bletchley Park to help crack the Nazi codes. In Germany itself, Konrad Zuse built a programmable digital computer. Von Neumann, who had been working with the Harvard Mark I, helped develop ENIAC (1945) at the University of Pennsylvania; the beast weighed thirty tons. A lot of the programming work was actually done by a group of women programmers. ENIAC, completed by Presper Eckert and John Mauchly, was the first machine to incorporate the full set of traits of a modern computer. It was all-electronic, superfast, and could be programmed by plugging and unplugging the cables connecting its different units. It was capable of changing paths based on interim results, and it could be qualified as a general-purpose Turing-complete machine. It was not, however, binary.
The next stage was reached with EDVAC, a stored-program computer as conceived by Alan Turing and delivered by von Neumann, which did away with the distinction between numbers that mean things and numbers that do things.
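The following toy sketch (purely illustrative; it is not EDVAC) shows what that means in practice: the program and the data live in the same memory, so whether a number “does” something or “means” something depends only on whether the machine happens to fetch it as an instruction.

```python
# A toy stored-program machine: instructions and data share one memory.
memory = [
    ("LOAD", 6),    # 0: load the number in cell 6 into the accumulator
    ("ADD", 7),     # 1: add the number in cell 7
    ("STORE", 8),   # 2: store the result in cell 8
    ("PRINT", 8),   # 3: print cell 8
    ("HALT", 0),    # 4: stop
    None,           # 5: (unused)
    2,              # 6: data
    3,              # 7: data
    0,              # 8: result goes here
]

pc, acc = 0, 0                      # program counter and accumulator
while True:
    op, addr = memory[pc]           # fetch the next instruction *from memory*
    pc += 1
    if op == "LOAD":    acc = memory[addr]
    elif op == "ADD":   acc += memory[addr]
    elif op == "STORE": memory[addr] = acc
    elif op == "PRINT": print(memory[addr])   # -> 5
    elif op == "HALT":  break
```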
The advent of the transistor, the microchip and the microprocessor
Then came a truly revolutionary advance, with the invention of the transistor by Walter Brattain, John Bardeen and William Shockley at Bell Labs. In December 1947, scientists at Bell Labs succeeded in putting together a tiny contraption they had concocted from some strips of gold foil, a chip of semiconducting material, and a bent paper clip. When wiggled just right, it could amplify an electric current and switch it on and off. This was the real birth of the digital era. The transistor replaced the expensive vacuum tubes. It came about because of progress in solid-state physics (the way electrons flow through solid matter) and quantum physics (which is about what goes on within an atom, with electrons making quantum leaps between orbits). The silicon transistor came about in 1954, at Texas Instruments.
In 1957, Noyce and Moore, who had worked with Shockley, split with him and created Fairchild Semiconductor. They and Texas Instruments independently achieved the next step with the microchip. Everything was put on one chip: resistors, capacitors, transistors (multiple transistors on one chip, connected to the circuit by printed copper wires). The first microchip was actually used by the army, in the Minuteman II missile. The first civil use was in pocket calculators (Pat Haggerty). Arthur Rock, who had already financed Fairchild, now invested in Intel, created by Noyce, Moore and Andy Grove in Silicon Valley. Moore remarked that “integrated circuits will lead to such wonders as home computers, or at least terminals connected to a central computer, automatic controls for automobiles, and personal portable communications equipment.”
The first microprocessor was invented by Ted Hoff and his team in 1971, the Intel 4004, a general-purpose chip programmed to do all kinds of things, like a small computer on a chip. Intel had developed it for a firm called Busicom but insisted on retaining the rights to the new chip and on being allowed to license it to other companies! It was a deal that Bill Gates and Microsoft would emulate a decade later with IBM; this ushered in a shift in the electronics industry: the importance of hardware engineers, who designed the placement of the components on a circuit board, began to be supplanted by a new breed of software engineers, whose job it was to program a set of instructions into a system.
The microprocessor allowed the advent of personal and interactive computers and the video game industry. The key developer of the latter was a guy called Nolan Bushnell who had what is required to succeed with innovation: a great idea, the engineering talent to execute it, and the business savvy (plus money) to turn it into a successful product. His new venture, called Atari (from the game of Go), grew out of a hacker culture that highlighted three themes of the digital age: games created collaboratively, free and open-source software, and the belief that computers could be personal and interactive.
From networks to the Internet
But computers were isolated machines, with their own limited networks. The idea arose of de-centralised networks and interfaces and of a far greater man-computer symbiosis. One of the fathers of the internet was Joseph Licklider, who in 1960 had published a revolutionary paper called “Man-Computer Symbiosis”. “The hope is that in not too many years human brains and computing machines will be coupled together very tightly, and that the resulting partnership will think as no human brain has ever thought and process data in a way not approached by information-handling machines we know today.”
Bob Taylor and Larry Roberts invented ARPANET, a network of research computers. This was greatly helped by another invention, that of packet-switching (D. Davies) and distributed control. Packet switching is a special type of store-and-forward switching in which messages are broken into small units of exactly the same size, called packets, which are given address headers describing where they should go.
ARPANET then morphed into the Internet, a network of networks. Kahn and Cerf elaborated the first Internet Protocol (IP), governing how packets travel. TCP (Transmission Control Protocol) made it possible to put packets that had been transmitted in small pieces and by different routes back together again (so TCP/IP = a protocol for network interconnections).
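A minimal sketch (my own illustration, with made-up addresses) of what that means: a message is chopped into equal-sized pieces, each piece gets a header saying where it should go and where it belongs in the sequence, the pieces travel independently, and a TCP-like receiver puts them back in order at the other end.

```python
# A sketch of packet switching and TCP-style reassembly.
import random

def to_packets(message, destination, size=8):
    """Chop a message into fixed-size pieces, each with an address header."""
    chunks = [message[i:i + size] for i in range(0, len(message), size)]
    return [{"dst": destination, "seq": n, "payload": c} for n, c in enumerate(chunks)]

def reassemble(packets):
    """A TCP-like receiver: sort the pieces by sequence number and rejoin them."""
    return "".join(p["payload"] for p in sorted(packets, key=lambda p: p["seq"]))

packets = to_packets("Message from ARPANET to the Internet", "10.0.0.2")
random.shuffle(packets)        # packets may arrive out of order, via different routes
print(reassemble(packets))     # -> Message from ARPANET to the Internet
```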
The personal computer and the software revolution
The idea of the personal computer had been around for some time. V. Bush already thought of it in 1945. The hippie and libertarian revolution in the 1960s gave it a strong push, with the idea of “back-to-the-land and nature combined with technological change that empowered individuals.” The mouse, invented by Engelbart and B. English in the 1960s, helped in this respect, too, as did the progress in graphical interfaces (Alan Kay, at Xerox PARC in 1970). The first working PC was the ALTAIR 8800 (1975), developed by Ed Roberts; the name was suggested by his daughter, who was a Star Trek junkie!
Next came the software revolution. Two young guys, Bill Gates and Paul Allen, decided to write a version of BASIC for the ALTAIR made by MITS in 1975. Gates had the brilliant idea to insist on retaining ownership; he would do the same with IBM later on. His intuition was that hardware and software could be separated, and what he was interested in was the software that could be marketed to various computers. That was a very different approach from the one chosen by Apple (Steve Wozniak and Steve Jobs), who developed their personal computer at about the same time. Their Apple I and Apple II computers were fully integrated, very elegant and simple to use, combining keyboard, monitor and computer in one package. The spreadsheet VisiCalc contributed to the success of the Apple II; on the basis of ideas first seen at Xerox PARC, Jobs then set out to build a graphical interface for the Macintosh. Gates and Jobs did a deal on that graphical software, with Gates promising not to release competing graphical products before a certain period. But as Jobs was late in delivering the Macintosh (1984), Gates was perfectly entitled to jump the gun in 1983 with Windows. Their friendship did not survive that episode. (At the same time, other people pursued a very different approach from both Gates and Jobs; Richard Stallman and the Finn Linus Torvalds developed free and open-source software.)
The online revolution and the creation of the World Wide Web
Both the Internet and the PC were born in the 1970s, but they grew apart from each other, with hardly any link. This changed in the late 1980s with the arrival of the online revolution. Email had been invented in 1972 (Ray Tomlinson), thus allowing the first virtual communities. With the spread of modems for personal computers in the early 1980s, a connection could be created between the PC and the phone networks. A first prototype of an online community was the WELL, but the revolution came with AMERICA ONLINE (Steve Case and William von Meister). It is interesting to note that it was only in 1992, and after heavy lobbying with the active support of Al Gore, that open access to the internet was authorised.
Then entered Tim Berners-Lee, who worked at CERN in Geneva. With a Belgian engineer called Robert Cailliau he created the sandpit where everybody could play: the WEB. This required the creation of hypertext which, combined with the Internet, created the Web; hypertext made it possible to create the connections Berners-Lee wanted to make. URLs (uniform resource locators, like http://www.cern.ch) were created to address resources. HTML (hypertext markup language) and HTTP (hypertext transfer protocol) make it possible to create and transfer pages. A first web browser was developed by Andreessen; it was called Mosaic. The WWW quickly hit orbital velocity. One of the problems that would cause difficulties later on was that these were one-way links, not two-way links, which did not allow for small automatic payments. Internauts got used to having everything for free, which created problems.
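To make those acronyms slightly more tangible, here is a small sketch (illustrative only; the path in the URL is made up): the URL names the resource, the HTTP request is what a browser sends to fetch it, and the HTML that comes back is hypertext, i.e. text carrying machine-readable links to further URLs.

```python
# Illustrative only: take apart a URL and show the HTTP request a browser
# would send for it, plus a minimal snippet of the HTML hypertext it returns.
from urllib.parse import urlparse

url = urlparse("http://www.cern.ch/hypertext/WWW/TheProject.html")  # path is made up
print(url.scheme, url.netloc, url.path)
# -> http www.cern.ch /hypertext/WWW/TheProject.html

# The (simplified) HTTP request a browser would send to that host:
request = f"GET {url.path} HTTP/1.0\r\nHost: {url.netloc}\r\n\r\n"
print(request)

# And a minimal piece of HTML: hypertext is just text with links to other URLs.
page = '<p>See the <a href="http://info.cern.ch">first web page</a>.</p>'
print(page)
```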
Web logs then appeared. The first one was created by Justin Hall, who kept a running web log of his personal activities. It became the first new form of content to be created for and take advantage of PC networks. Web logs were then jocularly called “we-blog” by Peter Merholz, which gave us the term blog. Ev Williams developed Blogger, a tool to help people write blogs, which was then sold to Google (Larry Page and Sergey Brin). Google, from googol (the number 1 followed by a hundred zeros), was an automated and very sophisticated search engine, which quickly became the primary method for finding things on the web. Twitter followed, as well as the “wiki”, a medium that allowed users to modify Web pages, not by having an editing tool in their browser but by clicking and typing directly onto Web pages that ran wiki software. The invention of these open, interactive, and rewritable texts led to the creation of Wikipedia. And of course all the rest: Facebook, LinkedIn and probably many more things to come.
All in all, a fascinating account of a digital revolution that has transformed our lives. This summary only provides a very sketchy and a bit boring overview. What is missing, among other things, are the very lively descriptions of the individual itineraries of the “innovators”, the little stories to illustrate important points, the chance encounters that led to breakthroughs, and all of that.