In 1801, the Frenchman Joseph-Marie Jacquard invented the automated loom, which changed the fortunes of the silk-weaving industry. The loom was a scientific wonder. It showed that the whole exercise of weaving could be ‘programmable’. Until then, looms had followed a manual process: a pattern was created by using hooks to lift selected warp threads, after which a rod pushed the weft (or ‘woof’) thread underneath.

The Jacquard loom, however, used a series of cards with punched holes to control this process. The holes determined which hooks and rods would be activated for each pass of the weave, thus automating the creation of intricate patterns. Each time the shuttle was thrown for a new pass of the thread, a new punched card would come into play. It was a seemingly simple, utilitarian result supported by an enchantingly complex idea. Jacquard, in fact, developed the loom by building on the earlier work of the inventors Basile Bouchon, Jean-Baptiste Falcon and Jacques de Vaucanson.
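To make the mechanism concrete, here is a minimal sketch in Python of how a chain of punched cards could drive the hooks pass by pass. The card layout and the four-thread width are invented for illustration and are not a description of any historical card format; real Jacquard cards controlled hundreds of hooks.

```python
# Each card is a row of booleans: True means a hole is punched there,
# so the corresponding hook lifts its warp thread for this pass.
# The card data is invented for illustration.
cards = [
    [True,  False, True,  False],   # pass 1: lift warp threads 0 and 2
    [False, True,  False, True],    # pass 2: lift warp threads 1 and 3
    [True,  True,  False, False],   # pass 3: lift warp threads 0 and 1
]

def weave(cards):
    """Simulate one throw of the shuttle per card."""
    for pass_number, card in enumerate(cards, start=1):
        lifted = [i for i, hole in enumerate(card) if hole]
        # The shuttle carries the weft thread under the lifted warps.
        print(f"pass {pass_number}: lift warp threads {lifted}, throw shuttle")

weave(cards)
```

Changing the woven pattern means nothing more than swapping in a different chain of cards, which is precisely what made the loom ‘programmable’.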

The Jacquard loom would influence inventors of myriad hues in the years to come. Among them was an Englishman, Charles Babbage, who was so enamoured of the idea of using punched cards to control intricate patterns that he decided to borrow it for a complex device he had long fancied building: the Analytical Engine, an idea he conceived in 1834.

Like all great inventors, Babbage possessed an uncanny ability to combine “innovations that had cropped up in other fields”, writes Walter Isaacson in The Innovators. Babbage realised he could use punched cards in place of the steel drums in his Analytical Engine to feed an unlimited number of instructions to the machine. The Engine was conceived as a general-purpose computer that would carry out a variety of operations based on the programming instructions given to it, exactly what a computer does today.
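As a loose illustration of that idea, one can picture a machine that reads a sequence of instruction cards and dispatches on each. The toy instruction set and single accumulator below are invented simplifications, not Babbage's actual design, which separated a ‘store’ (memory) from a ‘mill’ (arithmetic unit).

```python
# A toy card-driven machine: each 'card' names an operation and an operand.
# The instruction names and the accumulator are invented for this sketch.
program = [
    ("LOAD", 6),     # accumulator = 6
    ("ADD", 7),      # accumulator += 7
    ("MUL", 2),      # accumulator *= 2
    ("PRINT", None),
]

def run(program):
    acc = 0
    for op, arg in program:          # one 'card' per step, read in sequence
        if op == "LOAD":
            acc = arg
        elif op == "ADD":
            acc += arg
        elif op == "MUL":
            acc *= arg
        elif op == "PRINT":
            print(acc)               # prints 26
        else:
            raise ValueError(f"unknown operation: {op}")

run(program)
```

The point of the sketch is that changing the cards changes the computation, while the machine itself stays the same: the essence of a general-purpose computer.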

Ahead of their time

Babbage’s machine was an idea a hundred years ahead of its time, Isaacson notes. But he found few takers for his seemingly utopian idea, except one: Ada, the Countess of Lovelace. Isaacson paints a strikingly vivid portrait of the relationship the ‘father of the modern computer’ shared with the countess, a mathematical genius who was the first to envision the modern computer.

Born on December 10, 1815, Ada was the only legitimate child of the legendary English poet Lord Byron. Her notes appended to her translation of a study of Babbage’s Engine, written in French by a (then) young Italian engineer called Luigi Menabrea, became more famous and coveted than the study itself. Menabrea later became the prime minister of Italy. It was in these notes that Ada discussed the idea of a general-purpose machine, which later evolved into the modern computer.

Many rightly believe her work in computing and mathematics didn’t get the attention it deserved, even though over the years she was celebrated as a feminist icon and a computer pioneer, especially in the initial decades of the digital revolution. Hers could be one of the first, and perhaps the most important, cases of the criminal neglect the computing industry later became notorious for.

To his credit, Isaacson devotes a significant amount of real estate to sketching Ada’s eccentric life, though one gets the sense that he is not thoroughly convinced of her genius. “Whether due to her opiates or her breeding or both”, Isaacson writes, Ada developed an outsize opinion of her own talents and started describing herself as a “genius”. Interestingly, in his biography of Steve Jobs (2011), Isaacson appeared to be in awe while describing Jobs’ eccentric and self-obsessed ways.

The collaborators

In his journey tracing the evolution of the digital revolution, Isaacson introduces an ensemble cast, many known and some unknown. There are the usual suspects such as Bill Gates and Steve Jobs, alongside a refreshing multitude of lesser-known faces that makes The Innovators the most engaging account of the history of the digital era. The depth of Isaacson’s research is, to say the least, spectacular. And his narration is dexterously non-linear, making the book an intelligent reader’s delight.

A former managing editor of Time magazine, he obviously has the probing eye of a curious journalist while maintaining a philosopher’s detachment from the events and ideas he discusses. He is bewitchingly lucid even when describing the most complex concepts in computing. His profiles of hackers and geeks beam with truthfulness and a striking nonchalance.

Unlike the industrial revolution, which was made possible to a large extent by individual brilliance and resilience (that of James Watt or George Stephenson, for instance), the digital revolution, despite the cult figures that pop up in most accounts, was a collaborative effort.

A continuing journey

There were leaders such as Jobs or Gates, but there were no “solo inventors suitable to be singled out on magazine covers or put into a pantheon with Edison, Bell and Morse”. Most of the key inventions of the digital era came from the collaboration of hackers, inventors and entrepreneurs.

Even though Isaacson declares at the outset that it is the story of collaboration he is trying to capture, he ends up highlighting many cult figures. But that is pardonable, given that each character, representing an epochal event in the history of computing, stands testimony to the camaraderie and unity that propelled digital inventions. While narrating their stories, Isaacson captures how each invention in the digital universe has helped humanity leapfrog on many fronts, and how these devices, services and events have made complex tasks simpler and more enjoyable.

Technology, especially computing technology, has made human life a highly entertaining affair. In the second half of the past century, and especially at its tail end, computers and the internet began influencing human life like no other force before them.

In the past decade or so, we have seen the world wide web, invented by Tim Berners-Lee (who, like Gates and Jobs, was born in 1955), and the internet (an offshoot of a military communication network) redefine the contours of nations, trigger revolutions and challenge militaries. Computing technology has even given us a new global currency (Bitcoin), the world’s first free, open repository of information (Wikipedia) and many more services and devices.

This is a continuing journey, an unstoppable revolution. And as Isaacson notes, what is making this revolution even more enchanting and likeable is the interplay of art and science in it.

And as Ada, Countess of Lovelace, believed, and as Isaacson sums up, innovators are people who can link beauty to engineering, humanity to technology, and poetry to processors. This chemistry will spring more surprises in the years to come.
