The transformative impact of the individual genius is one of the oldest stories in Western history and one of the most misleading. The accepted history of pretty much every institution has a version of it. The history of computing is still being written, but the canon of geniuses is already in place. The two best-known names in the mainstream are Bill Gates and Steve Jobs, and 2014’s Oscar-bait biopic The Imitation Game may well make Alan Turing the third. Walter Isaacson, CEO of the Aspen Institute, wrote the biography Steve Jobs. (The book is the basis for Aaron Sorkin’s screenplay Steve Jobs, which may or may not become a movie directed by Danny Boyle starring Michael Fassbender as Jobs and Seth Rogen as Steve Wozniak, depending on the aftermath of the Sony leaks.) The Jobs bio firmly places its protagonist in the “difficult men” camp. And one major problem with biographies of difficult men is that they tend to imply a connection between their subjects’ importance and their self-importance, downplaying the structural advantages that helped put them in a position to claim so much credit.
In October, Isaacson released a new book called The Innovators: How a Group of Hackers, Geniuses and Geeks Created the Digital Revolution, which became a bestseller. In it, Isaacson makes an effort to emphasize the many contributors to the history of coding and computing. The Innovators acknowledges that real history is not all big leaps conceived in “Eureka!” moments by great men; computing moved forward in dribs and drabs, through the efforts of many. The book aims to take the emphasis off exceptional individuals and put it on groups, following history as a chain of overlapping, interlocking parts rather than a series of discrete happenings. It is not entirely successful in this: it still centers on a canon of individuals and takes a very Western-centric view of computing history, and everyone in Isaacson’s pantheon is white. The book begins in industrial-era England, which makes it an abbreviated history of mechanical computers. It does not trace the history of computing machines from tally sticks and the abacus to modern machines, omitting some of the earliest innovators. I understand the choice to start with Charles Babbage, whose Difference Engine is generally recognized as the first mechanical computer, but the book still could have discussed Japanese inventors like Yazu Ryoichi (Automatic Abacus, 1902), Kawaguchi Ichitaro (Kawaguchi Style Electric Tabulation Machine, 1905), and Ohmoto Torajiro (Tiger Calculator, 1923), whose innovations spanned the long interim between Babbage and Turing. And there’s no mention of the computer built in 1944 by Tokyo Imperial University’s Aviation Lab — the same year Harvard built the Mark I — which could solve nine equations at once. The most radical aspect of the book is that Isaacson opens and closes it with Ada Lovelace, the daughter of Lord Byron, who wrote the first computer program: an algorithm designed to be carried out by Babbage’s Analytical Engine, a proposed successor to the Difference Engine.
The chapter on Lovelace is a fascinating self-contained biography. Lovelace’s mother enrolled her in math and logic classes in an effort to make sure she wouldn’t turn out like her charlatan poet father — who was, in fact, the typical embodiment of the romantic genius. Lovelace turned out a lot like him anyway. She took a passionate, romantic approach to mathematics, which she called “poetical science.” Working under the mentorship of Babbage, she became a serious student of machines, obsessed with the Difference Engine, essentially a giant mechanical calculator. After seeing a Jacquard loom that had been programmed with punched cards to weave an elaborate pattern, Lovelace realized that any sort of mathematical program could be executed the same way.
Her theoretical program for the Analytical Engine to compute Bernoulli numbers was the first program ever written to be carried out on a computing device, though it was never put to the test. Despite that early achievement, Lovelace struggled, developing a gambling problem and inventing a mathematical scheme to rig bets that instead put her deep in debt. She died of cancer at age 36 — the same age at which her father died.
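Lovelace’s program is usually described as tabulating the Bernoulli numbers. As a rough modern point of reference — a sketch, not a reconstruction of her actual procedure for the Analytical Engine — the same sequence can be generated in a few lines of Python using the classical recurrence, in which each Bernoulli number follows from a binomial-weighted sum of its predecessors:

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return the Bernoulli numbers B_0 .. B_n as exact fractions.

    Uses the classical recurrence sum_{k=0}^{m} C(m+1, k) * B_k = 0,
    solved for B_m at each step (convention B_1 = -1/2).
    """
    B = [Fraction(1)]  # B_0 = 1
    for m in range(1, n + 1):
        s = sum(comb(m + 1, k) * B[k] for k in range(m))
        B.append(-s / (m + 1))  # Fraction arithmetic keeps every value exact
    return B
```

Here bernoulli(4) yields the opening values of the sequence — 1, -1/2, 1/6, 0, -1/30 — the kind of table Lovelace’s program was designed to have Babbage’s machine grind out mechanically.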
Lovelace’s contributions to computing have come under fire, with some claiming that her work on the Analytical Engine was really done by Babbage. There is, however, no good evidence to back up these claims, and Babbage discusses her work in one of his own publications, noting that Lovelace even caught a mistake in his calculations — arguably the first computer bug. That kind of collaboration is how progress in technology actually works: she did not work alone, but neither did he. Isaacson argues that Lovelace was not a genius mathematician, but she was a tech genius in the modern sense: synthesizing ideas from radically different disciplines to forge new perspectives on the rapidly developing technology of the industrial age.1 She was among the earliest to recognize that the potential of computing machines lay beyond mathematical computation. It is not only her programming prowess that makes Lovelace an important and influential figure in computing history — it is also her futuristic, inclusive vision.
1. She recently received a shout-out from Halt and Catch Fire’s Cameron Howe, who named her BIOS “Lovelace.”
Grace Hopper is The Innovators’ other female programming lodestone, and also the subject of a new short documentary directed by Gillian Jacobs (Community, Life Partners). A spiritual descendant of Lovelace, Hopper programmed the gigantic Mark I computer in the ’40s and later shaped the COBOL programming language. (She popularized the term “bug” for a computer glitch after a moth got stuck in the Mark II.) Hopper served in the Navy for decades, eventually becoming an admiral. Like Lovelace, she saw beyond the parameters of what was considered possible at the time: she wrote the first-ever compiler in 1952, disproving critics who had said computers could only do arithmetic. The documentary makes a strong case that Grace Hopper should be every bit as much of a household name as Steve Jobs. But her story shows more than her individual importance. It becomes a story about the field itself — the difficulties of being a woman in an industry where women’s innovations are often downplayed or made invisible.
Hopper, like many women who manage to succeed in traditionally male-dominated fields, chose to act as if sexism had no effect on her life — as if she had succeeded because she was a genius programmer, which rendered her sex irrelevant. It is an understandable fantasy. Hopper may have succeeded despite her gender, but her success doesn’t erase the countless nameless women for whom success in tech was unfeasible because of systemic inequality and the presumption that women are inferior to men. “Amazing Grace” may have been uniquely talented, but there were women for whom a career like Hopper’s was impossible for reasons beyond their control.
The documentary makes the case that Hopper was brilliant, but also lucky. She was one of the first and few women to earn a PhD in mathematics from Yale, and she was fortunate enough to get the requisite male co-sign early on, although even one of the men who became her biggest advocates was at first reluctant to work with a woman. She never had children, which meant she was not required to work a domestic “double shift” at home and could instead devote herself to writing FLOW-MATIC and COBOL. In 1969, the Data Processing Management Association gave her its Computer Science Man-of-the-Year Award, an honor still called “Man-of-the-Year” even though Hopper was its first recipient. But even as Hopper promoted the idea that she had succeeded on the basis of talent and hard work alone, she worked to make the field more inclusive, becoming a mentor to many young people training to be programmers as she aged. She retired at 79 at the rank of rear admiral, then the oldest active-duty officer in the Navy. Her persistence and prolific contributions to computer science built her well-deserved reputation as the mother of computing.
I don’t want to make any kind of sweeping argument about what kind of programming innovations women contribute to history, because programming is not a gendered activity. The brain is not a sexed organ. But as soon as we start talking about women and computers, the stereotype that women are not suited to science, technology, engineering, and math (STEM) disciplines floats into play.
Kieran Snyder wrote about the pervasiveness of sexism in Silicon Valley for the Washington Post recently. She noted that women in the tech industry, as in any industry where they are a minority, have to pretend their own gender is invisible, while being expected to shake off casual sexism. Snyder wrote: “Why do any women stick around, especially in an industry where ‘will it work for your mom?’ is geek-speak for ‘will dummies like it too?’” Those who object to this kind of language are labeled “difficult women” — and difficult women can never get away with the kind of dickish behavior Steve Jobs could, which is one reason they often try to just brush off and ignore their own feelings.
When computing was considered drudgery, women played a significant role: before machines existed that could do the work, they were hired as human “computers” to carry out math problems and solve equations by hand. During World War II, six of them — Betty Jean Jennings, Kay McNulty, Betty Snyder, Ruth Lichterman, Fran Bilas, and Marlyn Wescoff — were drafted into service as the programmers of ENIAC, the Electronic Numerical Integrator and Computer. Snyder went on to write SORT/MERGE, the first generative programming system. The women of ENIAC did much of the work but received little credit for it; the Army downplayed their involvement. Once programming came to be seen as a creative art rather than a rote secretarial one, women were not as welcome. (The Innovators also covers the women of ENIAC in detail, and discusses exactly how programming evolved from rote switch-flipping into an intellectual endeavor.)
Women in tech today are taking a more direct approach to confronting gender inequality. Rooting out the exact causes and culprits that keep women on tech’s sidelines is difficult, because most forms of prejudice are deeply ingrained and subtly enforced. The solution, at least in part, may come from increasing the visibility of the problem. Tracy Chou, a Pinterest programmer and rising star in tech, has begun asking companies to release the data on their own internal makeup so that it can be tracked. The dismal statistics — women making up 17 percent of the workforce in technology- or engineering-related jobs at Google, 15 percent at Facebook, 9 percent at Mozilla — demonstrate that female engineers and programmers who felt alienated and underrepresented were not imagining things. To combat the concept of the tech bro, there must be a tech sisterhood. Tech history is not a chain of command; it’s a crazy quilt, and no machine is ever really built by one person alone. It would be a mistake to consider Ada Lovelace and Grace Hopper lone geniuses, the same way it is a mistake to think that way of the men. But they are two avatars of what women mean to tech — proof that natural talent knows no type. To that end, women are looking back to one of their own: the biggest conference promoting and connecting women in computing is called the Grace Hopper Celebration. Perhaps together they can change the line of logic. After all, computers can be reprogrammed.