Dennis Ritchie (standing with Ken Thompson in the 1973 photo above) is a revered figure in the history of computing. But before he became a legend for his contributions to the world of operating systems and programming languages, Ritchie was a humble graduate student in applied mathematics at Harvard University, spending his share of time playing videogames and arguing with the university library about the cost of binding an academic paper.
Recently the Computer History Museum in Silicon Valley caught a glimpse of this forgotten moment in time, rediscovering a copy of Ritchie’s doctoral dissertation, which had been presumed lost for over half a century. Written in 1968, when he was just 27 years old, the paper is a chance to peek at the earliest days of computer science, to understand the challenges faced by the pioneers who came before us, and to appreciate an intellect that left behind a legacy we’re still building on today. But maybe it’s also a reminder of just how far we’ve come, and how much technology itself can change over the course of a single lifetime.
The news came in an announcement on the museum’s blog by technology historian David C. Brock, the director of its Software History Center. Though it emphasizes source code, the center “seeks to put history to work today in gauging where we are, where we have been, and where we are heading.”
In the post, Brock notes the museum’s Dennis Ritchie collection, which includes some of the earliest Unix source code, dating from 1970 to 1971, and points out that it now also includes “a fading and stained photocopy” of Ritchie’s doctoral dissertation, “Program Structure and Computational Complexity.”
It also includes a cleaner digital scan of a copy of the manuscript owned by Ritchie’s graduate school friend, Albert Meyer.
The paper’s 28 footnotes include one citing Alan Turing’s 1936 paper “On computable numbers, with an application to the Entscheidungsproblem.”
“Recovering a copy of Ritchie’s lost dissertation and making it available is one thing,” jokes the museum’s blog post, “understanding it is another.”
Doing the Math
Back in the 1930s, Alan Turing (and Kurt Gödel before him) had tried to identify the ultimate limits of mathematics, necessarily touching on which kinds of questions were in fact computable. “In the decades that followed, and before the emergence of computer science as a recognized discipline, mathematicians, philosophers, and others began to explore the nature of computation in its own right, increasingly divorced from connections to the foundation of mathematics,” explained the museum’s blog post.
The post later describes this arcane area of interest as “What could be proven about the realm of possible computations?” and notes applied mathematics departments like Harvard’s were “one of few places where these new investigations were taking place in the mid-1960s.”
MIT professor Albert Meyer, who’d gone to graduate school with Ritchie, recently gave the museum an oral history interview about their experiences together, and remembers their advisor, Patrick Fischer, had been interested in “what made things hard? What made things easy? What kinds of things could different kinds of programs do?”
Meyer remembered that “This was a time when… well, first of all, there were hardly even any computer science departments in the country then. This was in the early 1960s. And there was just coming to be a vision that there was something special about computation.”
Also at their grad school was Stephen Cook, who did some of the original work on what Meyer calls “the P = NP question”: roughly, whether every problem whose solutions can be efficiently verified can also be efficiently solved (the NP stands for “nondeterministic polynomial time”). But Meyer shares an interesting detail. “Prior to that we were very focused on this issue of ‘How do you recognize problems that can be solved by computation but there’s no efficient way to do it…?’ And that was, those papers by Dennis and me, were exactly about that…
“The insight that comes from both Dennis’s work and my work and our joint work together was that in a certain sense, the trick to proving it and understanding what’s going on is to get away from the syntax and realize that it’s simply talking about how long the computations are allowed to run.”
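Meyer’s point, that the essential structure lies not in a program’s syntax but in how long its computations may run, is the idea behind what are now called time-bounded complexity classes. In modern notation (a standard illustration, not notation from the dissertation itself):

```latex
% Time-bounded complexity classes, in the standard modern framing.
% DTIME(f(n)) is the set of problems decidable by a deterministic
% program running in time O(f(n)) on inputs of length n;
% NTIME(f(n)) is the nondeterministic analogue.
\[
  \mathrm{P} \;=\; \bigcup_{k \ge 1} \mathrm{DTIME}\!\left(n^{k}\right),
  \qquad
  \mathrm{NP} \;=\; \bigcup_{k \ge 1} \mathrm{NTIME}\!\left(n^{k}\right).
\]
% The P vs. NP question asks whether these two unions coincide.
```

The classes are defined entirely by running-time bounds, with no reference to the syntax of the programs involved.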
It ultimately became the subject for Ritchie’s dissertation, which the museum calls “the intellectual and biographical fork-in-the-road separating an academic career in computer science from the one at Bell Labs leading to C and Unix.”
Beginning at Bell Labs
After Dennis Ritchie secured his desired position at Bell Telephone Labs, he remained there for the rest of his career.
And Ritchie remained remarkably productive. Besides co-creating Unix with Ken Thompson (both pictured on the right), Ritchie also created the C programming language, a pair of achievements that the Computer History Museum calls “a fundamental dyad of the digital world that followed.” But he left a legacy in other fundamental ways. Ritchie also co-authored the language’s definitive documentation, the 1978 book “The C Programming Language,” with Brian W. Kernighan, a book so famous it’s often referred to by its authors’ initials as “the K&R book.” To this day programmers still identify a specific code-formatting convention, placing the opening brace of a function definition on its own line while keeping other opening braces on the same line as their statement, as “K&R style.”
Given this, his stint in academia became something that Ritchie liked to play down. “My undergraduate experience convinced me that I was not smart enough to be a physicist, and that computers were quite neat,” Ritchie quipped later on his biography page at Bell Labs. “My graduate school experience convinced me that I was not smart enough to be an expert in the theory of algorithms and also that I liked procedural languages better than functional ones.”
But the Computer History Museum sees a different story. “While his predilection for procedural languages is without question, our exploration of his lost dissertation puts the lie to his self-assessment that he was not smart enough for theoretical computer science. More likely, Ritchie’s graduate school experience was one in which the lure of the theoretical gave way to the enchantments of implementation, of building new systems and new languages as a way to explore the bounds, nature, and possibilities of computing.”
A Paper Chase
After Ritchie’s death in 2011, just one week after the death of Steve Jobs, an honorary award was set up by the Association for Computing Machinery’s Special Interest Group on Operating Systems. But the founders of the Dennis M. Ritchie Doctoral Dissertation Award soon discovered an oddity: Ritchie himself had never been awarded his Ph.D.
In his web page at Bell Labs, Ritchie had finessed the issue by writing “The subject of my 1968 doctoral thesis was subrecursive hierarchies of functions.”
The story was told by Robbert van Renesse, who had worked under Ritchie in 1990 on the distributed operating system Plan 9 at AT&T Bell Labs in Murray Hill.
In an article in the ACM SIGOPS Operating Systems Review, van Renesse remembers how he’d started investigating whether Ritchie could be awarded the degree posthumously, soon learning that Bell Labs (by then part of Alcatel-Lucent) had tried the same thing, only to discover that the university’s dean had stood firm. “At this point I threw in the towel, knowing that Dennis Ritchie actually would have enjoyed the irony of it all.”
But Ritchie’s classmate Meyer tells a story he’d heard from their advisor, Patrick Fischer. “As Pat tells the story, Dennis had submitted his thesis. It had been approved by his thesis committee, he had a typed manuscript of the thesis that he was ready to submit when he heard the library wanted to have it bound and given to them. And the binding fee was something noticeable at the time… not an impossible, but a nontrivial sum.” Dennis felt the library should pay for the bound copy that they themselves were going to keep, as Meyer heard the story, “And apparently, he didn’t give up on that. And as a result, never got a Ph.D.
“So he was more than ‘everything but thesis.’ He was ‘everything but bound copy.’”
The museum also got a different view from Dennis’s brother John, who remembered Dennis already had the job he’d dearly wanted as a researcher at Bell Labs — adding that Dennis “never really loved taking care of the details of living.”
In fact, though they’d gotten to work on a few papers together, Meyer remembers feeling that “I would have loved to collaborate with him, because he seemed like a smart, nice guy who’d be fun to work with, but yeah, you know, he was already doing other things. He was staying up all night playing Spacewar!” (Spacewar! being the pioneering 1960s videogame installed on Harvard’s PDP-1 minicomputer.)
But the Association for Computing Machinery went ahead and created the Dennis M. Ritchie Doctoral Dissertation Award, first presented at the 2013 Symposium on Operating Systems Principles. In a sign of how far technology has advanced, its inaugural honorable mention went to Roxana Geambasu, who went on to become an assistant professor of computer science at Columbia University, for her University of Washington thesis “Regaining Control over Cloud and Mobile Data.” And the 2013 winner was Mona Attariyan, a Google senior software engineer (now head of analytics at Redfin), for her University of Michigan thesis “Improving Software Configuration Troubleshooting with Causality Analysis.”
“Although Dennis Ritchie still does not have a Ph.D.,” wrote van Renesse, “I am extremely pleased with the outcome, as I am certain he would have been as well.”
And it fell to the Computer History Museum to fill in the final pieces. “After Dennis Ritchie’s death in 2011, his sister Lynn very caringly searched for an official copy and for any records from Harvard,” they remember in their blog post. “There were none, but she did uncover a copy from the widow of Ritchie’s former advisor.”
“Until very recently then, across a half-century, perhaps fewer than a dozen people had ever had the opportunity to read Ritchie’s dissertation.”
- Tinkerer makes his own Mars rover with a 3D printer.
- Robot maker Boston Dynamics begins selling its four-legged robot Spot.
- Purism releases a new line of “beautiful, secure privacy-respecting laptops.”
- The New Yorker explores how the Sony Walkman changed the world.
- Visualizing data with a soldering iron and APIs: makers build a real-time subway-tracking map on a circuit board.
- Will humans prefer an AI’s answer to philosophical questions?
- A Stanford economist ponders whether the pandemic will permanently change our workplaces and our cities.
- Entrepreneur predicts new kinds of experimental startup anti-cities.
- The Wall Street Journal wonders how commutes will change.
Feature image via Wikipedia.