First of all, I hate when people say Happy Monday. Moving on. This is the final part in my much-anticipated trilogy (Here is Part 1, and here is Part 2). The Internet and its history are cool, but I miss literature. For my next post I’m going to write a strange experimental book review of something super arty like Taipei, I think.
In the 1980s, MIT computer scientist (and later MacArthur Fellow) Richard Stallman, along with many other university computer scientists, did the majority of his computing on an open operating system (OS) called Unix. Unix had been developed at Bell Labs beginning in 1969. It was “open” in the sense that anybody could see its source code and in the sense that it was essentially free for universities to use. Because it cost little and its source code was available, that early generation of computer scientists could modify and extend it at will. This naturally produced an environment conducive to growth and academic collaboration.
But when AT&T was famously broken up under antitrust law in 1984, the company became free to make its Unix operating system a proprietary product. Unix became closed: the source code was sealed off, and universities and individuals had to pay to use it, much like the Windows operating system today.
Stallman, like many of his colleagues, felt undermined and betrayed. At that point in its still-young history, computer science was a close-knit academic community used to easy and constant collaboration, and maintaining that community depended on an open OS like Unix. In an ambitiously rebellious move, Stallman started the GNU Project. GNU is a recursive acronym that stands for “GNU’s Not Unix.” The project’s goal was to build, from the ground up, a Unix-like OS that was completely free, both in the sense that it cost nothing to use and in the sense that its source code was visible. Building an OS, however, is no easy feat. Fast-forward to 1991: nearly all the groundwork had been laid, but the project still lacked a kernel. A kernel is essentially, in Lawrence Lessig’s words, “the heart of an OS.”
Enter Linus Torvalds, then an undergraduate at the University of Helsinki. Torvalds wrote a very primitive kernel (independently, as it happens, though it filled exactly the gap the GNU Project had left) and posted it online. What happened next became the stuff of legend. Engineers from all over the world responded to Torvalds’ post immediately. The original kernel was, as noted above, very primitive, and it needed lots of work. Engineers would add to it and send their additions back to Torvalds, who would incorporate the changes he thought best and re-post the updated kernel, and engineers would respond again. This back-and-forth went on for several years. The remarkable fact of the phenomenon is that nobody received, or expected to receive, payment for their work, Torvalds included. The work truly occurred outside the capitalist marketplace, and by 1996 a robust OS had been created. That OS is now called Linux (a mash-up of Linus and Unix); or, more historically accurately, GNU/Linux. It is often cited as the first major proof that “crowd-sourcing” – or, to use Harvard Law professor and digital activist Yochai Benkler’s term, “peer production” – works better than anyone could have imagined.
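The “additions” engineers mailed back were patches: line-by-line diffs against Torvalds’ last release, which he could then apply or reject. A minimal sketch of what such a diff looks like, using Python’s standard difflib (the file name and the one-line “fix” are invented for illustration):

```python
import difflib

# Hypothetical buggy snippet from an early kernel release,
# and a contributor's one-line fix.
original = ["int ticks = 0;\n", "void timer(void) { ticks--; }\n"]
fixed    = ["int ticks = 0;\n", "void timer(void) { ticks++; }\n"]

# A unified diff is the compact "here is my change" format
# contributors mailed back to the maintainer.
patch = "".join(difflib.unified_diff(
    original, fixed,
    fromfile="a/timer.c", tofile="b/timer.c",
))
print(patch)
```

The printed patch records only the changed lines (`-` for removed, `+` for added) plus a little context, which is what made it practical to trade thousands of these over early-90s email.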
The original document describing the Linux phenomenon was Eric Raymond’s article-turned-book The Cathedral & The Bazaar, which has since become The Bible of Open Source. “Open Source,” already briefly referred to above, means two things: 1) the source code must be open, so future engineers can see how the software functions; and 2) the license must allow anyone to redistribute and modify the software freely. Redistribution is the important distinction here, because the original distribution can come at a price. For example, the company Red Hat, the largest distributor of Open Source software such as GNU/Linux, has built an extremely successful public company around it. But anyone who receives the software is free to hack the source code and pass the modified version along, and under copyleft licenses like the GNU GPL those modifications must remain under the same open terms. (N.B. the term “hack” here is not used in the pejorative sense, which connotes illegal activity. Rather, it simply means to modify a line of existing code.)
In The Cathedral & The Bazaar, Raymond gives an insider’s account of why the Linux project worked so well. Along the way he offers a set of maxims, or guidelines, meant to demonstrate why and how an Open Source piece of software can be effectively crowd-sourced. The most important maxim, which has proliferated in tech-related articles on the Web and elsewhere, is: “Given enough eyeballs, all bugs are shallow.” This is now known as “Linus’s Law.” What Raymond means is that when thousands of engineers are slogging through a buggy piece of software, even the most covert buggy line of code will surface rather quickly. The more people who are debugging, the faster a bug will present itself, and so the faster a fix will be written.
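There is a simple back-of-the-envelope way to see why the eyeball count matters so much. (This toy model is my own illustration, not Raymond’s.) If each reviewer independently spots a given bug with probability p, the chance that n reviewers all miss it is (1 − p)ⁿ, which collapses quickly as n grows:

```python
# Toy model of "Linus's Law": assume each reviewer independently
# spots a given bug with probability p. The bug survives only if
# every one of n reviewers misses it, i.e. with probability (1-p)**n.
def prob_bug_survives(p: float, n: int) -> float:
    """Probability a bug escapes all n independent reviewers."""
    return (1 - p) ** n

# Even weak individual odds become overwhelming in aggregate:
print(prob_bug_survives(0.05, 1))     # one reviewer: 0.95
print(prob_bug_survives(0.05, 100))   # a hundred reviewers: under 1%
print(prob_bug_survives(0.05, 1000))  # a thousand: effectively zero
```

Real reviewers are of course not independent, and bugs are not equally visible to everyone, but the exponential shape of the curve is the intuition behind “enough eyeballs.”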
Raymond uses the “all bugs are shallow” maxim to distinguish between the cathedral and the bazaar. He writes, “In the cathedral-builder view of programming, bugs and development problems are tricky, insidious, deep phenomena.” By “cathedral-builder” he means a top-down hierarchy, where a lead writer calls all the shots and surrounds herself with lackeys, essentially, who do her bidding. This is in contrast to the “bazaar view”: “In the bazaar view…you assume that bugs…turn shallow pretty quickly when exposed to a thousand eager co-developers pounding on every single new release.” The bazaar is the decentralized, individually motivated world of peer production.
Also, just in case you didn’t see it, check out this awesome Times article by one of my favorite tech writers, Jaron Lanier.