
Publication | Legaltech News

Nervous System: Formulae for Success

David Kalat

February 2, 2023

David Kalat writes about the unexpected story of a mathematician and computer scientist who had to convert a treatise on computer programming from traditional typesetting to digital form, revolutionizing software in the process.


In the 1970s, a seminal figure in the field of computer science set out on an old-media venture of writing a book. As it happened, the disruptions that the Computer Age was bringing to the world of publishing were evident even back then, and they posed an unexpected challenge for this legendary scientist. His solution helped create the notion of freeware and became the industry standard for scientific publishing used to this day.

Donald Knuth distinguished himself as a mathematician and computer scientist from an early age. As an eighth grader, he was already devising sophisticated problem-solving algorithms. As a physics student in college, he rewrote the assembly and compiler code for the school’s computer. When he completed his bachelor’s degree in mathematics at what is now Case Western Reserve University, the administration judged his work so outstanding that it awarded him a master’s degree as well.

While pursuing a PhD at Caltech, Knuth decided to write a scholarly study of algorithms, titled The Art of Computer Programming. He began writing the manuscript by hand, anticipating the finished product would be a twelve-chapter monograph. When he approached the publisher in 1965 with his completed draft, however, he discovered that, ironically for a math genius, he had sorely miscounted. Whereas Knuth had been under the misapprehension that five of his handwritten pages would correspond to a single printed page, he had in fact written enough material to encompass several very large volumes. The Art of Computer Programming morphed from a single book to a seven-volume set, of which four parts have been released to date, and Knuth continues to work on additional installments.

The process of converting Knuth’s handwritten text to a printed commercial item introduced some unexpected challenges—and equally unexpected consequences.

Historically, typesetting involved intermediaries who took the author’s work and composed it for the printing press. These choices—of font selection, page breaks, the amount of white space on the page, and so on—have communicative and creative aspects, which traditionally were beyond the reach of the author. In the case of a specialist text like Knuth’s The Art of Computer Programming, the need to render complex mathematical formulae in readable form posed additional considerations.

For the first printing of his books, the publisher attended to these concerns. The principal typesetting technology of the 1960s and early 1970s involved injecting molten lead into molds to cast “slugs” that pressed ink onto paper. The advent of the computer age was rendering this approach obsolete; The New York Times, a famous holdout, kept its hot metal typesetting in service until July 2, 1978. The future belonged to photographic and digital typesetting.

When Knuth prepared a new edition of one of the volumes in 1977, he found that the old typesetting method was no longer in use and that the fonts used for the previous printing were no longer available. He was obliged to have the book re-typeset, with all its complex algorithmic equations. The publisher produced galley proofs for him using replacement fonts, and Knuth was bitterly disappointed with the results.

He was not the first to face this problem, of course, and jerry-rigged solutions already existed. The computer scientists at Bell Labs had been developing typesetting software for years, and their UNIX-based “troff” software, together with its eqn preprocessor, could render mathematical formulae.

Knuth was not satisfied, though. He felt that troff was a graceless set of kludges and that a more elegant answer was possible. If anyone knew the contours of the problem, and the possible solutions, it was Knuth. He was not only a gifted algorithmic thinker but also the son of a commercial printer. So Knuth set out to create his own typesetting software from scratch.

The result was TeX. As a purpose-built solution devised by a bona fide genius, TeX disintermediated the typesetting field and empowered authors—especially scientific and technical ones—to design their own layouts. Knuth paired TeX with METAFONT, a tool for designing rasterized fonts.
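
To make the shift concrete, the lines below are a small illustrative sketch, not taken from Knuth’s books, of what an author types in plain TeX to set a mathematical formula; the lines beginning with % are comments, and running the file through the tex program produces the typeset page:

    % Minimal plain TeX source (hypothetical example; compile with: tex example.tex)
    \centerline{The sum of the first $n$ positive integers:}
    % The $$ ... $$ markers ask TeX to typeset the formula as a centered display
    $$ \sum_{k=1}^{n} k = {n(n+1) \over 2} $$
    \bye  % end of the job

The point of the sketch is the division of labor: the author writes ordinary text describing the formula’s structure, and TeX makes the compositor’s decisions about spacing, sizing, and placement.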

From the first edition of his book, Knuth had sought the power of what we now call “crowdsourcing,” offering a monetary reward to any reader who identified an error in his books. Knuth brought the same mindset to TeX, offering the software free to all users, with publicly available source code. TeX is arguably among the earliest examples of open-source software or freeware. Other users could not only take advantage of TeX’s versatility and power but also augment and improve it. That universality and robustness have kept TeX in use to this day as a standard throughout academic and scientific disciplines.

 

The views and opinions expressed in this article are those of the author and do not necessarily reflect the opinions, position, or policy of Berkeley Research Group, LLC or its other employees and affiliates.
