Computing History

Computer Science has a long history. History is empowering and fun. We build on our past. So let’s learn where computing comes from, why we do it, and what we can do with it.

Pop Culture

[Computing is] complete pop culture. The big problem ... [is that] the electronic media we have is so much better suited for transmitting pop-culture content than it is for high-culture content. I consider jazz to be a developed part of high culture. [Jazz aficionados take deep pleasure in knowing the history of jazz, but] pop culture holds a disdain for history. Pop culture is all about identity and feeling like you're participating. It has nothing to do with cooperation, the past, or the future—it's living in the present. I think the same is true of most people who write code for money. They have no idea where [their culture came from]. —Alan Kay

Whoa. Let’s do something about this.

A Brief History

Read Gabriel Robins’ A Brief History of Computing. It’s the best one out there.

Another Brief History

Here are some important events, people, works, and machines to be aware of.

In ancient times

Since antiquity, humans have been marking, matching, comparing, tallying, counting, measuring, and reckoning.

A big step

Early inventories featured pictorial markings like 🌽🌽🌽🌽🌽 🍓🍓🍓. Then someone figured out they could save space by writing something like 5🌽 3🍓. An incredibly powerful idea.

Numerals!

brahmi.jpg

Recipes for calculation

Recipes, or lists of instructions, for manipulating quantities and dimensions were sometimes recorded. Notable examples include:

babylon_tablet.png

rhind.jpeg

alkhwarizmi.png

Enter the machines

Naturally, mechanical devices were created to carry out (and speed up!) calculations. Examples:

antikythera.jpeg

jaquard.png

diffengine.jpeg

ada_lovelace.png

The Analytical Engine is historically fascinating. Although never built, it was perhaps the world’s first programmable, general-purpose computer. Several programs are known to have been written for the device; the most famous, a program for computing Bernoulli numbers, was presented by Babbage’s collaborator Ada Lovelace in her translator’s notes to Luigi Menabrea’s article in Taylor’s Scientific Memoirs in 1843. (Also see this shorter summary.) Lovelace is often considered the world’s first programmer. She is also widely admired as perhaps the first to recognize that computing could extend far beyond arithmetic, writing that the Analytical Engine

might act upon other things besides number, were objects found whose mutual fundamental relations could be expressed by those of the abstract science of operations, and which should be also susceptible of adaptations to the action of the operating notation and mechanism of the engine.... Supposing, for instance, that the fundamental relations of pitched sounds in the science of harmony and of musical composition were susceptible of such expression and adaptations, the engine might compose elaborate and scientific pieces of music of any degree of complexity or extent.
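Lovelace’s Bernoulli-number program can be echoed in modern code. Here’s a sketch (not her original algorithm, which worked through a specific series formula) that computes Bernoulli numbers using the standard recurrence with exact rational arithmetic:

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return the list [B_0, B_1, ..., B_n] of Bernoulli numbers,
    using the recurrence sum_{j=0}^{m} C(m+1, j) * B_j = 0 for m >= 1,
    with B_0 = 1 (the convention where B_1 = -1/2)."""
    B = [Fraction(1)]
    for m in range(1, n + 1):
        s = sum(comb(m + 1, j) * B[j] for j in range(m))
        B.append(-s / (m + 1))
    return B

print(bernoulli(6))  # B_2 = 1/6, B_4 = -1/30, B_6 = 1/42
```

Exact fractions matter here: floating point would quickly obscure the clean rational values that Lovelace’s notes tabulated by hand.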

Mathematicians get involved

The 1800s saw intense work on formal logic and the foundations of mathematics. Wild times! We saw logic break free of Aristotelian syllogisms. Cantor introduced us to the paradise of multiple infinities. And we got a lot of paradoxes! We saw three major movements: the logicists (math is just logic), the intuitionists (math is only what we invent and can construct or demonstrate), and the formalists (math is done by manipulating symbols according to rules). Major results of this era include:

georgeboole.jpg

peano.jpeg

davidhilbert.jpg

bertrandrussell.png

Computation gets formalized

Hilbert asked for a logically consistent and complete formalization of all of mathematics, which Gödel proved impossible in 1931. Hilbert also asked for an effective, finite decision procedure for all statements of mathematics (a problem that became known as the Entscheidungsproblem). He thought such a procedure existed; others did not. To prove him wrong, one first needed to formalize the very idea of an algorithm. The three main approaches of the 1930s were:

alonzochurch.png

kurtgodel.jpg

alanturing.jpg

Church and Turing both showed the Entscheidungsproblem had no solution. Poor Hilbert. There’s a fun story about Church, Gödel, and Turing that we will cover in class.
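Church’s approach, the lambda calculus, builds everything, even numbers, out of functions. A taste of the idea, sketched here in Python using its own lambda syntax (Church numerals encode n as “apply f, n times”):

```python
# Church numerals: the number n is the function that applies f to x, n times.
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))          # n + 1
add  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))  # m + n

def to_int(n):
    """Decode a Church numeral by counting applications of f."""
    return n(lambda k: k + 1)(0)

two = succ(succ(zero))
three = succ(two)
print(to_int(add(two)(three)))  # 5
```

The remarkable claim of the Church-Turing thesis is that this purely functional formalism computes exactly the same things as Turing’s machines.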

Turing

Turing’s paper is one of the most significant and influential ever written.

Details coming soon

We’ll study Turing’s paper, Turing machines, and the notion of universal computation in detail later in this course.

The rise of the machines

Turing's Universal Machine (1936) opened a flood of new work on general-purpose machines. Wikipedia’s History of Computing Hardware page covers both ancient devices and a fair number of machines from the 1930s–1960s. It’s worth a look to see how we eventually arrived at the electronic devices that are ubiquitous today.

eniac.png

Programming Languages

von Neumann once angrily asked someone writing an assembler, “Why would you ever want more than machine language?” Today that question sounds silly; even assembly language is too low-level. One of the biggest proponents of high-level languages was Grace Hopper, who also wrote the first compiler. Here are some notable high-level languages:

gracehopper.jpeg

johnmccarthy.jpeg

barbaraliskov.jpeg

matz.jpeg

Exercise: The languages above are informally grouped. Try to determine, for each line, which characteristic is shared by the languages on that line. You’ll likely need to do a lot of research, so don’t spend too much time on this. Just focus on the lines that may interest you the most.

Language Implementation

Languages must not only be expressive; they must also be efficiently implementable. This is why we study compilers and interpreters. Two major parts of this field are:

noamchomsky.jpeg

franallen.jpeg

Human-centric computing

People create programs for people. People-focused areas include:

alankay.jpeg

adelegoldberg.jpeg

bretvictor.jpeg

laurenleemccarthy.jpeg

Exercise: Read about Douglas Engelbart’s Mother of All Demos (1968).

Modern Trends

Trends in the modern computing era:

vintcerf.png

first-www-page.jpeg

jobs-iphone.jpg

quantum-computer.jpeg

Big Ideas

Here are six themes visited above that are worth committing to your long-term memory:

Other References

For more on computing history, don’t miss Wikipedia’s History and Timeline pages, and the fabulous book The Annotated Turing by Charles Petzold.

annotatedturing.jpeg

Recall Practice

Here are some questions useful for your spaced repetition learning. Many of the answers are not found on this page. Some will have popped up in lecture. Others will require you to do your own research.

Summary

We’ve covered:

  • Earliest notions of computing
  • Numbers as abstractions of quantities
  • Numerals as representations of numbers
  • Earliest known recordings of recipes for calculations
  • Early computing machines
  • Influence of the analytical engine
  • Ada Lovelace's insights
  • Math in the late 1800s
  • The formalization of computing
  • Undecidability
  • Human-centric computing
  • The modern era in computing