Computing History
Computer Science has a long history. History is empowering and fun. We build on our past. So let’s learn where computing comes from, why we do it, and what we can do with it.
Pop Culture
[Computing is] complete pop culture. The big problem ... [is that] the electronic media we have is so much better suited for transmitting pop-culture content than it is for high-culture content. I consider jazz to be a developed part of high culture. [Jazz aficionados take deep pleasure in knowing the history of jazz, but] pop culture holds a disdain for history. Pop culture is all about identity and feeling like you're participating. It has nothing to do with cooperation, the past, or the future—it's living in the present. I think the same is true of most people who write code for money. They have no idea where [their culture came from]. —Alan Kay
Woah. Let’s do something about this.
A Brief History
Read Gabriel Robins’ A Brief History of Computing. It’s the best one out there.
Another Brief History
Here are some important events, people, works, and machines to be aware of.
In ancient times
Since antiquity, humans have been marking, matching, comparing, tallying, counting, measuring, and reckoning.
A big step
Early inventories featured pictorial markings like 🌽🌽🌽🌽🌽 🍓🍓🍓. Then someone figured out they could save space by writing something like 5🌽 3🍓. An incredibly powerful idea.
Numerals!
Recipes for calculation
Recipes, or lists of instructions, for manipulating quantities and dimensions were sometimes recorded. Notable examples include:
Enter the machines
Naturally, mechanical devices were created to carry out (and speed up!) calculations. Examples:
- Abacus (earliest known from Sumeria 2700–2300 BC)
- Astrolabe (~200 BC)
- Antikythera mechanism (~200 BC)
- Slide Rule (1620s)
- Pascal’s calculators (1640s)
- Jacquard Loom (1804)
- Babbage’s Difference Engine (1820s) and Analytical Engine (1837)
- Hollerith’s Tabulating Machine (1880s)
The Analytical Engine is historically fascinating. Although never built, it was perhaps the world’s first design for a programmable, general-purpose computer. Several programs are known to have been written for the device, the most famous of which—a program for the computation of Bernoulli numbers—was presented by Babbage’s collaborator Ada Lovelace in her translator’s notes to Luigi Menabrea’s article in Taylor’s Scientific Memoirs in 1843. (Also see this shorter summary.) Lovelace is often considered the world’s first programmer. She is also widely admired as perhaps the first to recognize the wide application of computing beyond arithmetic, writing that the Analytical Engine
might act upon other things besides number, were objects found whose mutual fundamental relations could be expressed by those of the abstract science of operations, and which should be also susceptible of adaptations to the action of the operating notation and mechanism of the engine.... Supposing, for instance, that the fundamental relations of pitched sounds in the science of harmony and of musical composition were susceptible of such expression and adaptations, the engine might compose elaborate and scientific pieces of music of any degree of complexity or extent.
Mathematicians get involved
The 1800s brought intense work on formal logic and the foundations of mathematics. Wild times! Logic broke free of Aristotelian syllogisms. Cantor introduced us to the paradise of multiple infinities. And we got a lot of paradoxes! Three major movements emerged: the logicists (math is just logic), the intuitionists (math is only what we can invent and construct or demonstrate), and the formalists (math is done by manipulating symbols according to rules). Major results of this era include:
- Boole’s Boolean Algebra (1847)
- Frege’s Begriffsschrift (1879)
- Peano’s Axioms (1889)
- Russell and Whitehead’s Principia Mathematica (1910–1913)
- Hilbert’s Program (early 1920s)
Computation gets formalized
Hilbert asked for a logically consistent and complete formalization of all of mathematics, which Gödel proved impossible in 1931. He also asked for an effective, finite decision procedure for all statements of mathematics (a question which became known as the Entscheidungsproblem). He thought such a procedure existed. Others did not. To prove him wrong, one needed to formalize the very idea of an algorithm. The three main approaches of the 1930s were:
- Gödel’s (and Herbrand’s and Kleene’s) general recursive functions
- Church’s Lambda Calculus
- Turing’s machines
Church and Turing both showed the Entscheidungsproblem had no solution. Poor Hilbert.
There’s a fun story about Church, Gödel, and Turing that we will cover in class.
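Church’s approach, the Lambda Calculus, builds all of computation from nothing but function definition and function application. As a taste, here is a sketch of the standard Church-numeral encoding written in Python; the names (`zero`, `succ`, `add`, `to_int`) are our own labels for these well-known encodings, not part of any library.

```python
# Church numerals: the number n is "apply a function f, n times."
zero = lambda f: lambda x: x                      # apply f zero times
succ = lambda n: lambda f: lambda x: f(n(f)(x))   # apply f one more time
add  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(n):
    """Decode a Church numeral by counting how many times f is applied."""
    return n(lambda k: k + 1)(0)

two = succ(succ(zero))
three = succ(two)
print(to_int(add(two)(three)))  # 5
```

Everything here is a function; there are no built-in numbers in sight until we decode at the very end. That is the point: arithmetic (and, it turns out, all computation) can be reduced to function application alone.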
Turing
Turing’s paper is one of the most impactful and significant of all time.
- He gave a formalization of effective computation (algorithms) in a way people could all understand
- He gave evidence as to why his formalization, now called the Turing Machine, coincides with our intuitive notion of “computable” (the Church-Turing Thesis).
- He introduced computational universality: programs were just data (ok so far, not surprising) and, wait for it, a single machine could “run” any other machine (!!!). Before this work, it was not known (though it may have been suspected by some) that a truly universal computer could exist—separate machines were thought to be necessary for different tasks.
- He showed that some numbers were not computable, some functions were not computable, and that some questions could not be mechanically decided—and hence, that the Entscheidungsproblem had no solution.
Details coming soon
We’ll study Turing’s paper, Turing machines, and the notion of universal computation in detail later in this course.
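In the meantime, the formalism itself is tiny: a table of transitions, a tape, and a head. Here is a minimal Turing machine simulator; the `run` helper and the bit-flipping machine below are toy examples of our own, not Turing’s original presentation.

```python
# A minimal Turing machine simulator. The transition table maps
# (state, symbol read) -> (next state, symbol to write, move L/R).
def run(program, tape, state="start", pos=0, blank="_", halt="halt"):
    tape = dict(enumerate(tape))          # sparse dict = two-way-infinite tape
    while state != halt:
        symbol = tape.get(pos, blank)
        state, write, move = program[(state, symbol)]
        tape[pos] = write
        pos += 1 if move == "R" else -1
    cells = [tape[i] for i in sorted(tape)]
    return "".join(cells).strip(blank)

# A machine that flips every bit, then halts at the first blank.
flip = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
    ("start", "_"): ("halt",  "_", "R"),
}
print(run(flip, "1011"))  # 0100
```

Note that the machine is just data (the `flip` dictionary) fed to another machine (`run`). That is universality in miniature: one simulator can “run” any transition table you hand it.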
The rise of the machines
Turing's Universal Machine (1936) opened up a flood of new work in general purpose machines. Wikipedia’s History of Computing Hardware page covers both ancient devices and a fair number of machines from the 1930s–1960s. Worth a look to see how we eventually got to the electronic devices that are ubiquitous today.
Programming Languages
von Neumann once angrily asked someone writing an assembler, “Why would you ever want more than machine language?” Today that question sounds silly. Even assembly language is too low-level. One of the biggest proponents of high-level languages was Grace Hopper, who also happened to write the first compiler. Here are some notable high-level languages:
- Fortran, COBOL, LISP
- Algol, PL/1, Simula
- APL, J, K, Q
- CLU, Alphard, Euclid, Mesa, Cedar, Turing, Ada
- BCPL, B, C, C++, D, Rust, Zig, Odin, Swift
- Java, C#, Scala, Kotlin
- Perl, PHP, Hack, Tcl, Python, Ruby, Groovy, Lua
- Simula, Smalltalk, Object Pascal, Eiffel
- Concurrent Pascal, Occam, Linda, SR, Erlang, Elixir, Parasail, Chapel, Go
- Self, JavaScript, ActionScript, CoffeeScript, TypeScript
- Standard ML, CAML, OCaml, F#, Elm, Reason
- Miranda, Haskell, PureScript, Idris
- Prolog, Mercury, Oz, Curry
- Agda, Coq, Lean
- Snobol, Icon
- Id, Val, Sisal
- SQL, Cypher, SPARQL
- Postscript
- MATLAB, R, Julia, Mojo
- Intercal, Unlambda, Brainfuck, Malbolge, Befunge, ZT, Java2K, LOLCODE
- GolfScript, Pyth, CJam, Jelly, 05AB1E, Stax
Exercise: The languages above are informally grouped. Try to determine, for each line, which characteristic is shared by the languages on that line. You’ll likely need to do a lot of research, so don’t spend too much time on this. Just focus on the lines that may interest you the most.
Language Implementation
Languages must not only be expressive; they must also be efficiently implementable. This is why we study compilers and interpreters. Two major parts of this field are:
- Linguistic Theories (from Chomsky and others) that enable parsing to be efficient
- Optimizations (Frances Allen was a major figure in this area)
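As a taste of the Chomsky connection: a context-free grammar maps almost line-for-line onto a recursive-descent parser, with one function per grammar rule. The tiny arithmetic grammar below is our own illustrative example.

```python
# Grammar (one parsing function per rule):
#   expr   -> term (('+' | '-') term)*
#   term   -> factor (('*' | '/') factor)*
#   factor -> NUMBER | '(' expr ')'
import re

def tokenize(s):
    return re.findall(r"\d+|[-+*/()]", s)

def parse(tokens):
    pos = 0
    def peek():
        return tokens[pos] if pos < len(tokens) else None
    def eat():
        nonlocal pos
        pos += 1
        return tokens[pos - 1]
    def factor():
        if peek() == "(":
            eat()            # consume '('
            v = expr()
            eat()            # consume ')'
            return v
        return int(eat())
    def term():
        v = factor()
        while peek() in ("*", "/"):
            v = v * factor() if eat() == "*" else v / factor()
        return v
    def expr():
        v = term()
        while peek() in ("+", "-"):
            v = v + term() if eat() == "+" else v - term()
        return v
    return expr()

print(parse(tokenize("2*(3+4)")))  # 14
```

Because the grammar avoids ambiguity and left recursion, the parser runs in a single left-to-right pass—exactly the kind of efficiency the linguistic theory makes possible.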
Human-centric computing
People create programs for people. Areas with people-focus include:
- Graphics and user interfaces
- Computing in education
- Libraries for creative computing
Modern Trends
Trends in the modern computing era:
- Personal computers (1970s)
- Internet (1980s)
- Web (1990s)
- Mobile (2000s)
- Cloud (2010s)
- AI (esp. Generative AI)
- Quantum
Big Ideas
Here are six themes visited above that are worth committing to your long-term memory:
- Symbols and symbol manipulation amplify our thought processes.
- The computable functions comprise an infinitesimal fraction of all the functions that exist.
- Theoretical computing devices (such as Turing Machines) are important for showing the limits of computation, but are not suitable for real computation.
- Universal computation is far, far from obvious, but we take it for granted today.
- Modern computers speed up computation to such an extent that what we know and what we can know is qualitatively different from what we could know before computing.
- Programming languages enable the expression of computations at such a high level that creative and impactful computing is accessible to many.
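The second idea above can be made precise with a counting argument: every program is a finite string over a finite alphabet Σ, so there are only countably many programs, yet there are uncountably many functions from the naturals to the naturals:

```latex
|\{\text{programs}\}| \;\le\; |\Sigma^{*}| \;=\; \aleph_0,
\qquad
|\{\,f : \mathbb{N} \to \mathbb{N}\,\}| \;=\; \aleph_0^{\aleph_0} \;=\; 2^{\aleph_0} \;>\; \aleph_0.
```

So almost every function has no program that computes it.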
Other References
For more on computing history, don’t miss Wikipedia’s History and Timeline pages, and the fabulous book The Annotated Turing by Charles Petzold.
Recall Practice
Here are some questions useful for your spaced repetition learning. Many of the answers are not found on this page. Some will have popped up in lecture. Others will require you to do your own research.
- What is the significance of numerals?
- Where and when were abaci first known to be used?
- What could the Antikythera mechanism do?
- What are some of the algorithms from the famous Babylonian tablets?
- What were some of the accomplishments of Al-Khwārizmī?
- What was Ada Lovelace’s published program in her famous Notes?
- What was Lovelace most known for besides the first published program?
- Was the analytical engine ever built?
- What did the logicists think math was? What did the formalists think it was? The intuitionists?
- Who was George Boole?
- What was Hilbert's Program?
- What was the Entscheidungsproblem?
- How did Church formalize the notion of effective computability?
- How did Turing formalize the notion of effective computability?
- What was Gödel's initial reaction to Church’s Lambda Calculus?
- When did Gödel finally accept the Lambda Calculus as a model for effective computability?
- Name four stunning achievements of Turing’s famous paper.
- Why was the ENIAC famous?
- Who wrote the first compiler and what was it called?
- Why are programming languages important?
- Why is LISP so loved?
- Why is Ruby so loved?
- Why is CLU so significant?
- What else is Barbara Liskov known for?
- What were Noam Chomsky’s contributions to computer science?
- What is Frances Allen known for?
- What year did Allen win the Turing Award?
- What even is Generative AI?
Summary
We’ve covered:
- Earliest notions of computing
- Numbers as abstractions of quantities
- Numerals as representations of numbers
- Earliest known recordings of recipes for calculations
- Early computing machines
- Influence of the analytical engine
- Ada Lovelace's insights
- Math in the late 1800s
- The formalization of computing
- Undecidability
- Human-centric computing
- The modern era in computing