The final exam will be on May 1, 2018 at 2 p.m. That’s 2018-05-01T14:00:00-07:00.
The final exam is a wonderful chance to bring together all the material from the semester and review it as a coherent whole.
Begin your review by checking your progress on the following low-level learning outcomes. Participants in this class should be able to:
- Explain the difference between hardware and software.
- Explain the difference between system software and application software.
- Explain what it means to encode and decode.
- Provide examples of how multiple interpretations of the same information unit give rise to different meanings.
- Explain the importance of binary representations of information.
- “Count” in binary and hex.
- Add and subtract in binary and hex.
- Recite from memory the powers of 2 from $2^0$ to $2^{16}$.
- Estimate the values of powers of 2 from $2^{17}$ to $2^{60}$.
- Convert between unsigned decimal, signed decimal, hex, and binary representations of $n$-bit words.
- Recite from memory the ranges of values representable under both unsigned and signed interpretations of 4-, 8-, and 16-bit words, and estimate the ranges for 32- and 64-bit words.
- Compute saturated and modular sums.
- Determine when carry and overflow occur in modular sums.
- Express values using kibi, mebi, gibi (and similar) prefixes.
- Encode and decode floating-point values in the 32-bit IEEE-754 representation.
- Explain in detail the difference between normalized and denormalized floating-point values.
- Explain the difference between characters, glyphs, and grapheme clusters.
- Show how certain grapheme clusters (including but not limited to emoji) are constructed from characters.
- Know what Unicode normalization forms are, and how to look up details about them.
- Produce UTF-8, UTF-16 (LE and BE), and UTF-32 (LE and BE) encodings of codepoints.
- Determine if a byte sequence is legal UTF-8.
- Determine if a byte sequence is legal UTF-16 (LE or BE).
- Explain the difference between big endian and little endian.
- Convert statements of logic involving AND, OR, and NOT (and other operators) into logic notation.
- Sketch gate-level representations of logic formulas.
- Generate truth tables for logic formulas.
- Rewrite logic formulas into equivalent formulas, particularly into an all-NAND formulation.
- Use logic formulas to carry out certain arithmetic tasks.
- Explain the difference between assembly and machine language.
- Write assembly language fragments for the simple computer, particularly for code involving variables, conditionals, and loops.
- Explain, via examples, how code and data are indistinguishable at the machine language level.
- Give an example of a program in which data is interpreted as code and vice versa.
- Translate, by hand, between assembly and machine language on the simple computer.
- Write simple Python scripts with variables, conditionals, loops, functions, numbers, strings, and lists.
- Write simple Python scripts with command line arguments.
- Write and execute simple C programs.
- Use printf in C.
- Write your own functions in C, using arrays, strings, and pointers.
- Explain malloc and free in C.
- Explain why C arrays aren’t like arrays in other languages.
- Process strings in C by processing the UTF-8 encoding.
- Read from standard input and write to standard output in C.
- Explain why “strings are hard” not just in C, but in systems programming in general.
- Write a hello-world program in NASM.
- Understand roughly where this class fits into a CS curriculum.
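The arithmetic outcomes above (counting in binary and hex, two’s complement, modular sums, carry and overflow) can be exercised with a short Python sketch. The word width and helper names here are my own, not from the course materials:

```python
WIDTH = 8  # work in 8-bit words

def to_signed(x, width=WIDTH):
    """Interpret an unsigned width-bit value as two's complement."""
    return x - (1 << width) if x >= (1 << (width - 1)) else x

def modular_add(a, b, width=WIDTH):
    """Return (sum mod 2^width, carry flag, signed-overflow flag)."""
    raw = a + b
    result = raw % (1 << width)
    carry = raw >= (1 << width)                # unsigned sum didn't fit
    overflow = to_signed(a) + to_signed(b) != to_signed(result)  # signed sum didn't fit
    return result, carry, overflow

print(format(0xB7, '08b'))      # render a hex value in binary: 10110111
print(modular_add(0x7F, 0x01))  # 127 + 1: no carry, but signed overflow
```

Running small cases like `0x7F + 0x01` by hand and checking against this helper is a good way to practice spotting carry versus overflow.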
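For the IEEE-754 outcome, Python’s standard `struct` module can pack a float into the 32-bit representation so you can check your hand-encoded sign, exponent, and fraction fields. The function name is my own:

```python
import struct

def float_to_bits(x):
    """Pack a float into 32-bit IEEE-754 and return the bit pattern as an int."""
    (bits,) = struct.unpack('>I', struct.pack('>f', x))
    return bits

bits = float_to_bits(1.0)
sign = bits >> 31            # 1 bit
exponent = (bits >> 23) & 0xFF  # 8 bits, biased by 127
fraction = bits & 0x7FFFFF   # 23 bits
print(sign, exponent, fraction)  # 0 127 0: 1.0 = (-1)^0 * 1.0 * 2^(127-127)
```

A denormalized value is one whose exponent field is all zeros with a nonzero fraction; try tiny values like `1e-45` through this function and decode the fields yourself.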
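For the Unicode outcomes, Python’s built-in codecs produce the standard encodings directly, which makes it easy to check the UTF-8 and UTF-16/32 byte sequences you work out by hand. The legality checker below simply leans on Python’s strict UTF-8 decoder; the helper name is my own:

```python
cp = '\u00E9'  # é, U+00E9

print(cp.encode('utf-8'))      # b'\xc3\xa9'
print(cp.encode('utf-16-le'))  # b'\xe9\x00'
print(cp.encode('utf-16-be'))  # b'\x00\xe9'
print(cp.encode('utf-32-be'))  # b'\x00\x00\x00\xe9'

def is_legal_utf8(data):
    """Check whether a byte sequence decodes as UTF-8."""
    try:
        data.decode('utf-8')
        return True
    except UnicodeDecodeError:
        return False

print(is_legal_utf8(b'\xc3\xa9'))  # True
print(is_legal_utf8(b'\xc3'))      # False: truncated two-byte sequence

# A grapheme cluster built from multiple codepoints: thumbs-up + skin-tone modifier
print('\U0001F44D\U0001F3FD')
```

Note that the LE/BE pairs differ only in byte order, which is also a concrete way to review endianness.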
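For the logic outcomes, generating truth tables programmatically lets you verify that a rewritten formula (for example, an all-NAND formulation) is equivalent to the original. This sketch and its names are my own, not from the course:

```python
from itertools import product

def truth_table(f, n):
    """Return (inputs, output) rows for an n-input boolean function."""
    return [(bits, f(*bits)) for bits in product([0, 1], repeat=n)]

def nand(a, b):
    return 1 - (a & b)

# AND rewritten using only NAND: a AND b == NAND(NAND(a, b), NAND(a, b))
def and_via_nand(a, b):
    t = nand(a, b)
    return nand(t, t)

print(truth_table(lambda a, b: a & b, 2) == truth_table(and_via_nand, 2))  # True
```

The same pattern works for checking OR and NOT in their NAND forms, or any other rewriting you attempt.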
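As a warm-up for the Python scripting outcomes, here is a minimal script of my own devising that uses a function, a loop, a list, and command-line arguments:

```python
import sys

def describe_args(argv):
    """Pair each command-line argument (after the program name) with its index."""
    return [f"{i}: {arg}" for i, arg in enumerate(argv[1:], start=1)]

if __name__ == '__main__':
    for line in describe_args(sys.argv):
        print(line)
```

Run it as, say, `python3 args.py foo bar` and confirm the output matches what you expect from `sys.argv`.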
Go over all of the assigned readings for the course.
If you like, review some of the course lectures on YouTube.
Review all worksheets.
Review the answers to homework and exam questions, particularly for those questions you answered incorrectly.
Do more practice problems.
Come to course reviews to be held Monday, April 30, from 11:30-12:30 and Tuesday, May 1, from 12-1.
Use the course Slack channel to post questions, and answer each other’s questions.
Be prepared for short answer problems.