You’re not going to find an official definition of mathematics that everyone will agree on, but we can say it involves abstract reasoning on a symbolic level as a way to get at truths or even just explanations of phenomena. Or even just to exercise our brains to go beyond what we can perceive! It definitely involves working with patterns and dealing with quantity, change, structure, chance, causality, and space, so sayeth Wikipedia.
Hmmm, well, no one knows for sure, but there’s a great book on the topic that might be right. If you look at the evolution of life and consciousness you might get some ideas. The earliest life forms that needed to move toward their food probably randomly bumped into food and survived. Perhaps the next step in evolution was some sensing ability to move toward the food. Then lifeforms capable of making “models of their surroundings” so they could swerve around obstacles, and remember where the food was and how to get back there, gained an evolutionary advantage.
I’m no expert, but hey, it does maybe explain why intelligent beings (including humans) can build models. And in some sense, math happens when we make models and reason about them. Everyone can do math!
This question has been asked millions of times. Seriously. Let’s go meta on this question:
the only worthwhile reason to ask "did we invent math or did we discover it?" is so we know who to blame
— sally (@sallycol) September 18, 2021
The answer is probably both!
Whatever your take on this question is, math has proven to be unreasonably effective in the natural sciences.
Math is a big field of study. Here’s how Wikipedia breaks it down (this is just one way, and it’s not going to please everyone):
Branch | Topics |
---|---|
Foundations | Philosophy of Mathematics • Mathematical Logic • Information Theory • Set Theory • Type Theory • Category Theory |
Algebra | Abstract • Commutative • Elementary • Group Theory • Linear • Multilinear • Universal • Homological |
Analysis | Calculus • Real Analysis • Complex Analysis • Differential Equations • Functional Analysis • Harmonic Analysis • Measure Theory |
Discrete | Combinatorics • Graph Theory • Order Theory • Game Theory |
Geometry | Algebraic • Analytic • Differential • Discrete • Euclidean • Finite |
Number Theory | Arithmetic • Algebraic Number Theory • Analytic Number theory • Diophantine Geometry |
Topology | General • Algebraic • Differential • Geometric • Homotopy Theory |
Applied | Control theory • Operations Research • Probability • Statistics |
Another organization of topics can be found at Quanta Magazine’s Map of Mathematics. There’s also a cool Map of Mathematics video by Dominic Walliman, from which this poster can be found:
What follows is a reference (don’t go looking for deep explanations!) of some concepts and notations from math that are useful in theoretical computer science. It’s not meant to be complete by any means. It’s pretty much just a smattering of discrete math with some number theory and set theory and maybe some algebra. Not much. And yes, math is still evolving and there are always variations in notation, but what you’ll find here should be pretty widely accepted.
We said above that mathematics involves “abstract reasoning on a symbolic level.” Breaking that down:
Sounds like symbolic logic, right? Indeed! Math makes use of symbolic logic, featuring formulas made up of variables (that’s the abstract part!) and constants and operators. Variables are so cool. They allow us to make general statements, without worrying about specific entities! The statements can talk about whether they apply to some or all substitutions of entities for variables, or whether they will always, sometimes, etc. apply. Each formula has a truth value based on some interpretation of its variables.
Here are some formulas that are good to know (we use the convention that lowercase letters are variables or constants and uppercase letters are formulas):
Form | Meaning | Technical Term |
---|---|---|
$T$ | True | Truth |
$F$ | False | Falsity |
$\neg A$ | Not $A$ | Negation |
$A \wedge B$ | $A$ and $B$ | Conjunction |
$A \vee B$ | $A$ or $B$, or both | Disjunction |
$A \supset B$ | It is not the case that $A$ is true and $B$ is false (i.e., whenever $A$ is true, $B$ is true) | Material Implication |
$A \equiv B$ | $A$ and $B$ are either both true or both false | Material Equivalence |
$A \rightarrow B$ | If $A$, then $B$ (non-truth-functional) | Conditional |
$\forall x. A$ | For every $x$, $A$ | Universal quantification |
$\exists x. A$ | There exists an $x$ such that $A$ | Existential quantification |
$\iota x. A$ | The $x$ such that $A$ | Description |
$x = y$ | $x$ and $y$ are the same object | Equality |
$\Box A$ | In all possible worlds, $A$ | Necessity |
$\lozenge A$ | In some possible world, $A$ | Possibility |
$\mathbf{P} A$ | It was (at least once) the case that $A$ | Past |
$\mathbf{F} A$ | It will (at least at some point) be the case that $A$ | Future |
$\mathbf{H} A$ | It was always the case that $A$ | Always has been |
$\mathbf{G} A$ | It will forever be the case that $A$ | Always going to be |
$\mathscr{B}_x A$ | Agent $x$ believes $A$ | Belief |
$|A|$ | Truth value of $A$ (in 0.0 ... 1.0) | Numerical Truth Value |
$pr(A)$ | Probability of $A$ being true | Probability |
$pr(A|B)$ | Probability of $A$ being true given that $B$ is true | Conditional Probability |
$E(\sigma)$ | The expected value of doing $\sigma$ | Expectation |
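If you like to tinker, the truth-functional connectives are easy to play with in code. Here’s a little Python sketch (the function names like `implies` are just my own, nothing official), with `all` and `any` standing in for the quantifiers over a finite domain:

```python
# Truth-functional connectives as Python functions, using bool for truth values.

def neg(a): return not a                  # negation
def conj(a, b): return a and b            # conjunction
def disj(a, b): return a or b             # disjunction
def implies(a, b): return (not a) or b    # material implication
def equiv(a, b): return a == b            # material equivalence

# Quantification over a finite domain:
domain = [0, 1, 2, 3]
print(all(implies(x > 1, x > 0) for x in domain))  # ∀x. x>1 ⊃ x>0
print(any(x * x == 4 for x in domain))             # ∃x. x² = 4
```

Of course, quantifiers in math range over arbitrary (often infinite) domains, so this only scratches the surface.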
That was just a start—a pitifully small list of formulas to be sure. My way more extensive notes get into logic in more detail, with coverage of non-bivalent, paraconsistent, fuzzy, non-monotonic and a bunch of other types of logic. They get into metalogical notions of soundness and completeness, so if you are looking for Gödelian stuff, go there.
Okay, so now what exactly are the entities we reason about in mathematics? And how might we...symbolize them?
Folks that have tried to “formalize as much of mathematics as they can” have found two theories to be great candidates: Set Theory and Type Theory. (Three if you count category theory, but that is for another time.) Set theory is more well-known, so let’s see what’s up with sets.
A set is an unordered collection of unique elements, built according to formation rules of a rigorous axiomatic theory. These notes do not cover the rigorous foundations; instead, we will only focus on using sets.
Sets model properties of things. To say that $x$ is a member of the set $S$ means that $x$ has a certain property (the property expressed by the set).
Examples:
Here are some sets that are universally used in the world of math:
This is perhaps the most fundamental notion of set theory.
$x \in A \textrm{ means } x \textrm{ is a member of set } A$ |
$x \notin A \textrm{ means } \neg (x \in A)$ |
Examples:
$3 \in \mathbb{N}$ |
$\textrm{'c'} \in \{\sigma \mid \sigma \textrm{ is a letter of the Roman alphabet}\}$ |
$\textrm{Manitoba} \notin \textrm{AustralianStates}$ |
Not everything you can write is a set! In particular, $\{x \mid x \notin x\}$ is NOT a set in classical mathematical theories of sets.
Be careful of the binary
With the sets we’re talking about here (sometimes called crisp sets), you’re either in the set or you’re not in the set. There do exist fuzzy sets where membership is, well, fuzzier.
Definitions:
$A \subseteq B$ | $=$ | $\forall x. x \in A \supset x \in B$ | (subset of) |
$A = B$ | $=$ | $A \subseteq B \wedge B \subseteq A$ | (equal to) |
$A \neq B$ | $=$ | $\neg (A = B)$ | (not equal to) |
$A \cup B$ | $=$ | $\{x \mid x \in A \vee x \in B \}$ | (union) |
$A \cap B$ | $=$ | $\{x \mid x \in A \wedge x \in B \}$ | (intersection) |
$A - B$ | $=$ | $\{x \mid x \in A \wedge x \notin B\}$ | (minus) |
$A \times B$ | $=$ | $\{(x,y) \mid x \in A \wedge y \in B\}$ | (cross product) |
$\mathcal{P}(A)$ | $=$ | $\{P \mid P \subseteq A\}$ | (powerset) |
$\bigcup A$ | $=$ | $\{x \mid x \in P \textrm{ for some } P \in A\}$ | (union) |
$\bigcap A$ | $=$ | $\{x \mid x \in P \textrm{ for all } P \in A\}$ | (intersection) |
$A^n$ | $=$ | $\{ (a_1, \ldots, a_n) \mid n \geq 0 \wedge \forall i. a_i \in A \}$ | (n-element sequence) |
$A^*$ | $=$ | $\bigcup_{i \geq 0}\,A^i$ | (sequence) |
Examples: Let $A = \{a, b, c\}$ and $B = \{b, f\}$. Then:
$A \cup B = \{a, b, c, f\}$ |
$A \cap B = \{b\}$ |
$A - B = \{a, c\}$ |
$A \times B = \{(a,b), (a,f), (b,b), (b,f), (c,b), (c,f)\}$ |
$B^1 = B = \{b, f\}$ |
$B^2 = B \times B = \{(b,b), (b,f), (f,b), (f,f)\}$ |
$B^3 = B \times B \times B = \{(b,b,b), (b,b,f), (b,f,b), (b,f,f), (f,b,b), (f,b,f), (f,f,b), (f,f,f)\}$ |
$\mathcal{P}(B) = \{\varnothing, \{b\}, \{f\}, \{b,f\}\}$ |
$(6, T, 7.3) \in \mathbb{N} \times \mathbb{B} \times \mathbb{R}$ |
$(2, 3, 8) \in \mathbb{N}^3$ |
$\{a,b\}^* = \{(), a, b, (a,a), (a,b), (b,a), (b,b), (a,a,a), \ldots\}$ |
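If you want to play with these operations, Python’s built-in sets cover most of them directly. A sketch (the `powerset` helper is my own, since the standard library doesn’t ship one):

```python
from itertools import product, chain, combinations

A = {'a', 'b', 'c'}
B = {'b', 'f'}

print(A | B)               # union: {'a', 'b', 'c', 'f'}
print(A & B)               # intersection: {'b'}
print(A - B)               # minus: {'a', 'c'}
print(set(product(A, B)))  # cross product: six pairs

def powerset(s):
    """All subsets of s, as frozensets (sets of sets must use frozenset)."""
    s = list(s)
    return {frozenset(c) for c in chain.from_iterable(
        combinations(s, r) for r in range(len(s) + 1))}

print(powerset(B))  # ∅, {'b'}, {'f'}, and {'b', 'f'}
```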
Objects such as $()$, $(a,b)$, or $(a,a,b,a,b)$ are called tuples, or sometimes sequences.
Given a tuple $t$ you access the $i$th element with the notation $\:t\!\downarrow\!i$.
Examples: Let $\,t = (8, F, (2,5))$. Then:
$t\!\downarrow\!0 = 8$ |
$t\!\downarrow\!1 = F$ |
$t\!\downarrow\!2 = (2,5)$ |
$t\!\downarrow\!3\;$ has no value |
Note that $(a,(a,a)) \neq ((a,a),a)$.
Are Tuples Sets? NO. Sets are unordered collections of unique elements; tuples are ordered and can have duplicates.
However, in the deep foundations of mathematics, everything is ultimately represented as a set: there are mechanisms to define tuples as sets, just as numbers can be represented as sets. In everyday mathematics, though, we keep them separate. So yes, you can say everything is a set, but only while your theoretical-foundations hat is on.
Sequences can be concatenated. If $t_1$ is an $m$-tuple and $t_2$ is an $n$-tuple then $t_1 • t_2$ is an $(m+n)$-tuple:
$(a_1, \ldots, a_m) • (b_1, \ldots, b_n) = (a_1, \ldots, a_m, b_1, \ldots, b_n)$ |
We do not bother distinguishing one-element tuples from the contained element. So it is fine to say $(a,b) • c = (a,b,c)$, and even $a • b = (a,b)$.
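Python’s tuples make a decent playground here, with `+` playing the role of $•$ (though note that Python, unlike our convention above, does distinguish a one-element tuple from its element):

```python
t1 = ('a', 'b')
t2 = ('c',)          # Python writes one-element tuples with a trailing comma
print(t1 + t2)       # ('a', 'b', 'c'): concatenation yields an (m+n)-tuple
print(len(t1 + t2))  # 3

t = (8, False, (2, 5))
print(t[0])          # 8, using 0-based indexing like t↓0 above
print(t[2])          # (2, 5)
```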
Flattening Tuples: Some authors, not all, will say $(a,a,a)=(a,(a,a))$. Why? Because they want $\times$ to be like this:
$\begin{array}{l} S^1 = S \\ S^{n+1} = S \times S^n \end{array}$
Under this definition, $\{a\}^3 = \{a\} \times \{a\}^2$. The left hand side is $\{(a,a,a)\}$ and the right hand side is $\{(a,(a,a))\}$, so this definition forces us to treat the two tuples as the same.
I’m not a fan of this approach. I don’t like it.
Definition: $A$ and $B$ are disjoint iff $A \cap B = \varnothing$
Definition: $\Pi = \{P_1, P_2, \ldots, P_n\} \subseteq \mathcal{P}(A)$ is a partition of $A$ iff each $P_i$ is nonempty, the $P_i$ are pairwise disjoint, and $P_1 \cup P_2 \cup \cdots \cup P_n = A$.
Example: { EVEN, ODD } is a partition of $\mathbb{Z}$.
These are fun to prove:
$A \cup A = A$ | (idempotency) |
$A \cap A = A$ | (idempotency) |
$A \cup B = B \cup A$ | (commutativity) |
$A \cap B = B \cap A$ | (commutativity) |
$A \cup (B \cup C) = (A \cup B) \cup C$ | (associativity) |
$A \cap (B \cap C) = (A \cap B) \cap C$ | (associativity) |
$A \cup (B \cap C) = (A \cup B) \cap (A \cup C)$ | (distributivity) |
$A \cap (B \cup C) = (A \cap B) \cup (A \cap C)$ | (distributivity) |
$A \cup (A \cap B) = A$ | (absorption) |
$A \cap (A \cup B) = A$ | (absorption) |
$A - (B \cup C) = (A - B) \cap (A - C)$ | (DeMorgan) |
$A - (B \cap C) = (A - B) \cup (A - C)$ | (DeMorgan) |
$\varnothing \subseteq A$ |
$A \subseteq A$ |
$A \subseteq A \cup B$ |
$A \cap B \subseteq A$ |
$A \cup \varnothing = A$ |
$A \cap \varnothing = \varnothing$ |
$A - \varnothing = A$ |
$\varnothing - A = \varnothing$ |
$A \subseteq B \Leftrightarrow A \cup B = B$ |
$A \subseteq B \Leftrightarrow A \cap B = A$ |
$A \subseteq B \supset A \cup C \subseteq B \cup C$ |
$A \subseteq B \supset A \cap C \subseteq B \cap C$ |
$A \times \varnothing = \varnothing$ |
$A \times (B \cup C) = (A \times B) \cup (A \times C)$ |
$A \times (B \cap C) = (A \times B) \cap (A \times C)$ |
$(A \times B) \cap (C \times D) = (A \cap C) \times (B \cap D)$ |
$(A \times B) \cup (C \times D) \subseteq (A \cup C) \times (B \cup D)$ |
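If proving them isn’t your idea of fun, you can at least spot-check a few of the identities on lots of random small sets with a quick Python script (my own sketch, and a sanity check rather than a proof!):

```python
import random

def random_set():
    """A random subset of {0, ..., 9} with at most 5 elements."""
    return {random.randrange(10) for _ in range(random.randrange(6))}

# Spot-check some of the identities above:
for _ in range(1000):
    A, B, C = random_set(), random_set(), random_set()
    assert A | (B & C) == (A | B) & (A | C)    # distributivity
    assert A - (B | C) == (A - B) & (A - C)    # DeMorgan
    assert (A <= B) == ((A | B) == B)          # A ⊆ B ⇔ A ∪ B = B
print("all spot checks passed")
```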
A relation is a subset of a cross product, i.e. $R \subseteq A \times B$. $A$ is the domain and $B$ is the codomain. If $(a,b) \in R$ we write $aRb$. The inverse $R^{-1} \subseteq B \times A$ is $\{(b,a) \mid aRb\}$.
For $R \subseteq A \times A$ we define:
More definitions:
If $R \subseteq A \times B$ and $S \subseteq B \times C$ then the composition of $R$ and $S$ is $S \circ R$ = $\{(a,c) \mid \exists b. aRb \wedge bSc\}$.
Example:
Let $R = \{(1,2),(2,5),(1,1)\}$ and $S = \{(5,10),(2,8),(9,2)\}$. Then $S \circ R = \{(1,8),(2,10)\}$.
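Here’s a little Python sketch of composition (the helper name `compose` is mine), which you can use to check examples like the one above:

```python
def compose(R, S):
    """S ∘ R = {(a, c) | ∃b. aRb and bSc}, with relations as sets of pairs."""
    return {(a, c) for (a, b1) in R for (b2, c) in S if b1 == b2}

R = {(1, 2), (2, 5), (1, 1)}
S = {(5, 10), (2, 8), (9, 2)}
print(compose(R, S))   # {(1, 8), (2, 10)}
```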
Even though relations are sets, the superscript notation on relations means something different than it does for regular sets. This seems infuriating until you realize that math notation is very contextual and changes depending on what you are talking about. Anyway, here is what the superscripts mean for relations:
$R^2 = \{ (a,b) \mid \exists c. aRc \wedge cRb \}$ |
$R^3 = \{ (a,b) \mid \exists c. \exists d. aRc \wedge cRd \wedge dRb \}$ |
$R^4 = \{ (a,b) \mid \exists c. \exists d. \exists e. aRc \wedge cRd \wedge dRe \wedge eRb \}$ |
$\ldots$ |
$R^* = \bigcup_{i \geq 0}\,R^i$ |
Those definitions imply $R^2 = R \circ R$, $R^3 = R \circ R \circ R$, and so on.
These forms turn out to be very useful in computation theories, as we frequently use relations that represent a single computation step, so the * form of the relation will mean any number of steps.
Example:
Let $\;\longmapsto\;= \{ (x,y) \mid y = 2x\}$. Then:
$5 \longmapsto 10$ | (one step) |
$5 \longmapsto^2 20$ | (two steps) |
$8 \longmapsto^3 64$ | (three steps) |
$2 \longmapsto^* 131072$ | (any number of steps 🤯) |
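Here’s one way to sketch relation powers in Python (the helper names `compose` and `power` are mine, and the doubling relation is restricted to a finite chunk of $\mathbb{N}$ so the sets stay finite):

```python
from collections import defaultdict

def compose(R, S):
    """S ∘ R = {(a, c) | ∃b. aRb and bSc}."""
    by_first = defaultdict(list)
    for (b, c) in S:
        by_first[b].append(c)
    return {(a, c) for (a, b) in R for c in by_first[b]}

def power(R, n):
    """R^n for n >= 1: pairs reachable in exactly n steps of R."""
    result = R
    for _ in range(n - 1):
        result = compose(result, R)
    return result

step = {(x, 2 * x) for x in range(1000)}   # the doubling relation, truncated
print((5, 20) in power(step, 2))           # 5 ↦ 10 ↦ 20
print((8, 64) in power(step, 3))           # 8 ↦ 16 ↦ 32 ↦ 64
```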
A relation $f \subseteq A \times B$ is called a function iff every element of $A$ appears exactly once as the first element of a pair in $f$. In this case we write $f: A \rightarrow B$. If $f$ is a function and $(a,b) \in f$ we write $f a = b$, or $f(a) = b$, and say $b$ is the value of $f$ at $a$. The range of $f$ is $\{b \mid \exists a. b = f a\}$.
Since functions are relations and therefore sets, we can write them in set notation, but there is also a nice short-cut called lambda-notation. These examples all specify the same function:
$\{\ldots, (-1,7), (0,8), (1,9), (2,10), \ldots\}$ |
$\{(x, y) \mid x \in \mathbb{Z} \wedge y = x + 8\}$ |
$\{(x, x+8) \mid x \in \mathbb{Z}\}$ |
$\lambda x_\mathbb{Z} . x+8$ |
More examples of functions:
$\lambda x. x^2 - x + 4$ |
$\lambda n. n \log_2 n$ |
$\lambda x . \lambda y . 5x + 2y - 7$ |
$\lambda (x, y). 5x + 2y - 7$ |
$\lambda \theta . \lambda (x, y). (x \cos\theta - y \sin\theta, x\sin\theta + y\cos\theta)$ |
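Lambda notation maps almost directly onto Python’s `lambda`. A sketch (the variable names are mine):

```python
from math import log2

f = lambda x: x * x - x + 4                # λx. x² − x + 4
g = lambda n: n * log2(n)                  # λn. n log₂ n
h = lambda x: lambda y: 5 * x + 2 * y - 7  # λx. λy. 5x + 2y − 7 (curried)
k = lambda p: 5 * p[0] + 2 * p[1] - 7      # λ(x, y). 5x + 2y − 7 (on a pair)

print(f(3))        # 10
print(h(1)(2))     # 2
print(k((1, 2)))   # 2
```

The curried form `h` takes its arguments one at a time; `k` takes a single tuple, mirroring the distinction between $\lambda x . \lambda y$ and $\lambda (x, y)$ above.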
If $f$ is a function, do not say “the function $f(x)$” unless the function $f$, when applied to $x$, actually yields a function. Also do not say things like “the function $n^2$” because $n^2$ is a number. The function you probably have in mind is $\lambda n. n^2$.
Let-notation:
Where-notation:
If-notation:
Substitution:
Function iteration:
Be careful, though, because many mathematicians purposefully blur the distinction between $\sin^2\theta$ and $(\sin \theta)^2$, leaving you, the poor reader, to figure out what they mean by context.
For trig functions, mathematicians love to mess you up by writing the iterative form for the power form, but for arbitrary functions you are on your own.
IN OTHER WORDS, MATH NOTATION IS OFTEN AMBIGUOUS.
For $f: A \rightarrow B$,
If $f$ is a bijection from $A$ to $B$, then $f^{-1}$ is a bijection from $B$ to $A$. Also $\forall a. f^{-1}(f a) = a$ and $\forall b . f(f^{-1}b) = b$.
The cardinality of a set $A$ is denoted $|A|$ and obeys these axioms:
Did you notice something about bijections? They are important! They tell us when two sets are the same size, without having to count them. Matching is more fundamental than counting.
Think about this
Matching is more fundamental than counting. Imagine yourself without words for what we now call numbers. You could still come up with ideas of greater than or less than, and allocate resources, and even trade. Counting is an advanced concept!
AGAIN, MATCHING IS MORE FUNDAMENTAL THAN COUNTING.
A set is infinite iff there is a bijection between it and a proper subset of itself. A set is finite iff it is not infinite.
So there are just as many even numbers as there are natural numbers. Here’s a video explanation, together with some additional historical context:
If $A$ is finite then there exists a bijection from it to $\{1, 2, \ldots, k\}$ for some $k \geq 0$, in which case we define $|A| = k$. The cardinality of $\mathbb{N}$ is $\aleph_0$. $A$ is countable iff $A$ is finite or $|A| = \aleph_0$.
Examples:
$| \varnothing | = 0$ |
$| \{a, b, c\} | = 3$ |
$| \{8, 7, \{6, 2, 4\}, 3\} | = 4$ |
$| \{\varnothing\} | = 1$ |
$| \mathbb{Z} | = \aleph_0$ |
$\mathbb{Z}$ is countable because $\lambda n.(\textrm{if even}(n)\textrm{ then }-n/2\textrm{ else }(n+1)/2)$ is a bijection from $\mathbb{N}$ to $\mathbb{Z}$.
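That bijection is easy to try out in code. A Python sketch (the name `n_to_z` is mine):

```python
def n_to_z(n):
    """The bijection λn. if even(n) then −n/2 else (n+1)/2 from ℕ to ℤ."""
    return -n // 2 if n % 2 == 0 else (n + 1) // 2

print([n_to_z(n) for n in range(7)])   # [0, 1, -1, 2, -2, 3, -3]
```

Evens map to zero and the negatives, odds to the positives, so every integer is hit exactly once.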
$\mathbb{N} \times \mathbb{N}$ is countable, too:
(0,0) (0,1) (1,0) (0,2) (1,1) (2,0) (0,3) (1,2) (2,1) (3,0) (0,4) (1,3) (2,2) (3,1) (4,0) (0,5) (1,4) (2,3) (3,2) (4,1) (5,0) ...
By a similar argument, $\mathbb{Q}$, $\mathbb{N}^i$, and any countable union of countable sets are all countable.
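The diagonal walk above can be generated with a short Python sketch (the helper name `enumerate_pairs` is mine):

```python
def enumerate_pairs(limit):
    """Walk ℕ×ℕ along the diagonals x + y = 0, 1, 2, ... and return
    the first `limit` pairs."""
    pairs = []
    d = 0
    while len(pairs) < limit:
        for x in range(d + 1):
            pairs.append((x, d - x))
        d += 1
    return pairs[:limit]

print(enumerate_pairs(6))   # [(0, 0), (0, 1), (1, 0), (0, 2), (1, 1), (2, 0)]
```

Every pair lives on some finite diagonal, so every pair eventually shows up: that is exactly what makes $\mathbb{N} \times \mathbb{N}$ countable.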
Cantor’s Theorem: For any set $A$, $|A| < |\mathcal{P}(A)|$. Proof: Clearly $|A| \leq |\mathcal{P}(A)|$, so we only have to show $|A| \neq |\mathcal{P}(A)|$. Proceed by contradiction. Assume $|A| = |\mathcal{P}(A)|$, which means there is a function $g$ mapping $A$ onto $\mathcal{P}(A)$. Consider $Y = \{x \mid x \in A \wedge x \notin g(x)\}$. Since $g$ is an onto function, there must exist a value $y \in A$ such that $g(y)=Y$. If we assume $y \in Y$, then by the definition of $Y$, $y \notin Y$. But if we assume $y \notin Y$, then $y \in Y$. This is a contradiction, so the assumption that a surjective $g$ exists cannot hold. Therefore $|A| \neq |\mathcal{P}(A)|$.
Similar arguments to this proof of Cantor’s theorem show that the following sets are uncountable:
The fact that $\{f \mid f:\mathbb{N}\rightarrow\mathbb{N}\}$ is uncountable means there are functions that no computer program can compute, because programs are strings of symbols over a finite alphabet, so there are only countably many of them.
Numbers can be defined in terms of sets, but we’re not showing how to do that here. Just know about:
Kind | Symbol | Notes |
---|---|---|
Natural Numbers | $\mathbb{N}$ | $\{0, 1, 2, 3, \ldots\}$ |
Integers | $\mathbb{Z}$ | $\{\ldots, -3, -2, -1, 0, 1, 2, 3, \ldots\}$ |
Rational Numbers | $\mathbb{Q}$ | $\{\frac{a}{b} \mid a, b \in \mathbb{Z} \wedge b \neq 0\}$ |
Real Numbers | $\mathbb{R}$ | 1-dimensional |
Complex Numbers | $\mathbb{C}$ | 2-dimensional ($a+bi$) |
Quaternions | $\mathbb{H}$ | 4-dimensional ($a+bi+cj+dk$) |
Octonions | $\mathbb{O}$ | 8-dimensional (Units are $e_0 ... e_7$) |
Sedenions | $\mathbb{S}$ | 16-dimensional |
Oh, there is the partition between algebraic vs. transcendental numbers, as in this amazing diagram by Keith Enevoldsen:
Things get interesting when you go beyond the reals. What isn’t “real” by the way? Imaginary perhaps? Not a good name, as Jade shows:
For natural numbers:
Addition, multiplication, and exponentiation work for real numbers too. There are also inverses:
Got all that? Do you have a good sense of these numbers and operations? Whether you do or don’t, watch Vi Hart’s piece:
Also important:
Many people get somewhat comfortable with addition and multiplication, but have a bit of a harder time with exponentiation (and logarithms). This section, courtesy of Phil Dorin, can help a bit.
First, train yourself to know where the base and the exponent appear in log-notation:
Check out these properties. You don’t have to necessarily memorize them all, but take the time to see why they are true:
$a^b a^c = a^{b+c}$ | $2^{10}2^8 = 2^{10+8} = 262144$ |
$\frac{a^b}{a^c} = a^{b-c}$ | $\frac{2^{10}}{2^8} = 2^{10-8} = 4$ |
$a^{bc} = (a^b)^c$ | $2^{15} = (2^5)^3 = (32)(32)(32) = 32768$ |
$\log_b a = \frac{1}{\log_a b}$ | $\log_{1024} 2 = \frac{1}{\log_2 1024} = \frac{1}{10} = 0.1$ |
$\log_b a = \frac{\log_c a}{\log_c b}$ | $\log_4 256 = \frac{\log_2 256}{\log_2 4} = \frac{8}{2} = 4$ |
$\log_c a = (\log_c b)(\log_b a)$ | $\log_{10} 2 = (\log_{10} 1024)(\log_{1024} 2) \approx 3 \times 0.1 = 0.3$ |
$\log_b (xy) = (\log_b x) + (\log_b y)$ | $\log_{10} 20 = \log_{10} (10 \times 2) = \log_{10} 10 + \log_{10} 2 \approx 1 + 0.3 = 1.3$ |
$\log_b(\frac{x}{y}) = (\log_b x) - (\log_b y)$ | $\log_2 (\frac{1024}{16}) = \log_2 1024 - \log_2 16 = 10 - 4 = 6$ |
$\log_b (x^y) = y(\log_b x)$ | $\log_2 (4^3) = 3(\log_2 4) = 6$ |
$a^{\log_c b} = b^{\log_c a}$ | $32^{\log_2 8} = 8^{\log_2 32} = 8^5 = 32768$ |
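You don’t have to take the table’s word for it: you can numerically spot-check each identity with Python’s `math.log`, which takes an optional base argument (a sanity check, not a proof, since floats are approximate):

```python
from math import log, isclose

a, b, c, x, y = 7.0, 2.0, 10.0, 5.0, 3.0

assert isclose(log(a, b), 1 / log(b, a))              # log_b a = 1 / log_a b
assert isclose(log(a, b), log(a, c) / log(b, c))      # change of base
assert isclose(log(x * y, b), log(x, b) + log(y, b))  # log of a product
assert isclose(log(x ** y, b), y * log(x, b))         # log of a power
assert isclose(a ** log(b, c), b ** log(a, c))        # the surprising last one
print("all identities check out numerically")
```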
Now, check this out! You can derive approximations pretty much in your head:
How well did we do?
x | Our approximation | True value to five decimal places |
---|---|---|
1 | .00 | .00000 |
2 | .30 | .30103 |
3 | .48 | .47712 |
4 | .60 | .60206 |
5 | .70 | .69897 |
6 | .78 | .77815 |
7 | .84 | .84510 |
8 | .90 | .90309 |
9 | .95 | .95424 |
10 | 1.00 | 1.00000 |
Not bad! Let’s keep going...
Now use your superpowers to estimate:
Not bad! Calculator says $1.002786070953 \times 10^{64}$
Not bad! Calculator says $6.790604183554 \times 10^{73}$
This section is also courtesy of Phil Dorin:
Don’t miss Robert P. Munafo’s big numbers pages.
Seriously, don’t miss it!
Vsauce takes us through lots and lots of infinities and more! After this video you will know the difference between cardinals and ordinals.
This is a huge field of math. Rather than me making notes about it, I should direct you to the amazing Seeing Theory: A Visual Introduction to Probability and Statistics. It’s really good!
Also useful in computer science, but not covered here:
You might also like: Philosophy of Math, History of Math, and Applied Mathematics.
Did you see the Vsauce video above? Kind of cool how Michael says we can do math that isn’t science and just invent things that we declare to be true, like transfinite infinities, and yeah, as long as we don’t have contradictions, let’s go. The first two frames of the famous XKCD Every Major’s Terrible has fun with this idea:
Sometimes our inventions do give us good models. Cohl Furey explains it in two minutes:
Intrigued? Get ready for a ride and watch her entire Division Algebras And the Standard Model playlist. Or check out this article on her work.
Okay that’s quite a note to end on, right?
We’ve covered: