Recurrence Relations

So you need an infinite sequence? How are you going to give a finite description of it?

Definition

A recurrence relation is a recursive definition of a sequence: each term is given as a function of earlier terms, together with enough initial values to get the sequence started.

Examples

For instance:

$a_0 = 0$ and $a_{n+1} = a_n + 2$ (the even numbers)

$f_0 = 0$, $f_1 = 1$, and $f_{n+2} = f_{n+1} + f_n$ (the Fibonacci numbers)

$h_1 = 1$ and $h_{n+1} = 2h_n + 1$ (the Tower of Hanoi move counts)

Exercise: Hone your math skills by working out the first 10 values of each of these sequences.
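
If you want to check your answers, a few lines of code will print the values. Here is a sketch for the Fibonacci recurrence above:

function fib(n) {
    // The base cases come straight from the recurrence; the rest is the recursive rule.
    return n < 2 ? n : fib(n - 1) + fib(n - 2);
}

for (let n = 0; n < 10; n++) {
    console.log(fib(n)); // 0 1 1 2 3 5 8 13 21 34
}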

Uses

Recurrence relations show up all the time in the complexity analysis of recursive functions: the running time on an input of size $n$ is naturally expressed in terms of the running time on smaller inputs.

Example — Sum of list elements:

function sum(a) {
    // The sum of an empty list is 0; otherwise add the head to the sum of the tail.
    return a.length === 0 ? 0 : a[0] + sum(a.slice(1));
}

Here $T(n)$ counts the operations on a list of length $n$: $T(0) = 1$ and $T(n+1) = 1 + T(n)$.
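
Unwinding the recurrence gives a closed form:

\[ T(n) = 1 + T(n-1) = 2 + T(n-2) = \cdots = n + T(0) = n + 1 \]

so $T \in O(n)$.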

Example — Mergesort:

function sort(a) {
    // Lists of length 0 or 1 are already sorted.
    if (a.length <= 1) {
        return a;
    }
    // Split in half, sort each half recursively, then merge the sorted halves.
    const mid = Math.floor(a.length / 2);
    return merge(sort(a.slice(0, mid)), sort(a.slice(mid)));
}
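
The code calls a merge helper that the original pseudocode leaves unspecified; here is a minimal sketch of one way to write it, assuming both arguments are already sorted:

function merge(x, y) {
    const result = [];
    let i = 0, j = 0;
    // Repeatedly take the smaller front element until one side runs out.
    while (i < x.length && j < y.length) {
        result.push(x[i] <= y[j] ? x[i++] : y[j++]);
    }
    // At most one of the slices below is nonempty; append the leftovers.
    return result.concat(x.slice(i), y.slice(j));
}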

$T(0) = 1$, $T(1) = 1$, and $T(n) = 2T(\frac{n}{2}) + n$ for $n > 1$: two recursive calls on half-size inputs plus about $n$ operations to merge.

Mergesort uses the divide-and-conquer strategy: it breaks the input into $2$ pieces each of size $\frac{n}{2}$, recurses on those pieces, then assembles a solution with about $n$ operations. How do we compute the complexity of this?

The Master Theorem

The Master Theorem tells us how to find a closed-form complexity function for algorithms that break the input into $a$ pieces each of size $\frac{n}{b}$, recurse on those pieces, then assemble a solution with $f(n)$ operations. Here is the theorem, given without proof:

If

\[ T(n) = \left\{ \begin{array}{l l} c & \textrm{ if } n = 1 \\ aT(\frac{n}{b}) + f(n) & \textrm{ if } n > 1 \end{array} \right. \]

where $f \in O(n^d)$, $a \ge 1$, $b \ge 2$, $c > 0$, and $d \geq 0$, then:

\[ T \in \left\{ \begin{array}{l l} O(n^d) & \textrm{ if } d > \log_b{a} \\ O(n^d \log n) & \textrm{ if } d = \log_b{a} \\ O(n^{\log_b{a}}) & \textrm{ if } d < \log_b{a} \end{array} \right. \]

Examples:
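
For mergesort, $a = 2$, $b = 2$, and $f(n) = n$, so $d = 1 = \log_2{2}$: the middle case applies and $T \in O(n \log n)$, answering the question above.

A recurrence like $T(n) = 2T(\frac{n}{2}) + n^2$ has $d = 2 > \log_2{2} = 1$, so the first case gives $T \in O(n^2)$.

A recurrence like $T(n) = 4T(\frac{n}{2}) + n$ has $d = 1 < \log_2{4} = 2$, so the third case gives $T \in O(n^{\log_2{4}}) = O(n^2)$.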