Algorithmic Patterns

The best architects, developers, engineers, and scientists recognize when a problem they are currently facing resembles one they have seen before, and reuse the strategy that worked then.

Definition

An algorithmic pattern, or algorithmic paradigm, is a method, strategy, or technique for solving a problem.

Some Common Patterns

The following is a brief tour of common paradigms, with short descriptions and small illustrative sketches rather than detailed treatments.

Brute Force

Enumerate all possible candidates, unintelligently, and try each one until you find a solution. Not really a “pattern”. You could, in theory, do Traveling Salesperson, Knapsack, or Subset Sum this way, but don’t.
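
For a tiny input, a brute-force Subset Sum looks like the sketch below (a minimal Python sketch; the names are purely illustrative). It simply tries every subset, which is exponential time and exactly why you shouldn’t do this for real instances.

```python
from itertools import combinations

def subset_sum_brute_force(numbers, target):
    """Return a subset of numbers summing to target, or None if none exists."""
    for size in range(len(numbers) + 1):
        for subset in combinations(numbers, size):   # enumerate every subset
            if sum(subset) == target:
                return subset
    return None

print(subset_sum_brute_force([3, 34, 4, 12, 5, 2], 9))   # e.g. (4, 5)
```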

Divide and Conquer

Breaking down a problem into multiple independent subproblems, solving the subproblems (recursively), and combining those solutions into a solution for the original problem.

Examples:
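
Merge sort is the classic instance: split the list in half, recursively sort the two independent halves, then merge. A minimal sketch:

```python
def merge_sort(items):
    if len(items) <= 1:                      # base case: already sorted
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])           # solve the two independent halves
    right = merge_sort(items[mid:])
    return merge(left, right)                # combine the subsolutions

def merge(left, right):
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([5, 2, 9, 1, 5, 6]))        # [1, 2, 5, 5, 6, 9]
```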

Decrease and Conquer

A variant of divide and conquer in which the problem is reduced to a single smaller subproblem.

Examples:
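
Binary search is a standard instance: each step discards half of the remaining range, leaving one smaller subproblem. A minimal sketch:

```python
def binary_search(sorted_items, target):
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            lo = mid + 1                     # discard the left half
        else:
            hi = mid - 1                     # discard the right half
    return -1                                # not found

print(binary_search([1, 3, 5, 7, 9, 11], 7))  # 3
```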

The Greedy Method

Solving a problem by making the “best looking” choice at each step. (This may miss a solution entirely, or find one that is not optimal, but for problems where the greedy choice is known to work it is a great approach.)

Examples:
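
Interval scheduling is a well-known case where the greedy choice (always take the activity that finishes earliest among those still compatible) is provably optimal. A minimal sketch:

```python
def select_activities(intervals):
    """intervals: list of (start, finish) pairs; returns a maximum-size
    subset of mutually non-overlapping intervals."""
    chosen, last_finish = [], float("-inf")
    for start, finish in sorted(intervals, key=lambda iv: iv[1]):
        if start >= last_finish:             # compatible with what we've chosen
            chosen.append((start, finish))
            last_finish = finish
    return chosen

print(select_activities([(1, 4), (3, 5), (0, 6), (5, 7), (3, 9), (5, 9),
                         (6, 10), (8, 11)]))   # [(1, 4), (5, 7), (8, 11)]
```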

Dynamic Programming

Solving an optimization problem by breaking it down into multiple overlapping subproblems, solving the subproblems (recursively), and combining those solutions into a solution for the original problem. The idea is to cache the results of the overlapping subproblems so each is computed only once. Can be done bottom-up (table construction) or top-down (recursion with memoization).

Examples:
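
Fibonacci numbers are the usual first illustration. A minimal sketch showing both the top-down (memoized) and bottom-up (table) styles:

```python
from functools import lru_cache

# Top-down: recursive, with the results of overlapping subproblems memoized.
@lru_cache(maxsize=None)
def fib_memo(n):
    if n < 2:
        return n
    return fib_memo(n - 1) + fib_memo(n - 2)

# Bottom-up: build a table of subproblem results from the smallest up.
def fib_table(n):
    table = [0, 1] + [0] * max(0, n - 1)
    for i in range(2, n + 1):
        table[i] = table[i - 1] + table[i - 2]
    return table[n]

print(fib_memo(40), fib_table(40))   # both 102334155
```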

Backtracking

A method for systematically generating possible solutions to a problem, backing up whenever you realize that the partially generated candidate can’t possibly be extended to a real solution.

Examples:
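
The N-Queens puzzle is a standard illustration: queens are placed row by row, and a branch is abandoned as soon as a new queen would be attacked. A minimal sketch that counts the solutions:

```python
def count_n_queens(n):
    def place(row, cols, diag1, diag2):
        if row == n:
            return 1                          # all rows filled: one solution
        total = 0
        for col in range(n):
            if col in cols or (row - col) in diag1 or (row + col) in diag2:
                continue                      # partial placement can't be extended
            total += place(row + 1, cols | {col},
                           diag1 | {row - col}, diag2 | {row + col})
        return total
    return place(0, frozenset(), frozenset(), frozenset())

print(count_n_queens(8))   # 92
```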

Branch and Bound

Backtracking applied to optimization problems: in addition to abandoning infeasible branches, a branch is pruned whenever a bound shows it cannot improve on the best solution found so far.

Examples:
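
As a toy illustration (the problem and names here are just for the sketch), the following picks a subset of weights with the largest total not exceeding a capacity, pruning with the optimistic bound “current total plus everything remaining”:

```python
def best_subset_total(weights, capacity):
    best = 0
    suffix_sums = [0] * (len(weights) + 1)    # suffix_sums[i] = sum of weights[i:]
    for i in range(len(weights) - 1, -1, -1):
        suffix_sums[i] = suffix_sums[i + 1] + weights[i]

    def branch(i, total):
        nonlocal best
        if total > capacity:
            return                            # infeasible branch
        best = max(best, total)
        if i == len(weights) or total + suffix_sums[i] <= best:
            return                            # bound: this branch can't beat best
        branch(i + 1, total + weights[i])     # branch: take item i
        branch(i + 1, total)                  # branch: skip item i

    branch(0, 0)
    return best

print(best_subset_total([8, 6, 5, 3], capacity=13))   # 13 (8 + 5)
```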

Hill Climbing

Solving (or finding an approximate solution to) an optimization problem by generating candidate solutions that are (hopefully) improvements over the previous candidate. Basic hill climbing chooses the “best” next step; genetic algorithms derive the next candidates by mutating and recombining previous ones; simulated annealing sometimes accepts a worse candidate, with a probability that shrinks over time according to a cooling schedule inspired by annealing in metallurgy.
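
A minimal basic hill-climbing sketch (function and parameter names are purely illustrative): keep moving to the best neighboring candidate until no neighbor improves the score.

```python
import random

def hill_climb(score, start, neighbors, max_steps=1000):
    current = start
    for _ in range(max_steps):
        best_next = max(neighbors(current), key=score)
        if score(best_next) <= score(current):
            return current                    # local optimum reached
        current = best_next
    return current

# Maximize f(x) = -(x - 3)^2 over the integers, stepping by +/- 1.
f = lambda x: -(x - 3) ** 2
print(hill_climb(f, start=random.randint(-50, 50),
                 neighbors=lambda x: [x - 1, x + 1]))   # 3
```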

Particle Swarm Optimization

Solving an optimization problem with a bunch of decentralized particles all searching for a solution, in a way that exhibits something like collective organization (e.g. ant colonies, bird flocks, animal herds, etc.)

Examples:
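
A tiny sketch of the idea in one dimension (the constants and names are illustrative, not tuned): each particle is pulled toward its own best-known position and the swarm’s best-known position.

```python
import random

def pso(f, n_particles=20, steps=200, lo=-10.0, hi=10.0,
        inertia=0.7, c_personal=1.5, c_social=1.5):
    pos = [random.uniform(lo, hi) for _ in range(n_particles)]
    vel = [0.0] * n_particles
    personal_best = pos[:]
    global_best = min(pos, key=f)
    for _ in range(steps):
        for i in range(n_particles):
            # velocity update: inertia + pull toward personal and global bests
            vel[i] = (inertia * vel[i]
                      + c_personal * random.random() * (personal_best[i] - pos[i])
                      + c_social * random.random() * (global_best - pos[i]))
            pos[i] += vel[i]
            if f(pos[i]) < f(personal_best[i]):
                personal_best[i] = pos[i]
            if f(pos[i]) < f(global_best):
                global_best = pos[i]
    return global_best

print(pso(lambda x: (x - 2) ** 2))   # approximately 2
```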

Las Vegas

A randomized algorithm that always produces the correct answer but makes no guarantees on how long it will run or how much space it will need (in the worst case).

Used as a defense against algorithm complexity attacks.

Examples:
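
Randomized quicksort is a standard instance: the output is always correctly sorted, but the running time depends on the random pivot choices (which is also what makes adversarially slow inputs hard to construct). A minimal sketch:

```python
import random

def randomized_quicksort(items):
    if len(items) <= 1:
        return items
    pivot = random.choice(items)              # random pivot: always correct,
    less = [x for x in items if x < pivot]    # running time is what varies
    equal = [x for x in items if x == pivot]
    greater = [x for x in items if x > pivot]
    return randomized_quicksort(less) + equal + randomized_quicksort(greater)

print(randomized_quicksort([5, 3, 8, 1, 9, 2]))   # [1, 2, 3, 5, 8, 9]
```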

Monte Carlo

A randomized algorithm that has time and space guarantees but has a small probability of giving the wrong answer. The probability of error can be reduced by running the algorithm longer.

Used when all known deterministic algorithms for a problem are too slow, or when estimation is an inherent part of the problem.

Examples:
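
Estimating π by throwing random points at the unit square and counting how many land inside the quarter circle is the classic illustration of the estimation flavor; more samples shrink the expected error. A minimal sketch:

```python
import random

def estimate_pi(samples):
    inside = sum(1 for _ in range(samples)
                 if random.random() ** 2 + random.random() ** 2 <= 1.0)
    return 4.0 * inside / samples             # quarter-circle area ratio times 4

print(estimate_pi(1_000_000))   # approximately 3.14
```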

Reduction (Transformation)

Solving a problem by reducing, or transforming, it to a similar (usually easier) problem whose solution implies a solution to the original. Also known as transform and conquer.
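
A small illustration: computing a least common multiple reduces to the easier, well-studied problem of computing a greatest common divisor.

```python
from math import gcd

def lcm(a, b):
    # Transform the LCM problem into a GCD problem: lcm(a, b) = a * b / gcd(a, b)
    return a // gcd(a, b) * b

print(lcm(12, 18))   # 36
```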

Preprocessing

Playing tricks with the input (input enhancement) or building up a cache (prestructuring) prior to doing the official run.

Examples:
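
A small prestructuring illustration: precompute prefix sums once so that any later range-sum query is answered in constant time.

```python
from itertools import accumulate

def make_range_sum(numbers):
    prefix = [0] + list(accumulate(numbers))  # built once, before the queries
    def range_sum(i, j):                      # sum of numbers[i:j]
        return prefix[j] - prefix[i]
    return range_sum

range_sum = make_range_sum([2, 7, 1, 8, 2, 8])
print(range_sum(1, 4))   # 7 + 1 + 8 = 16
```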