An algorithmic pattern, or algorithmic paradigm, is a method, strategy, or technique of solving a problem.
The following is just a quick survey of common paradigms; each gets only a brief description.
Enumerate all possible solutions, unintelligently, and try them all until you find one that works. Not really a “pattern.” You could, in theory, solve Traveling Salesperson, Knapsack, or Subset Sum this way, but don’t.
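As a sketch, here is Subset Sum done the brute-force way: try every subset until one hits the target. (The function name and interface are my own, for illustration; the exponential blowup is the point.)

```python
from itertools import combinations

def subset_sum_brute_force(nums, target):
    """Try every subset, smallest first, until one sums to target.
    Returns a matching subset as a list, or None. O(2^n) subsets."""
    for r in range(len(nums) + 1):
        for combo in combinations(nums, r):
            if sum(combo) == target:
                return list(combo)
    return None
```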
Breaking down a problem into multiple independent subproblems, solving the subproblems (recursively), and combining those solutions into a solution for the original problem.
Examples: merge sort, quicksort, Strassen’s matrix multiplication, the closest-pair-of-points problem.
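Merge sort is the classic instance of the pattern: split the input into two independent halves, sort each recursively, then combine by merging. (The algorithm is standard; this particular implementation is my own sketch.)

```python
def merge_sort(a):
    """Divide: split in half. Conquer: recursively sort each half.
    Combine: merge the two sorted halves."""
    if len(a) <= 1:
        return list(a)
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]
```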
A variant of divide and conquer in which the problem is reduced to a single smaller subproblem.
Examples: binary search, Euclid’s GCD algorithm, insertion sort.
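Binary search shows the decrease-and-conquer shape: each comparison discards half of the remaining range, leaving one subproblem of half the size. (Standard algorithm; the implementation is my own sketch.)

```python
def binary_search(a, x):
    """Return an index of x in sorted list a, or -1 if absent.
    Each iteration halves the search range: O(log n)."""
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if a[mid] == x:
            return mid
        if a[mid] < x:
            lo = mid + 1   # discard the left half
        else:
            hi = mid - 1   # discard the right half
    return -1
```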
Solving a problem by making the "best looking" choice at each step. (This may miss a solution entirely, or find one that isn’t optimal; but in cases where greed is known to work, it is a great approach.)
Examples: Dijkstra’s shortest-path algorithm, Prim’s and Kruskal’s minimum spanning tree algorithms, Huffman coding.
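Greedy coin change illustrates both faces of the paradigm: always grab the largest coin that fits. For U.S.-style denominations the greedy answer is optimal; for others it can use more coins than necessary. (The function is my own illustrative sketch, not a named algorithm.)

```python
def greedy_change(amount, denominations):
    """Make change by repeatedly taking the largest coin that fits.
    Optimal for some denomination sets, suboptimal for others."""
    coins = []
    for d in sorted(denominations, reverse=True):
        while amount >= d:
            coins.append(d)
            amount -= d
    return coins if amount == 0 else None
```

For example, with denominations {4, 3, 1} and amount 6, the greedy answer uses three coins (4, 1, 1) even though two (3, 3) would do.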
Solving an optimization problem by breaking it down into multiple overlapping subproblems, solving the subproblems (recursively), and combining those solutions into a solution for the original problem. The idea is to cache the results of the overlapping subproblems. Can be done bottom-up (table construction) or top-down (recursion with memoization).
Examples: Fibonacci numbers, 0/1 knapsack, edit distance, longest common subsequence.
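The Fibonacci numbers are the usual first illustration, since naive recursion recomputes the same subproblems exponentially often. Both styles are sketched below (implementations are my own).

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib_top_down(n):
    """Top-down: plain recursion, with the cache absorbing the
    overlapping subproblems (memoization)."""
    return n if n < 2 else fib_top_down(n - 1) + fib_top_down(n - 2)

def fib_bottom_up(n):
    """Bottom-up: fill a table from the smallest subproblems upward."""
    table = [0, 1]
    for i in range(2, n + 1):
        table.append(table[i - 1] + table[i - 2])
    return table[n]
```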
A method for systematically generating possible solutions to a problem, backing up whenever you realize that your partially generated candidate can’t possibly be extended to a real solution.
Examples: the n-queens problem, Sudoku solvers, graph coloring.
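N-queens is the textbook case: place queens row by row, and the moment a partial placement attacks itself, back up and try the next column. (Standard problem; this counting implementation is my own sketch.)

```python
def n_queens(n):
    """Count the placements of n mutually non-attacking queens
    on an n-by-n board, via backtracking."""
    count = 0
    cols = []  # cols[r] = column of the queen placed in row r

    def safe(r, c):
        # A new queen at (r, c) must share no column or diagonal
        # with any queen already placed.
        return all(c != cc and abs(c - cc) != r - rr
                   for rr, cc in enumerate(cols))

    def place(r):
        nonlocal count
        if r == n:
            count += 1
            return
        for c in range(n):
            if safe(r, c):
                cols.append(c)   # extend the partial candidate
                place(r + 1)
                cols.pop()       # back up: undo the choice

    place(0)
    return count
```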
Backtracking applied to optimization problems: prune any partial candidate whose best possible extension (its bound) cannot beat the best complete solution found so far.
Examples: Traveling Salesperson, 0/1 knapsack, integer programming.
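A sketch for 0/1 knapsack: at each node we either take or skip the next item, and we prune whenever an optimistic bound (the greedy fractional relaxation) can’t beat the best value seen so far. (The structure and bound choice here are my own illustrative decisions; weights are assumed positive.)

```python
def knapsack_bb(items, capacity):
    """0/1 knapsack by branch and bound.
    items: list of (value, weight) pairs with positive weights.
    Bound: value achievable if the remaining room could be filled
    fractionally (always an overestimate, so pruning is safe)."""
    items = sorted(items, key=lambda vw: vw[0] / vw[1], reverse=True)
    best = 0

    def bound(i, value, room):
        for v, w in items[i:]:
            if w <= room:
                value += v; room -= w
            else:
                return value + v * room / w  # take a fraction
        return value

    def explore(i, value, room):
        nonlocal best
        best = max(best, value)
        if i == len(items) or bound(i, value, room) <= best:
            return  # prune: this branch cannot beat the best so far
        v, w = items[i]
        if w <= room:
            explore(i + 1, value + v, room - w)  # branch: take item i
        explore(i + 1, value, room)              # branch: skip item i

    explore(0, 0, capacity)
    return best
```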
Solving (or finding an approximate solution to) an optimization problem by generating candidate solutions that are (hopefully) improvements over the previous candidate. Basic hill climbing chooses the "best" next step; genetic algorithms derive the next candidate by mutating and recombining previous ones; simulated annealing sometimes accepts a worse candidate, with a probability governed by a cooling schedule borrowed from annealing in metallurgy.
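Basic hill climbing, the simplest of these, can be sketched in a few lines: keep moving to the best neighboring candidate, and stop when no neighbor improves. Note that it halts at a *local* maximum, which may not be the global one. (The neighborhood scheme and parameters here are my own illustrative choices.)

```python
def hill_climb(f, x, step=0.1, max_iters=10000):
    """Maximize f by repeatedly moving to the better of the two
    neighbors x - step and x + step; stop at a local maximum."""
    for _ in range(max_iters):
        best = max(x - step, x + step, key=f)
        if f(best) <= f(x):
            return x  # no neighbor improves: local maximum
        x = best
    return x
```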
Solving an optimization problem with a bunch of decentralized particles all searching for a solution, in a way that looks like it has a collective organization (e.g., ant colonies, bird flocks, animal herds).
Examples: ant colony optimization, particle swarm optimization.
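A minimal one-dimensional particle swarm optimizer gives the flavor: each particle’s velocity is pulled toward its own best-seen position and toward the swarm’s best. (The coefficients 0.7 and 1.5 are conventional-looking choices of my own, not canonical constants.)

```python
import random

def pso(f, lo, hi, n_particles=20, iters=100, seed=42):
    """Minimize f over [lo, hi] with a toy particle swarm:
    inertia + pull toward personal best + pull toward global best."""
    rng = random.Random(seed)
    xs = [rng.uniform(lo, hi) for _ in range(n_particles)]
    vs = [0.0] * n_particles
    pbest = xs[:]                  # each particle's best-seen position
    gbest = min(xs, key=f)         # swarm-wide best-seen position
    for _ in range(iters):
        for i in range(n_particles):
            vs[i] = (0.7 * vs[i]
                     + 1.5 * rng.random() * (pbest[i] - xs[i])
                     + 1.5 * rng.random() * (gbest - xs[i]))
            xs[i] += vs[i]
            if f(xs[i]) < f(pbest[i]):
                pbest[i] = xs[i]
            if f(xs[i]) < f(gbest):
                gbest = xs[i]
    return gbest
```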
A randomized algorithm that always produces the correct answer but makes no guarantees on how long it will run or how much space it will need (in the worst case).
Used as a defense against algorithmic complexity attacks.
Examples: randomized quicksort, universal hashing.
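Randomized quicksort is the standard example: the random pivot choice affects only the running time, never the correctness of the output. (Standard algorithm; this implementation is my own sketch.)

```python
import random

def randomized_quicksort(a):
    """Always returns a correctly sorted list. The random pivot makes
    the *time* a random variable (expected O(n log n)), but an adversary
    cannot pick an input that reliably triggers the worst case."""
    if len(a) <= 1:
        return list(a)
    pivot = random.choice(a)
    less = [x for x in a if x < pivot]
    equal = [x for x in a if x == pivot]
    greater = [x for x in a if x > pivot]
    return randomized_quicksort(less) + equal + randomized_quicksort(greater)
```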
A randomized algorithm that has time and space guarantees but has a small probability of giving the wrong answer. The probability of error can be reduced by running the algorithm longer.
Used when all known deterministic algorithms for a problem are too slow, or when estimation is an inherent part of the problem.
Examples: Miller–Rabin primality testing, Monte Carlo integration.
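Estimating π from random points is a tiny example of the estimation flavor: the running time is fixed by the sample count, and more samples shrink the expected error, but the answer is never guaranteed exact. (The function is my own sketch.)

```python
import random

def estimate_pi(n_samples, seed=0):
    """Monte Carlo estimate of pi: the fraction of random points in the
    unit square that land inside the quarter circle approaches pi/4."""
    rng = random.Random(seed)
    inside = sum(1 for _ in range(n_samples)
                 if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
    return 4.0 * inside / n_samples
```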
Solving a problem by reducing, or transforming, it to a similar (usually easier) problem whose solution implies a solution to the original. Also known as transform and conquer.
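A tiny classical reduction: the least common multiple reduces to the greatest common divisor, for which a fast algorithm (Euclid’s) is already known.

```python
from math import gcd

def lcm(a, b):
    """Least common multiple by reduction to GCD:
    lcm(a, b) = a * b / gcd(a, b) for positive integers."""
    return a * b // gcd(a, b)
```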
Playing tricks with the input (input enhancement) or building up a cache (prestructuring) prior to doing the official run.
Examples: presorting the input, building a hash table or index, computing prefix sums.
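Presorting is the simplest case of prestructuring: pay an O(n log n) sort once so that every later membership query is an O(log n) binary search instead of an O(n) scan. (Function names here are my own.)

```python
from bisect import bisect_left

def build_index(items):
    """Prestructuring step: sort once, up front."""
    return sorted(items)

def contains(index, x):
    """Each query is now a binary search on the prebuilt index."""
    i = bisect_left(index, x)
    return i < len(index) and index[i] == x
```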