**Dynamic Programming in Computer Programming**

There are two key attributes that a problem must have in order for dynamic programming to be applicable: optimal substructure and overlapping subproblems. However, when the subproblems do not overlap — each one is solved only once and its solution is never reused — the strategy is called "divide and conquer" rather than "dynamic programming". This is why mergesort, quicksort, and finding all matches of a regular expression are not classified as dynamic programming problems.

*Optimal substructure* means that the solution to a given optimization problem can be obtained by the combination of optimal solutions to its subproblems. Consequently, the first step towards devising a dynamic programming solution is to check whether the problem exhibits such optimal substructure. Such optimal substructures are usually described by means of recursion. For example, given a graph *G=(V,E)*, the shortest path *p* from a vertex *u* to a vertex *v* exhibits optimal substructure: take any intermediate vertex *w* on this shortest path *p*. If *p* is truly the shortest path, then the path *p*_{1} from *u* to *w* and the path *p*_{2} from *w* to *v* are indeed the shortest paths between the corresponding vertices (by the simple cut-and-paste argument described in CLRS). Hence, one can easily formulate the solution for finding shortest paths in a recursive manner, which is what the Bellman-Ford algorithm or the Floyd-Warshall algorithm does.
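The relaxation loop at the heart of Bellman-Ford makes this recursion concrete: after *k* passes, every shortest path using at most *k* edges has been found, building on the optimal subpaths discovered in earlier passes. A minimal sketch (the edge-list representation and the names `vertices`, `edges`, and `source` are illustrative, not from the original text):

```python
def bellman_ford(vertices, edges, source):
    """Single-source shortest-path distances; assumes no negative cycles.

    `edges` is a list of (u, v, weight) triples.
    """
    INF = float("inf")
    dist = {v: INF for v in vertices}
    dist[source] = 0
    # Relax every edge |V| - 1 times. After pass k, dist[v] is optimal for
    # every shortest path with at most k edges -- each pass reuses the
    # optimal subpaths found in the previous pass (optimal substructure).
    for _ in range(len(vertices) - 1):
        for u, v, w in edges:
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
    return dist
```

For instance, with edges `[("a","b",1), ("b","c",2), ("a","c",5)]`, the shortest path from `a` to `c` goes through `b` because the optimal subpaths `a→b` and `b→c` combine to cost 3, beating the direct edge of cost 5.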

*Overlapping* subproblems means that the space of subproblems must be small, that is, any recursive algorithm solving the problem should solve the same subproblems over and over, rather than generating new subproblems. For example, consider the recursive formulation for generating the Fibonacci series: *F*_{i} = *F*_{i−1} + *F*_{i−2}, with base case *F*_{1} = *F*_{2} = 1. Then *F*_{43} = *F*_{42} + *F*_{41}, and *F*_{42} = *F*_{41} + *F*_{40}. Now *F*_{41} is being solved in the recursive subtrees of both *F*_{43} as well as *F*_{42}. Even though the total number of subproblems is actually small (only 43 of them), we end up solving the same problems over and over if we adopt a naive recursive solution such as this. Dynamic programming takes account of this fact and solves each subproblem only once. Note that the subproblems must be only *slightly* smaller (typically taken to mean a constant additive factor) than the larger problem; when they are a multiplicative factor smaller the problem is no longer classified as dynamic programming.
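The repeated work described above is easy to observe by instrumenting the naive recursion. In this sketch the `calls` counter is purely illustrative instrumentation, not part of any dynamic programming algorithm:

```python
from collections import Counter

calls = Counter()  # how many times each subproblem F_n is solved

def fib(n):
    """Naive recursive Fibonacci: F_1 = F_2 = 1, F_n = F_{n-1} + F_{n-2}."""
    calls[n] += 1
    if n <= 2:
        return 1
    return fib(n - 1) + fib(n - 2)
```

Computing `fib(10)` solves the subproblem *F*_{8} twice (once under *F*_{10} and once under *F*_{9}), and the duplication compounds exponentially for smaller subproblems, even though only 10 distinct subproblems exist.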

This can be achieved in either of two ways:

*Top-down approach*: This is the direct fall-out of the recursive formulation of any problem. If the solution to any problem can be formulated recursively using the solution to its subproblems, and if its subproblems are overlapping, then one can easily memoize or store the solutions to the subproblems in a table. Whenever we attempt to solve a new subproblem, we first check the table to see if it is already solved. If a solution has been recorded, we can use it directly, otherwise we solve the subproblem and add its solution to the table.
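The top-down scheme above can be sketched for the Fibonacci recurrence; the dict-based `table` is one simple choice of memo store (names here are illustrative):

```python
def fib_memo(n, table=None):
    """Top-down (memoized) Fibonacci: F_1 = F_2 = 1, F_n = F_{n-1} + F_{n-2}."""
    if table is None:
        table = {}
    if n in table:              # subproblem already solved: reuse the answer
        return table[n]
    result = 1 if n <= 2 else fib_memo(n - 1, table) + fib_memo(n - 2, table)
    table[n] = result           # record the solution for later lookups
    return result
```

Each subproblem is now solved exactly once, so `fib_memo(43)` performs on the order of 43 additions instead of exponentially many recursive calls.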

*Bottom-up approach*: Once we formulate the solution to a problem recursively in terms of its subproblems, we can try reformulating the problem in a bottom-up fashion: try solving the subproblems first and use their solutions to build on and arrive at solutions to bigger subproblems. This is also usually done in tabular form by iteratively generating solutions to bigger and bigger subproblems by using the solutions to smaller subproblems. For example, if we already know the values of *F*_{41} and *F*_{40}, we can directly calculate the value of *F*_{42}.
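A bottom-up sketch of the same recurrence: values are generated in increasing order, so each one is available when the next needs it, and only the last two need to be kept (a common space optimization, not required by the approach):

```python
def fib_bottom_up(n):
    """Bottom-up Fibonacci: compute F_1, F_2, ..., F_n in order."""
    if n <= 2:
        return 1
    prev2, prev1 = 1, 1                 # F_1 and F_2
    for _ in range(3, n + 1):
        prev2, prev1 = prev1, prev1 + prev2   # slide the window to F_k
    return prev1
```

No recursion or memo table is needed here: the iteration order guarantees that every subproblem is solved before it is used.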

Some programming languages can automatically memoize the result of a function call with a particular set of arguments, in order to speed up call-by-name evaluation (this mechanism is referred to as *call-by-need*). Some languages make it possible portably (e.g. Scheme, Common Lisp or Perl); others need special extensions (e.g. C++). Some languages have automatic memoization built in, such as tabled Prolog and J, which supports memoization with the *M.* adverb. In any case, this is only possible for a referentially transparent function.
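As one concrete instance of such language-level support, Python's standard library offers a memoizing decorator, `functools.lru_cache`, which caches results keyed by the argument values (this is library-provided memoization rather than automatic call-by-need evaluation):

```python
from functools import lru_cache

@lru_cache(maxsize=None)   # unbounded cache: every distinct argument is stored
def fib(n):
    """Fibonacci with memoization supplied by the decorator."""
    return 1 if n <= 2 else fib(n - 1) + fib(n - 2)
```

As the text notes, this is only safe for a referentially transparent function: `fib(n)` always returns the same value for the same `n` and has no side effects, so a cached result can stand in for a real call.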
