Hallmarks of dynamic programming
#1
Optimal substructure: an optimal solution to a problem (instance) contains optimal solutions to subproblems.
e.g. if z is a longest common subsequence (LCS) of x and y, then any prefix of z is an LCS of a prefix of x and a prefix of y.
LCS(x, y, i, j)            // length of an LCS of x[1..i] and y[1..j]
    if i == 0 or j == 0
        then return 0      // base case: an empty prefix has an empty LCS
    if x[i] == y[j]
        then c[i, j] <- LCS(x, y, i-1, j-1) + 1
        else c[i, j] <- max{LCS(x, y, i-1, j), LCS(x, y, i, j-1)}
    return c[i, j]
Worst case: x[i] != y[j] for all i, j, so every call branches into two recursive calls.
The height of the recursion tree is m+n in the worst case, so the tree can have on the order of 2^(m+n) nodes.
Exponential time complexity
#2
Overlapping subproblems: A recursive solution contains a "small" number of distinct subproblems repeated many times.
LCS: the subproblem space contains only mn distinct subproblems, one for each pair of prefix lengths (i, j).
Memoization algorithm (not "memorization"; a memo records which subproblems are already solved, so each distinct subproblem is computed only once)
O(mn) time and space complexity
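A minimal memoized sketch in Python, assuming 0-indexed strings; the names lcs_memo, rec, and memo are illustrative, not from the lecture:

def lcs_memo(x, y):
    # memo[(i, j)] caches the LCS length of the prefixes x[:i] and y[:j]
    memo = {}
    def rec(i, j):
        if i == 0 or j == 0:
            return 0                      # an empty prefix has an empty LCS
        if (i, j) not in memo:            # compute each distinct subproblem once
            if x[i - 1] == y[j - 1]:
                memo[(i, j)] = rec(i - 1, j - 1) + 1
            else:
                memo[(i, j)] = max(rec(i - 1, j), rec(i, j - 1))
        return memo[(i, j)]
    return rec(len(x), len(y))

There are only mn distinct (i, j) pairs and each is solved in O(1) time after its sub-calls return, which gives the O(mn) time and space bound above.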
Dynamic programming:
Idea: compute the table of c[i, j] values bottom-up
Reconstruct an LCS by tracing backwards through the table
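A bottom-up sketch of this idea in Python (the function name lcs_dp is illustrative; the traceback's tie-breaking is one arbitrary but valid choice):

def lcs_dp(x, y):
    m, n = len(x), len(y)
    # c[i][j] = length of an LCS of x[:i] and y[:j]
    c = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if x[i - 1] == y[j - 1]:
                c[i][j] = c[i - 1][j - 1] + 1
            else:
                c[i][j] = max(c[i - 1][j], c[i][j - 1])
    # Trace backwards from c[m][n] to reconstruct one LCS
    out, i, j = [], m, n
    while i > 0 and j > 0:
        if x[i - 1] == y[j - 1]:
            out.append(x[i - 1])          # matched character is part of the LCS
            i, j = i - 1, j - 1
        elif c[i - 1][j] >= c[i][j - 1]:  # follow whichever neighbor gave the max
            i -= 1
        else:
            j -= 1
    return "".join(reversed(out))

For example, lcs_dp("ABCBDAB", "BDCABA") returns "BCBA", one LCS of length 4; other length-4 subsequences are equally valid, since the LCS is generally not unique.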
Notes of MIT OCW: Introduction to Algorithms
http://ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-046j-introduction-to-algorithms-sma-5503-fall-2005/