-
Foundation
-
What's more important than performance
- Correctness
- Programmer Time
- Maintainability
- Robustness
- User-Friendliness
-
Random-Access Machine Model
- No concurrent operations
- Each instruction takes a constant amount of time
-
Notation
- Θ
- O
- Ω
- o
- Common Functions
-
Recurrence Analysis
-
Substitution Method
- 1. [Guess] the form of the solution
- 2. [Verify] by mathematical induction
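As a worked illustration of guess-and-verify (the recurrence T(n) = 2T(n/2) + n is a standard textbook example, not taken from these notes):

```latex
% Guess: T(n) = O(n \lg n), i.e. assume T(k) \le c\,k \lg k for k < n.
T(n) \le 2\left(c\,\tfrac{n}{2}\lg\tfrac{n}{2}\right) + n
     = c\,n \lg\tfrac{n}{2} + n
     = c\,n \lg n - c\,n \lg 2 + n
     = c\,n \lg n - c\,n + n
     \le c\,n \lg n \quad \text{whenever } c \ge 1.
```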
-
Recursion-Tree Method
-
Master Method
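This node is empty in the outline; for reference, the master theorem as commonly stated (CLRS form), for T(n) = aT(n/b) + f(n) with a ≥ 1, b > 1:

```latex
\text{Case 1: } f(n) = O\!\left(n^{\log_b a - \varepsilon}\right) \text{ for some } \varepsilon > 0
  \;\Rightarrow\; T(n) = \Theta\!\left(n^{\log_b a}\right) \\
\text{Case 2: } f(n) = \Theta\!\left(n^{\log_b a}\right)
  \;\Rightarrow\; T(n) = \Theta\!\left(n^{\log_b a} \lg n\right) \\
\text{Case 3: } f(n) = \Omega\!\left(n^{\log_b a + \varepsilon}\right) \text{ and } a f(n/b) \le c f(n) \text{ for some } c < 1
  \;\Rightarrow\; T(n) = \Theta\!\left(f(n)\right)
```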
- Probabilistic Analysis
-
Divide & Conquer
-
Divide Paradigm
- 1. [Divide] the problem into subproblems
- 2. [Conquer] subproblems by solving them recursively
- 3. [Combine] subproblems' solutions
-
Examples
- Merge Sort
- Binary Search
- Fibonacci Numbers
- Matrix Multiplication
- Closest Pair of Points
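The three-step paradigm above maps directly onto merge sort; a minimal sketch in Python (the name `merge_sort` is illustrative):

```python
def merge_sort(a):
    """Divide & conquer: split, recursively sort halves, merge."""
    if len(a) <= 1:                  # base case: already sorted
        return a
    mid = len(a) // 2                # 1. Divide into two halves
    left = merge_sort(a[:mid])       # 2. Conquer each half recursively
    right = merge_sort(a[mid:])
    # 3. Combine: merge the two sorted halves
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])
    out.extend(right[j:])
    return out
```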
-
Sorting Algorithms
- Quick Sort
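A minimal quicksort sketch (not in-place; taking the last element as pivot is just one common convention):

```python
def quicksort(a):
    """Partition around a pivot, then recursively sort each side."""
    if len(a) <= 1:
        return a
    pivot = a[-1]
    less = [x for x in a[:-1] if x <= pivot]
    greater = [x for x in a[:-1] if x > pivot]
    return quicksort(less) + [pivot] + quicksort(greater)
```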
-
Counting Sort
-
Radix Sort
-
Idea
- Digit-by-digit sort
- Bad idea: Sort on most-significant digit first
- Good idea: Sort on [least-significant digit first] with auxiliary [stable] sort
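The good idea above can be sketched as LSD radix sort for non-negative integers, with per-digit bucketing serving as the stable auxiliary sort:

```python
def radix_sort(nums, base=10):
    """Least-significant-digit radix sort for non-negative ints."""
    if not nums:
        return nums
    max_val = max(nums)
    exp = 1
    while max_val // exp > 0:
        # Stable pass on digit (n // exp) % base: elements keep
        # their relative order inside each bucket.
        buckets = [[] for _ in range(base)]
        for n in nums:
            buckets[(n // exp) % base].append(n)
        nums = [n for b in buckets for n in b]
        exp *= base
    return nums
```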
-
Correctness
- Analysis
- Bucket Sort
- Sorting Network
-
Dynamic Programming
-
DP vs Divide & Conquer
-
Similarities
- 1. Partition the problem into subproblems
- 2. Combine the solutions from subproblems
-
Differences
- 1. Overlapping subproblems vs no overlapping subproblems
- 2. Dynamic programming is typically applied to [optimization problems]
-
Steps of DP
- 1. Characterize the structure of an optimal solution
- 2. Recursively define the value of an optimal solution
- 3. Compute the value of an optimal solution in a bottom-up fashion
- 4. Construct an optimal solution from computed information
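The four steps appear in miniature in Fibonacci numbers (listed among the divide-and-conquer examples above): the recurrence F(i) = F(i-1) + F(i-2) defines the value of a solution (step 2), and a table filled bottom-up computes it (step 3):

```python
def fib_bottom_up(n):
    """Compute F(n) bottom-up, each subproblem solved exactly once."""
    if n < 2:
        return n
    table = [0] * (n + 1)   # table[i] holds F(i)
    table[1] = 1
    for i in range(2, n + 1):
        table[i] = table[i - 1] + table[i - 2]
    return table[n]
```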
-
Elements of DP
-
When to look for a DP solution to a problem?
- 1. Optimal substructure
- 2. Overlapping subproblems
-
How to discover optimal substructure?
- 1. Make a choice to split the problem into one or more subproblems
- 2. Just assume you are given the choice that leads to an optimal solution
- 3. Given this choice, try to best characterize the resulting space of subproblems
- 4. Show the subproblems chosen are optimal by using a [cut-and-paste] technique
-
Memoization
-
Idea
- 1. A memoized recursive algorithm maintains an entry in a table for the solution to each subproblem
- 2. When the subproblem is first encountered, its solution is computed and then stored in the table
- 3. Each subsequent time that the subproblem is encountered, just return the stored value in the table
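The three points above, as a top-down memoized Fibonacci sketch:

```python
def fib_memo(n, memo=None):
    """Top-down Fibonacci with a memo table keyed by subproblem."""
    if memo is None:
        memo = {}
    if n in memo:          # subsequent encounter: return stored value
        return memo[n]
    if n < 2:
        result = n
    else:
        result = fib_memo(n - 1, memo) + fib_memo(n - 2, memo)
    memo[n] = result       # first encounter: compute, then store
    return result
```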
-
When to use?
- If some subproblems in the subproblem space need not be solved at all, the memoized solution has the advantage of solving only subproblems that are definitely required
-
Examples
- Assembly-line scheduling
-
Matrix-chain multiplication
-
Fully parenthesized
- A product of matrices is [fully parenthesized] if it is either a single matrix or the product of two fully parenthesized matrix products, surrounded by parentheses.
-
Why parenthesized?
- Different costs incurred by different parenthesizations
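A quick check of how much parenthesization matters, using the illustrative chain dimensions 10x100, 100x5, 5x50 (multiplying a p x q matrix by a q x r matrix costs p*q*r scalar multiplications):

```python
def cost(p, q, r):
    """Scalar multiplications to multiply a p x q by a q x r matrix."""
    return p * q * r

# A1 is 10x100, A2 is 100x5, A3 is 5x50 (illustrative dimensions).
left_first = cost(10, 100, 5) + cost(10, 5, 50)     # ((A1 A2) A3)
right_first = cost(100, 5, 50) + cost(10, 100, 50)  # (A1 (A2 A3))
# left_first is 7500, right_first is 75000: a 10x difference.
```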
- LCS (Longest Common Subsequence)
- Optimal binary search trees
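For the LCS example, a bottom-up table sketch (computing only the length, not the subsequence itself):

```python
def lcs_length(x, y):
    """c[i][j] = length of an LCS of the prefixes x[:i] and y[:j]."""
    m, n = len(x), len(y)
    c = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if x[i - 1] == y[j - 1]:
                c[i][j] = c[i - 1][j - 1] + 1   # extend a common subsequence
            else:
                c[i][j] = max(c[i - 1][j], c[i][j - 1])
    return c[m][n]
```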
- Greedy Algorithms
- Amortized Analysis, Heaps
- Graph Algorithms
- String Matching
- NP-Completeness
- Approximation Algorithms