This page summarizes the lectures and lists the suggested reading assignments.
The summary is quite brief; it is intended to help the student and
instructor recollect what was covered, and it is no substitute for participating
in the lectures. We will be using Jeff Erickson's notes.
Week 1
- Meeting 1: Stable Matching. Section 0.5 (Lecture 0) of Jeff Erickson's notes.
  This discussion will help clarify the "prerequisites" for this course.
  Also read Section 0.6 for a sense of what this course is really about.
- Meeting 2: Completing the stable matching discussion. Reduction and
Recursion (1.1 and 1.2), Towers of Hanoi (1.3),
Mergesort (1.4), and Quicksort (1.5).
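To make the divide-and-conquer pattern from Meeting 2 concrete, here is a minimal Python sketch of mergesort (1.4); the function name and the small test input are illustrative choices, not the pseudocode from the notes.

```python
def merge_sort(a):
    """Sort a list by recursively sorting both halves and merging them (cf. 1.4)."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    # Merge the two sorted halves in linear time.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 4, 7, 1, 3, 2, 6]))  # [1, 2, 2, 3, 4, 5, 6, 7]
```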
Week 2
- Meeting 1: Run-time analysis of Mergesort and Quicksort, and introduction to Selection (1.7).
- Meeting 2: The linear-time selection algorithm (1.7), and introduction to integer multiplication (1.8).
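For Meeting 2, a rough Python sketch of the linear-time selection idea from 1.7 (medians of groups of five, used as the pivot). It returns the k-th smallest element, 0-indexed; the names and the test input are illustrative, and this is a sketch rather than the notes' pseudocode.

```python
def select(a, k):
    """Return the k-th smallest element of a (0-indexed) via median of medians (cf. 1.7)."""
    if len(a) <= 5:
        return sorted(a)[k]
    # Median of each group of five elements.
    medians = [sorted(a[i:i + 5])[len(a[i:i + 5]) // 2] for i in range(0, len(a), 5)]
    # Recursively pick the median of the medians as a good pivot.
    pivot = select(medians, len(medians) // 2)
    less = [x for x in a if x < pivot]
    equal = [x for x in a if x == pivot]
    greater = [x for x in a if x > pivot]
    if k < len(less):
        return select(less, k)
    if k < len(less) + len(equal):
        return pivot
    return select(greater, k - len(less) - len(equal))

print(select([7, 1, 9, 3, 5, 8, 2, 6, 4], 4))  # 5, the median of 1..9
```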
Week 3
- Meeting 1: Karatsuba's sub-quadratic algorithm for integer multiplication (1.8), and a mention of Strassen's matrix multiplication algorithm. (A sketch of Karatsuba's recursion appears after this week's summary.)
- Meeting 2: More recursive thinking -- Lecture 3 on backtracking. Subset Sum (3.3), and starting on Longest Increasing Subsequence (3.6).
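For Meeting 1, a minimal Python sketch of Karatsuba's recursion (1.8): three recursive products of roughly half-length numbers instead of four. It works directly on Python integers and splits by decimal digits; the function name and test values are illustrative.

```python
def karatsuba(x, y):
    """Multiply nonnegative integers using three half-size recursive products (cf. 1.8)."""
    if x < 10 or y < 10:
        return x * y
    # Split both numbers around half the digits of the longer one.
    m = max(len(str(x)), len(str(y))) // 2
    xh, xl = divmod(x, 10 ** m)
    yh, yl = divmod(y, 10 ** m)
    high = karatsuba(xh, yh)
    low = karatsuba(xl, yl)
    # (xh + xl)(yh + yl) - high - low = xh*yl + xl*yh, the middle term.
    mid = karatsuba(xh + xl, yh + yl) - high - low
    return high * 10 ** (2 * m) + mid * 10 ** m + low

print(karatsuba(1234, 5678), 1234 * 5678)  # both 7006652
```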
Week 4
- Meeting 1: Completing the recursive thinking for Longest Increasing Subsequence. Recursion + memoization for the same, leading to Section 5.2.
- Meeting 2: Dynamic Programming (Lecture 5) for Longest Increasing Subsequence (5.2) and for subset sum. Recursive thinking for Max-Weight independent set in a graph.
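Here is a small Python sketch of the bottom-up dynamic program for Longest Increasing Subsequence discussed in Meetings 1 and 2 (cf. 5.2); best[i] records the length of the longest increasing subsequence ending at position i. The quadratic-time version is shown for simplicity, and the names and test input are illustrative.

```python
def longest_increasing_subsequence(a):
    """Length of the longest increasing subsequence, O(n^2) dynamic program (cf. 5.2)."""
    if not a:
        return 0
    # best[i] = length of the longest increasing subsequence ending at a[i].
    best = [1] * len(a)
    for i in range(len(a)):
        for j in range(i):
            if a[j] < a[i]:
                best[i] = max(best[i], best[j] + 1)
    return max(best)

print(longest_increasing_subsequence([3, 1, 4, 1, 5, 9, 2, 6]))  # 4, e.g. 3, 4, 5, 9
```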
Week 5
- Meeting 1: Dynamic programming for max-weight independent set in a tree. This is similar to 5.7, where we look at size instead of weight. Recursive thinking for Edit Distance.
- Meeting 2: Recursive thinking for edit distance, continued. This is similar to 5.5, where the recursion is based on a characterization of edit distance in terms of alignments; our discussion did not use the notion of alignment. Dynamic programming for Edit Distance.
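A compact Python sketch of the edit-distance dynamic program from Meeting 2 (cf. 5.5); dist[i][j] is the edit distance between the first i characters of x and the first j characters of y, and the table is filled row by row. The names and the example strings are illustrative.

```python
def edit_distance(x, y):
    """Minimum number of insertions, deletions, and substitutions turning x into y (cf. 5.5)."""
    m, n = len(x), len(y)
    # dist[i][j] = edit distance between x[:i] and y[:j].
    dist = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dist[i][0] = i          # delete all of x[:i]
    for j in range(n + 1):
        dist[0][j] = j          # insert all of y[:j]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            sub = 0 if x[i - 1] == y[j - 1] else 1
            dist[i][j] = min(dist[i - 1][j] + 1,        # delete x[i-1]
                             dist[i][j - 1] + 1,        # insert y[j-1]
                             dist[i - 1][j - 1] + sub)  # match or substitute
    return dist[m][n]

print(edit_distance("food", "money"))  # 4
```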
Week 6
- Meeting 1: Starting Randomized Algorithms. Protocol for Contention Resolution. This discussion is not from Jeff Erickson's notes. See posted notes on ICON.
- Meeting 2: Randomized minimum cut, from Lecture 14 of Jeff's notes.
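For Meeting 2, a rough Python sketch of the contraction algorithm from Lecture 14: repeatedly contract a uniformly random remaining edge until only two super-vertices are left, and repeat the experiment many times, keeping the smallest cut seen. The sketch assumes a connected graph given as a list of edges; the representation and the number of trials are illustrative choices.

```python
import random

def karger_min_cut(edges, trials=200):
    """Estimate the size of a global minimum cut by repeated random edge contraction."""
    vertices = {v for e in edges for v in e}
    best = float("inf")
    for _ in range(trials):
        label = {v: v for v in vertices}      # label[v] = super-vertex containing v
        groups = len(vertices)
        work = list(edges)                    # edges whose endpoints are in different super-vertices
        while groups > 2:
            u, v = random.choice(work)        # contract a uniformly random remaining edge
            lu, lv = label[u], label[v]
            for w in vertices:                # merge the super-vertex of v into that of u
                if label[w] == lv:
                    label[w] = lu
            groups -= 1
            # discard edges that became self-loops inside a super-vertex
            work = [(a, b) for (a, b) in work if label[a] != label[b]]
        best = min(best, len(work))           # edges crossing the two remaining super-vertices
    return best

# A 4-cycle has minimum cut 2.
print(karger_min_cut([(1, 2), (2, 3), (3, 4), (4, 1)]))
```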
Week 7
- Meeting 1: We introduce random variables, expectation, and linearity of expectation. We calculate expectations in some simple examples.
- Meeting 2: The expected running time of quicksort. This roughly corresponds to Lecture 9.
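The core of Meeting 2's calculation is linearity of expectation over indicator variables. Writing the elements in sorted order as a_1 < a_2 < ... < a_n, one standard way to state the bound is:

```latex
\mathbb{E}[\#\text{comparisons}]
  \;=\; \sum_{i < j} \Pr[\, a_i \text{ and } a_j \text{ are compared} \,]
  \;=\; \sum_{i < j} \frac{2}{j - i + 1}
  \;\le\; \sum_{k=2}^{n} \frac{2n}{k}
  \;=\; O(n \log n).
```

Here 2/(j - i + 1) is the probability that one of a_i, a_j is the first pivot chosen among a_i, ..., a_j.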
Week 8
- Meeting 1: The faster algorithm for randomized minimum cut, from Lecture 14 of Jeff's notes.
- Meeting 2: Hash Tables, a way of obtaining constant expected time for insert/delete/search operations with integer keys. Here we make no assumptions on the input data, but randomize the choice of the hash function. Introduction to universal hash families. Lecture 12 of Jeff's notes.
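For Meeting 2, a small Python sketch of the idea that the randomness lives in the choice of the hash function, not in the data: drawing one function from the prime multiplicative family h_a(x) = ((a*x) mod p) mod m (the family revisited in Week 9). The prime p = 2^31 - 1 and the table size are illustrative choices, and keys are assumed to be nonnegative integers smaller than p.

```python
import random

def make_multiplicative_hash(m, p=2**31 - 1):
    """Draw h(x) = ((a*x) mod p) mod m from the prime multiplicative family.

    p is a prime larger than every key, and a is uniform in 1..p-1; the random
    choice of a (not any assumption about the keys) is what gives near-universality.
    """
    a = random.randrange(1, p)
    return lambda x: (a * x % p) % m

h = make_multiplicative_hash(m=8)           # one random function into 8 buckets
print([h(x) for x in (12, 2024, 77, 360)])  # four bucket indices in 0..7
```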
Week 9
- Meeting 1: Further discussion of hashing, and an explanation of why the prime multiplicative hashing from Section 12.5.1 is near-universal.
- Meeting 2: The max-flow and min-cut problems. Ford-Fulkerson algorithm
for computing max-flow. Lecture 23 of Jeff Erickson's notes.
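For Meeting 2, a compact Python sketch of Ford-Fulkerson on integer capacities: repeatedly find an s-t path with positive residual capacity and push the bottleneck amount of flow along it. The sketch picks augmenting paths by breadth-first search (the Edmonds-Karp rule) for simplicity; the graph representation (a dict of dicts of capacities) and the small example are illustrative choices.

```python
from collections import deque

def max_flow(capacity, s, t):
    """Ford-Fulkerson with BFS-chosen augmenting paths on a dict-of-dicts capacity graph."""
    # residual[u][v] = remaining capacity of edge u->v (reverse edges start at 0)
    residual = {u: dict(nbrs) for u, nbrs in capacity.items()}
    for u, nbrs in capacity.items():
        for v in nbrs:
            residual.setdefault(v, {}).setdefault(u, 0)
    flow = 0
    while True:
        # Find any s->t path with positive residual capacity.
        parent = {s: None}
        queue = deque([s])
        while queue and t not in parent:
            u = queue.popleft()
            for v, cap in residual[u].items():
                if cap > 0 and v not in parent:
                    parent[v] = u
                    queue.append(v)
        if t not in parent:
            return flow                       # no augmenting path: current flow is maximum
        # Compute the bottleneck of the path, then push that much flow along it.
        path, v = [], t
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        push = min(residual[u][v] for u, v in path)
        for u, v in path:
            residual[u][v] -= push
            residual[v][u] += push
        flow += push

graph = {'s': {'a': 3, 'b': 2}, 'a': {'b': 1, 't': 2}, 'b': {'t': 3}, 't': {}}
print(max_flow(graph, 's', 't'))  # 5
```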
Week 10
- Meeting 1: Proofs -- the flow value lemma, weak duality, and the correctness of Ford-Fulkerson; the algorithm also easily yields a minimum s-t cut. Running time considerations. (The key inequalities are stated after this week's summary.)
- Meeting 2: Midterm.
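The inequalities behind Meeting 1, in the usual notation: for any feasible s-t flow f and any s-t cut (S, T), the flow value lemma and weak duality combine to give

```latex
|f| \;=\; \sum_{e \text{ out of } S} f(e) \;-\; \sum_{e \text{ into } S} f(e)
     \;\le\; \sum_{e \text{ out of } S} c(e) \;=\; \operatorname{cap}(S, T).
```

Ford-Fulkerson terminates with a flow and a cut for which this holds with equality, which certifies that both are optimal.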
Week 11
- Meeting 1: Capacity scaling algorithm for maximum flow, as a way of improving the running time when we have large integer capacities. A similar, though not identical, algorithm is analyzed in Section 23.6.1.
- Meeting 2: Maximum matching in bipartite graphs. An algorithm similar to the one we studied is presented in Section 24.3.
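For Meeting 2, a short Python sketch in the spirit of the augmenting-path approach: repeatedly try to match each left vertex by finding an alternating path that may reassign already-matched right vertices. The input format (a dict from left vertices to lists of right neighbors), the names, and the small example are illustrative choices, not the notes' presentation.

```python
def bipartite_max_matching(adj):
    """Maximum matching size in a bipartite graph given as adj[left_vertex] = right neighbors."""
    match_right = {}                 # right vertex -> left vertex it is currently matched to

    def try_augment(u, visited):
        """Search for an augmenting path starting at the currently unmatched left vertex u."""
        for v in adj[u]:
            if v in visited:
                continue
            visited.add(v)
            # v is free, or its current partner can be re-matched elsewhere.
            if v not in match_right or try_augment(match_right[v], visited):
                match_right[v] = u
                return True
        return False

    return sum(try_augment(u, set()) for u in adj)

adj = {'u1': ['v1', 'v2'], 'u2': ['v1'], 'u3': ['v2', 'v3']}
print(bipartite_max_matching(adj))  # 3
```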
Week 12
- Meeting 1: Image Segmentation, an application of minimum cuts. Notes on ICON.
- Meeting 2: The propositional satisfiability problem. Reducing the problem of testing whether a graph is bipartite to 2CNF-SAT. Reducing Longest Increasing Subsequence to Longest Path in graphs. These reductions are a prelude to Lecture 30 on NP-hardness.
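For Meeting 2's first reduction, a tiny Python sketch of one natural encoding: introduce a Boolean variable per vertex (its side of the bipartition) and, for every edge (u, v), add the clauses (x_u OR x_v) and (NOT x_u OR NOT x_v), which force u and v onto opposite sides; the graph is bipartite exactly when the resulting 2-CNF formula is satisfiable. The literal representation below is just a convenient choice.

```python
def bipartiteness_to_2sat(edges):
    """Build a 2-CNF formula that is satisfiable iff the graph is bipartite.

    A literal is (vertex, True) for x_v and (vertex, False) for not-x_v.
    Each edge (u, v) contributes (x_u or x_v) and (not x_u or not x_v),
    i.e. u and v must receive different truth values (different sides).
    """
    clauses = []
    for u, v in edges:
        clauses.append(((u, True), (v, True)))
        clauses.append(((u, False), (v, False)))
    return clauses

# A triangle is not bipartite, so this formula is unsatisfiable.
for clause in bipartiteness_to_2sat([(1, 2), (2, 3), (3, 1)]):
    print(clause)
```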
Week 13
- Meeting 1: Reducing CNF-SAT to maximum independent set. Turing/Oracle reductions and polynomial time (Cook) reductions. Decision problems. The class NP, and examples of problems in NP. Most of this is in Lecture 30 of Jeff's notes.
- Meeting 2: More examples of problems in NP. P, NP, and the significance of NP-completeness.
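To make membership in NP concrete for this week's meetings, here is a minimal Python sketch of a polynomial-time verifier for one example problem, Independent Set: given the graph, a target size k, and a proposed certificate (a set of vertices), it checks the certificate efficiently; a yes-instance is one for which some certificate passes this check. The names and the 4-cycle example are illustrative.

```python
def verify_independent_set(edges, k, certificate):
    """Polynomial-time verifier: does `certificate` witness an independent set of size >= k?"""
    s = set(certificate)
    if len(s) < k:
        return False
    # No edge may have both endpoints inside the certificate.
    return all(not (u in s and v in s) for u, v in edges)

edges = [(1, 2), (2, 3), (3, 4), (4, 1)]
print(verify_independent_set(edges, 2, [1, 3]))  # True: {1, 3} is independent in the 4-cycle
print(verify_independent_set(edges, 2, [1, 2]))  # False: 1 and 2 are adjacent
```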
Week 14
- Meeting 1: Many-one reductions and polynomial time (Karp) reductions. NP-hardness and NP-completeness. First NP-complete problem: 3CNF-SAT. Using the NP-hardness of 3CNF-SAT to show that independent set is NP-complete. (The construction is sketched after this week's summary.)
- Meeting 2: NP-completeness of vertex cover and set cover.
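For Meeting 1, a Python sketch of the standard construction behind the reduction: one vertex per literal occurrence, an edge between every pair of literals in the same clause and between every pair of complementary literals, so the formula is satisfiable exactly when the graph has an independent set of size equal to the number of clauses. The encoding of literals and clauses below is just one convenient choice.

```python
def cnf_to_independent_set(clauses):
    """Build (vertices, edges, k) so the formula is satisfiable iff the graph
    has an independent set of size k = number of clauses.

    A literal is (variable, is_positive); a vertex is (clause_index, literal)."""
    vertices = [(i, lit) for i, clause in enumerate(clauses) for lit in clause]
    edges = []
    for a in range(len(vertices)):
        for b in range(a + 1, len(vertices)):
            (ci, (va, sa)), (cj, (vb, sb)) = vertices[a], vertices[b]
            same_clause = (ci == cj)
            complementary = (va == vb and sa != sb)
            if same_clause or complementary:
                edges.append((vertices[a], vertices[b]))
    return vertices, edges, len(clauses)

# (x or y) and (not x or not y): satisfiable, e.g. x = True, y = False.
clauses = [[('x', True), ('y', True)], [('x', False), ('y', False)]]
vertices, edges, k = cnf_to_independent_set(clauses)
print(k, len(vertices), len(edges))  # 2 clauses, 4 vertices, 4 edges
```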
Week 15
- Meeting 1: Lecture Planning is NP-complete. A few words on how 3CNF-SAT is shown to be NP-complete.
- Meeting 2: Algorithms for NP-complete problems: a very brief discussion.