CS170 Review

Runtime Analysis

  1. Notation
    $\Omega(\cdot)$ (lower bound), $\Theta(\cdot)$ (tight bound), $O(\cdot)$ (upper bound)
  2. The master theorem
    $T(n) = a T(\frac{n}{b}) + O(n^d \log^k n)$
    Runtime:
    $O(n^d \log^k n)$ when $d > \log_b a$
    $O(n^d \log^{k+1} n)$ when $d = \log_b a$
    $O(n^{\log_b a})$ when $d < \log_b a$
  3. Typical Examples
    Mergesort, finding the median, and the fast Fourier transform (FFT).
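The master theorem cases above can be checked mechanically. A minimal sketch (the function name and output format are mine, shown here for the $k = 0$ form of the recurrence):

```python
import math

def master_theorem(a, b, d):
    """Classify T(n) = a*T(n/b) + O(n^d) by the master theorem (k = 0 case)."""
    c = math.log(a, b)  # critical exponent log_b(a)
    if d > c:
        return f"O(n^{d})"
    elif abs(d - c) < 1e-12:
        return f"O(n^{d} log n)"
    else:
        return f"O(n^{c:.2f})"

# Mergesort: T(n) = 2T(n/2) + O(n) -> a=2, b=2, d=1, and d = log_2 2,
# so the middle case applies: O(n log n).
print(master_theorem(2, 2, 1))
```

Binary search ($a=1$, $b=2$, $d=0$) falls into the same middle case, giving $O(\log n)$.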

Randomized Algorithm

There are two types of randomized algorithms: Las Vegas algorithms and Monte Carlo algorithms. A Las Vegas algorithm always outputs the correct answer, but may have a large runtime with small probability. A Monte Carlo algorithm always has a low runtime, but may return a wrong answer with small probability.

Typical examples include quicksort, universal hashing, Bloom filters, and streaming algorithms.
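Randomized quicksort is the standard Las Vegas example: the output is always correctly sorted, and the random pivot makes the $O(n^2)$ worst case unlikely (expected $O(n \log n)$). A minimal sketch:

```python
import random

def quicksort(xs):
    """Las Vegas: always correct; only the runtime is random."""
    if len(xs) <= 1:
        return xs
    pivot = random.choice(xs)  # random pivot defeats adversarial inputs
    left = [x for x in xs if x < pivot]
    mid = [x for x in xs if x == pivot]
    right = [x for x in xs if x > pivot]
    return quicksort(left) + mid + quicksort(right)
```

A Monte Carlo algorithm would instead fix the runtime and accept a small error probability (e.g. a Bloom filter's false positives).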

Graph

  1. Adjacency List and Adjacency Matrix Representations
  2. Depth-First Search
    1. A directed graph has a cycle if and only if its depth-first search reveals a back edge.
    2. Topological Sort: In a dag, every edge leads to a vertex with a lower post number, so listing vertices by decreasing post number gives a topological order.
    3. Finding Strongly Connected Components: Run DFS on the reversed graph, then run DFS on the original graph, starting each pass from the unvisited vertex with the highest post number; each DFS tree is an SCC.
  3. Breadth-First Search
    1. Dijkstra’s Algorithm: Runtime $O(|V| \log |V| + |E|)$ with a Fibonacci heap; requires non-negative edge weights.
    2. Bellman-Ford Algorithm: Runtime $O(|V||E|)$. Works with negative edge weights and can detect negative cycles.
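A minimal Dijkstra sketch using a binary heap (Python's `heapq`), which gives $O((|V| + |E|) \log |V|)$ rather than the Fibonacci-heap bound; the adjacency-list format here is my own choice:

```python
import heapq

def dijkstra(adj, s):
    """Single-source shortest paths with non-negative weights.
    adj: {u: [(v, w), ...]} adjacency list; returns {vertex: distance}."""
    dist = {s: 0}
    pq = [(0, s)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry; a shorter path was already found
        for v, w in adj.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist
```

With negative edge weights this fails, which is exactly when Bellman-Ford is needed.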

Greedy Algorithm

  1. Greedy can be viewed as a special case of dynamic programming. There are two main ways to prove the correctness of a greedy approach: a "greedy stays ahead" argument or an exchange argument.
  2. MST:
    Kruskal’s algorithm: Union-Find; correctness via the cut property. Runtime $O(|E| \log |E|)$. Space: $O(|V|)$
    Prim’s algorithm: Runtime $O(|V| \log |V| + |E|)$ with a Fibonacci heap. Space: $O(|V|)$
  3. Other examples:
    Huffman Encoding, Horn Formula and Set Cover.
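A minimal Kruskal sketch with union-find (the function names and edge format are mine): sort edges by weight and add each one whose endpoints lie in different components, which the cut property shows is always safe.

```python
def kruskal(n, edges):
    """MST via Kruskal. edges: list of (w, u, v), vertices 0..n-1.
    Returns (total weight, list of MST edges)."""
    parent = list(range(n))

    def find(x):
        # Union-Find with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    total, mst = 0, []
    for w, u, v in sorted(edges):          # O(|E| log |E|) sort dominates
        ru, rv = find(u), find(v)
        if ru != rv:                       # cut property: safe to add
            parent[ru] = rv
            total += w
            mst.append((u, v))
    return total, mst
```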

Dynamic Programming

  1. For $x_1, x_2, …, x_n$, the subproblem is $x_1, x_2, …, x_i$.
  2. For $x_1, x_2, …, x_n$, the subproblem is $x_i, x_{i+1}, …, x_j$.
  3. For $x_1, x_2, …, x_n; y_1, y_2, …, y_n$, the subproblem is $x_1,…, x_i, y_1,…, y_j$.
  4. For a tree problem, the subproblem is the subtree rooted at node $i$, typically expressed in terms of $i$'s children and grandchildren.
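Pattern 3 is exactly the shape of longest common subsequence: the subproblem is a pair of prefixes $x_1, …, x_i$ and $y_1, …, y_j$. A minimal sketch:

```python
def lcs_length(xs, ys):
    """Longest common subsequence length.
    dp[i][j] = LCS length of the prefixes xs[:i] and ys[:j]."""
    m, n = len(xs), len(ys)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if xs[i - 1] == ys[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1  # extend the match
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])  # drop one symbol
    return dp[m][n]
```

Pattern 1 covers problems like longest increasing subsequence, and pattern 2 covers interval problems like matrix chain multiplication.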

Linear Programming

Min-max duality; max-flow problems (max-flow equals min-cut by LP duality).
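A compact statement of LP duality in standard form (strong duality: the two optima are equal whenever both programs are feasible and bounded):

$$\max \{ c^T x : Ax \le b,\ x \ge 0 \} \;=\; \min \{ b^T y : A^T y \ge c,\ y \ge 0 \}$$

Max-flow and min-cut are such a primal/dual pair, which is why the max-flow value equals the min-cut capacity.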

Any problem solvable in polynomial time can be reduced to a Boolean combinational circuit (the circuit value problem), and the circuit value problem can be reduced to an LP. Thus every problem in P reduces to linear programming.

NP Problems

  1. Some definitions
    1. Search Problem: Any proposed solution can be checked for correctness in polynomial time
    2. P: Search problems that can be solved in polynomial time
    3. NP: All search problems
    4. NP-hard: problems that all NP problems can be reduced to
    5. NP-complete: problems that are both in NP and NP-hard
  2. Reduction:
    All NP -> CSAT -> SAT -> 3SAT -> Independent Set and 3D Matching
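The defining property of a search problem (checkable in polynomial time) is easy to see for Independent Set: verifying a proposed solution is fast even though finding one is believed to be hard. A minimal verifier sketch (function name and input format are mine):

```python
def is_independent_set(edges, subset, k):
    """Polynomial-time verifier for Independent Set: does `subset`
    contain at least k vertices, no two of which share an edge?"""
    s = set(subset)
    if len(s) < k:
        return False
    # Check every edge: no edge may have both endpoints inside s.
    return all(not (u in s and v in s) for u, v in edges)
```

This one-sided asymmetry, fast verification versus (apparently) slow search, is exactly the P vs. NP question.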

Local Improvement

  1. Backtracking: Construct subproblems that can be checked quickly, and prune branches that cannot lead to a solution.
  2. Simulated Annealing
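A minimal simulated-annealing sketch (parameter names and the geometric cooling schedule are my choices): accept every improving move, accept a worsening move with probability $e^{-\Delta/T}$, and cool $T$ over time so the search settles down.

```python
import math
import random

def simulated_annealing(cost, neighbor, x0, t0=1.0, cooling=0.95, steps=1000):
    """Minimize `cost` by local moves from `neighbor`, occasionally
    accepting uphill moves to escape local minima."""
    x, t = x0, t0
    best = x0
    for _ in range(steps):
        y = neighbor(x)
        delta = cost(y) - cost(x)
        if delta < 0 or random.random() < math.exp(-delta / t):
            x = y  # accept: always if better, probabilistically if worse
            if cost(x) < cost(best):
                best = x
        t *= cooling  # geometric cooling schedule
    return best

# Toy usage: minimize (x - 3)^2 over the integers, starting from 0
random.seed(0)
result = simulated_annealing(lambda x: (x - 3) ** 2,
                             lambda x: x + random.choice([-1, 1]),
                             x0=0)
```

High temperature early on means near-random exploration; as $T \to 0$ the process degenerates into plain hill climbing.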