Algorithms and complexity, the practical view
Big-O without the academic framing. The 12 algorithm patterns that solve 80% of interview problems and most of the real ones.
Prerequisites
02.1 (DSA, built not memorized)
Stack
Python 3.12 · pytest · your hand-rolled DSA from 02.1
By the end of this module
- Recognize the algorithmic pattern from a problem statement, before writing any code.
- Reach for the right pattern in roughly 30 seconds on the 12 that recur.
- Reason about Big-O the way working engineers do — worst, average, amortized, and the constants.
- Solve 25 hand-picked problems mapped to the patterns, with tests.
The traditional algorithms course teaches you to memorize a hundred named algorithms. Most of them are dead weight. In real engineering — and in most interview rooms — what matters is pattern recognition: looking at a problem and within a minute knowing it’s a sliding window, or a topological sort, or a binary-search-on-the-answer. Once you’ve named the pattern, the code is mostly mechanical.
This module gives you the small set of patterns that actually recur, with one canonical example each, and a curated set of 25 problems mapped to them. It is not a LeetCode marathon. The goal is to wire the pattern-matching circuitry, not to grind 500 problems and forget them.
If you skipped 02.1 — go back. The patterns in this module assume you can already build and reason about hash maps, heaps, queues, stacks, and graphs. You will have a bad time otherwise.
Set up
```shell
mkdir algos && cd algos
uv venv .venv && source .venv/bin/activate
uv pip install pytest hypothesis
mkdir patterns tests
```
One file per pattern in patterns/, with tests next to it. The discipline of one-file-per-pattern is what trains the recognition reflex.
Read these first
In this order. Stop reading once you can answer the checkpoints — do not collect resources for their own sake.
- Sedgewick & Wayne — Algorithms (4th ed.). book site · 3–4 hours skim · the cleanest treatment of complexity and the canonical algorithms. Chapter 1 is your Big-O grounding.
- MIT 6.006 — Introduction to Algorithms. course · 2–3 hours, lectures 1, 2, 7, 11, 13, 21 · proofs only when proofs matter.
- NeetCode — Roadmap. site · ignore the marketing, use the taxonomy. The 12-pattern view in this module mirrors his groups.
- Jeff Erickson — Algorithms. free PDF · pick this up only if you want a more rigorous, prose-heavy alternative to Sedgewick.
After roughly six hours of reading you have diminishing returns. Close the books. Start solving.
Big-O without the academic apparatus
The textbook spends three weeks on formal asymptotics. You need three paragraphs.
Big-O is the upper bound on growth as input size goes to infinity, ignoring constants. It tells you how an algorithm scales. It does not tell you whether it’s fast.
Three flavors you actually use:
- Worst case — the bound under any input. This is what most interviewers mean by “Big-O” without qualifying.
- Average case — the expected bound under a stated distribution. Useful for hash maps (O(1) avg, O(n) worst).
- Amortized — the bound averaged over a sequence of operations. Useful for dynamic arrays (O(1) amortized append, even though some appends trigger an O(n) resize).
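The dynamic-array case is worth seeing once rather than taking on faith. A minimal sketch (a toy class, not Python's actual `list` internals): doubling capacity makes each individual resize O(n), but the total work copied across n appends stays under 2n, so the per-append average is O(1).

```python
class DynamicArray:
    """Toy dynamic array: doubling growth gives O(1) amortized append."""

    def __init__(self):
        self._cap = 1
        self._n = 0
        self._buf = [None] * self._cap
        self.copies = 0  # total elements moved by resizes, for inspection

    def append(self, x):
        if self._n == self._cap:       # full: this append pays O(n) to resize...
            self._cap *= 2
            new = [None] * self._cap
            for i in range(self._n):
                new[i] = self._buf[i]
                self.copies += 1
            self._buf = new
        self._buf[self._n] = x         # ...but the common case is a single write
        self._n += 1

arr = DynamicArray()
for i in range(1024):
    arr.append(i)
# Resizes copy 1 + 2 + 4 + ... + 512 = 1023 elements total: under 2n,
# so the amortized cost per append is a constant.
assert arr.copies < 2 * 1024
```

Note the flip side for the latency-focused: amortized O(1) still means the *occasional* append costs O(n), which is exactly what checkpoint 3 below is probing.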
What the textbook hides: constants matter enormously up to roughly n = 10^6. An O(n log n) algorithm with a 50× constant loses to an O(n^2) algorithm with a tiny constant when n = 1000. Cache locality and branch prediction, not asymptotics, decide most real performance fights. (Module 02.7 makes this concrete.)
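You can measure this yourself rather than trust the claim. A sketch: a hand-written insertion sort (quadratic, but a tight loop with no allocation) against a hand-written merge sort (n log n, but recursion and slicing carry a heavy constant in Python). Timings vary by machine, so measure, don't assume — on small inputs the quadratic sort frequently wins.

```python
import random
import timeit

def insertion_sort(a):
    """O(n^2) worst case, but a tight inner loop and no allocation."""
    a = a[:]
    for i in range(1, len(a)):
        x, j = a[i], i - 1
        while j >= 0 and a[j] > x:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = x
    return a

def merge_sort(a):
    """O(n log n), but recursion and list slicing carry a big constant."""
    if len(a) <= 1:
        return a[:]
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]

data = [random.randrange(1000) for _ in range(50)]
assert insertion_sort(data) == merge_sort(data) == sorted(data)

# Time both at n = 50; the asymptotically worse algorithm often wins here.
for fn in (insertion_sort, merge_sort):
    t = timeit.timeit(lambda: fn(data), number=2000)
    print(f"{fn.__name__}: {t:.3f}s for 2000 runs of n=50")
```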
A working rule of thumb for problem sizing:
| n is at most | Algorithm class that fits in 1 second |
|---|---|
| 10 | O(n!) backtracking |
| 20 | O(2^n) bitmask DP |
| 500 | O(n^3) Floyd-Warshall, dense DP |
| 5,000 | O(n^2) two-loop scans, 2D DP |
| 10^6 | O(n log n) sorts, segment trees |
| 10^8 | O(n) linear scans only |
Memorize this table. It is the single most useful thing in competitive programming and a surprising amount of real engineering.
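If it helps the memorization, the table collapses to a few threshold pairs. A sketch of it as a sanity-check helper (the function name and exact thresholds are this module's rule of thumb, not a formal bound):

```python
def feasible_classes(n: int) -> list[str]:
    """Rule of thumb: complexity classes that fit a ~1-second budget at size n."""
    budget = [
        (10, "O(n!)"),
        (20, "O(2^n)"),
        (500, "O(n^3)"),
        (5_000, "O(n^2)"),
        (10**6, "O(n log n)"),
        (10**8, "O(n)"),
    ]
    return [cls for limit, cls in budget if n <= limit]

# At n = 1000, quadratic and better are in play; cubic and worse are out.
assert feasible_classes(1_000) == ["O(n^2)", "O(n log n)", "O(n)"]
```

Reading the constraints section of a problem and running this mapping in your head is often enough to rule out all but one or two patterns before you write a line.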
The 12 patterns that recur
For each: a one-line signature (the phrasing in the problem that screams this pattern), one canonical example, and one note on when it’s the wrong answer.
| # | Pattern | Signature | Canonical example | Don’t reach for it when |
|---|---|---|---|---|
| 1 | Two pointers | “in a sorted array / linked list” | 2-sum sorted, remove dups | input isn’t ordered or pairable |
| 2 | Sliding window | “longest/shortest subarray with property X” | longest substring with k distinct chars | property doesn’t compose monotonically |
| 3 | Binary search on answer | “min/max X such that predicate holds” | min capacity to ship in D days | predicate isn’t monotonic |
| 4 | Prefix sums | “range sum / count over many queries” | subarray sum equals K | array is mutating between queries |
| 5 | Monotonic stack/queue | “next greater/smaller element” | daily temperatures, largest rectangle | order doesn’t matter |
| 6 | BFS | “shortest path in unweighted graph” | word ladder, level-order | edges are weighted (use Dijkstra) |
| 7 | DFS | “explore all / detect cycle / count components” | number of islands | path memoization is needed (use DP) |
| 8 | Topological sort | “schedule with dependencies” | course schedule | graph has cycles |
| 9 | Union-find | “online connectivity / clusters merge over time” | accounts merge, Kruskal’s | you also need disconnects |
| 10 | DP (memo + tab) | “count ways / min cost over choices” | coin change, edit distance | greedy actually works |
| 11 | Greedy | “local optimum gives global optimum” | interval scheduling, Huffman | you have proof it doesn’t |
| 12 | Backtracking | “enumerate all valid configurations” | n-queens, sudoku, permutations | n is bigger than ~20 |
If you internalize this single table, you will outperform most CS undergrads on algorithms interviews. The skill is naming the pattern in the first 60 seconds, not coding fluency.
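To make one row of the table concrete, here is a sketch of pattern 2's canonical example — longest substring with at most k distinct characters. One plausible implementation, not the only one; the point is the invariant stated in the docstring.

```python
from collections import defaultdict

def longest_k_distinct(s: str, k: int) -> int:
    """Sliding window: longest substring with at most k distinct characters.

    The invariant "at most k distinct" composes monotonically: shrinking the
    window never breaks it. So two pointers that only ever move forward
    suffice, and each character enters and leaves the window once — O(n).
    """
    count = defaultdict(int)
    left = best = 0
    for right, ch in enumerate(s):
        count[ch] += 1
        while len(count) > k:          # invariant violated: shrink from the left
            count[s[left]] -= 1
            if count[s[left]] == 0:
                del count[s[left]]
            left += 1
        best = max(best, right - left + 1)
    return best

assert longest_k_distinct("eceba", 2) == 3   # "ece"
assert longest_k_distinct("aa", 1) == 2
```

Contrast with the table's warning: if the property were "sum of window equals exactly K" over an array with negative numbers, shrinking could both fix and break it, monotonicity is gone, and the sliding window is the tempting wrong answer.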
The 25 problems
Solve these in order. One per file. Test each with at least 4 cases (empty, single, typical, edge).
Two pointers → 1. 2-sum sorted, 2. container with most water, 3. trapping rain water
Sliding window → 4. longest substring no-repeat, 5. min window substring
Binary search on ans → 6. ship within D days, 7. koko bananas, 8. median of two sorted
Prefix sums → 9. subarray sum = k, 10. range sum query 2D
Monotonic stack → 11. daily temperatures, 12. largest rectangle in histogram
BFS → 13. rotting oranges, 14. word ladder
DFS → 15. number of islands, 16. clone graph
Topological sort → 17. course schedule II
Union-find → 18. accounts merge, 19. number of connected components
Dynamic programming → 20. house robber, 21. coin change, 22. edit distance, 23. LIS
Greedy → 24. jump game II
Backtracking → 25. n-queens
Aim for 30 to 60 minutes per problem. If you can’t see the pattern in five minutes, look up the pattern — never the solution. Then walk away from the lookup and re-derive the code yourself. Re-deriving from a hint is the practice that builds transfer; copying the answer builds nothing.
When you’re done, you have done more genuine algorithm practice than most CS graduates do in their entire degree. More importantly, you have built the index in your head — a problem statement now lights up the right pattern automatically.
Pattern recognition as a skill
The opinionated take of this module: pattern recognition matters more than algorithmic depth. Most working engineers will never implement a custom segment tree. They will, however, recognize “I need range queries that mutate” three or four times a year, and that recognition is the difference between a one-day fix and a one-week incident.
Train the recognition deliberately. After every problem you solve, write a single sentence at the top of the file: “This is a sliding window because the window’s invariant — at-most-k-distinct — is monotonic in window size.” That sentence is the lesson. The code is the byproduct.
Anti-pattern to watch for: students who can recite Dijkstra’s algorithm by heart but cannot tell you, given a problem, whether Dijkstra is the right tool. The whole point of this module is to fix that direction.
Going deeper
When you have specific questions, in this order:
- Sedgewick & Wayne — Algorithms (4th ed.). book · the deepest reference once you have intuition.
- Competitive Programmer’s Handbook (Laaksonen). free PDF · 300 pages, dense, pattern-first. Pair with the CSES problem set.
- CSES Problem Set. site · 300 problems sorted by topic. The best progression after the 25 above.
- Skiena — The Algorithm Design Manual. book · the war-stories chapter is the closest a textbook gets to real engineering.
- CLRS — only after the above. Reference, not a teacher.
Skip “100 LeetCode patterns in 5 hours” videos. They are the symptom of pattern memorization without depth.
Checkpoints
If any one wobbles, redo the corresponding problems from the 25.
- Given an input of size n = 10^7, which of the 12 patterns are off the table on a 1-second budget? Why?
- You see “find the longest contiguous subarray with sum at most K.” Name the pattern in one sentence and explain the invariant.
- Why is amortized O(1) the right bound for dynamic-array append, and what does that imply about a worst-case latency target of 1 ms per operation?
- You’re given a problem that looks like binary search but the predicate isn’t monotonic. What do you do?
- Pick any two patterns from the 12 and describe a problem statement where one is the right answer and the other is the tempting wrong one.
If you can answer all five from memory, you’ve earned 02.2. Move on to 02.3 (Discrete math, ranked by what shows up) — that’s the small slice of math that earns its place under the patterns you just learned.