The Recursion Mindset
Recursion solves a problem by defining it in terms of a smaller version of the same problem. The three requirements: (1) Base case — the simplest input that can be solved directly (no recursion needed). (2) Recursive case — reduce the problem to a smaller instance and combine results. (3) Progress toward base case — each recursive call must move toward the base case (termination guarantee). Most dynamic programming problems are recursive problems with memoization added.
Fibonacci and Memoization
The classic recursion example: fib(n) = fib(n-1) + fib(n-2), fib(0) = 0, fib(1) = 1. Naive recursion is O(2^n) — exponential — because it recomputes the same subproblems. Memoization caches results:
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    if n <= 1:
        return n
    return fib(n-1) + fib(n-2)
With memoization, each unique input is computed once → O(n) time. This is top-down dynamic programming. The bottom-up DP equivalent: compute fib(0), fib(1), …, fib(n) iteratively, storing only the last two values for O(1) space.
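The bottom-up version described above can be sketched like this (function name fib_iter is an assumption):

```python
def fib_iter(n):
    # Bottom-up DP: keep only the last two Fibonacci values -> O(1) space.
    prev, curr = 0, 1  # fib(0), fib(1)
    for _ in range(n):
        prev, curr = curr, prev + curr
    return prev  # fib_iter(10) -> 55
```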
Tree Recursion Pattern
Fibonacci has two recursive calls — a binary tree of recursion. Many problems have this structure. The key is identifying the subproblem and the recombination:
Count ways to climb stairs (1 or 2 steps at a time): ways(n) = ways(n-1) + ways(n-2). Same as Fibonacci structure. Base cases: ways(0) = 1, ways(1) = 1.
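A minimal sketch of the stairs recurrence, identical in shape to fib (function name ways is an assumption):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def ways(n):
    # Base cases: ways(0) = ways(1) = 1.
    if n <= 1:
        return 1
    # Last move was either a 1-step or a 2-step.
    return ways(n - 1) + ways(n - 2)
```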
Generate all binary strings of length n: at each position, choose 0 or 1. Recurse with n-1 remaining positions. The recursion tree has 2^n leaves = 2^n strings. No memoization (all 2^n results are distinct).
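The binary-string generation above might look like this (a sketch; the list-comprehension style is one of several reasonable choices):

```python
def binary_strings(n):
    # Tree recursion: two branches (prepend '0' or '1') per position,
    # giving 2^n leaves. No memoization: every result is distinct.
    if n == 0:
        return ['']
    return [bit + rest for bit in '01' for rest in binary_strings(n - 1)]
```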
Coin change (count ways): ways(amount, coins) = sum of ways(amount – coin, coins) for each coin ≤ amount. Note this recurrence counts ordered sequences of coins; to count unordered combinations instead, recurse on (amount, index) so each coin is only considered from its index onward. With memoization: O(amount × |coins|).
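A sketch of that recurrence with memoization (it counts ordered sequences, e.g. 1+2 and 2+1 separately; count_sequences is an assumed name):

```python
from functools import lru_cache

def count_sequences(amount, coins):
    coins = tuple(coins)  # tuple so the closure is stable and hashable

    @lru_cache(maxsize=None)
    def ways(remaining):
        if remaining == 0:
            return 1  # base case: one way to make 0 (use no coins)
        # Branch on the last coin used; only coins that fit.
        return sum(ways(remaining - c) for c in coins if c <= remaining)

    return ways(amount)
```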
Classic Recursive Algorithms
Tower of Hanoi: move n disks from source to target using auxiliary peg. Move top n-1 disks from source to aux (recursive), move the nth disk from source to target (base step), move n-1 disks from aux to target (recursive). T(n) = 2T(n-1) + O(1) = O(2^n) — the minimum number of moves is exactly 2^n – 1 (provably optimal).
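The three-step decomposition can be sketched directly (recording moves as tuples is an assumption for illustration):

```python
def hanoi(n, source, target, aux, moves=None):
    # Returns the move list [(disk, from_peg, to_peg), ...]; 2^n - 1 moves.
    if moves is None:
        moves = []
    if n == 0:
        return moves                        # base case: nothing to move
    hanoi(n - 1, source, aux, target, moves)  # clear the top n-1 disks
    moves.append((n, source, target))         # move the largest disk
    hanoi(n - 1, aux, target, source, moves)  # restack n-1 disks on top
    return moves
```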
Generate Parentheses (LC 22): generate all valid combinations of n pairs of parentheses. At each step, track open_count and close_count. Can add '(' if open_count < n. Can add ')' if close_count < open_count. Base case: total length = 2n. This is backtracking with state — 2^(2n) branches pruned to Catalan(n) valid strings.
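A sketch of the backtracking described above (building new strings at each call for simplicity, rather than mutating a shared buffer):

```python
def generate_parenthesis(n):
    result = []

    def backtrack(current, open_count, close_count):
        if len(current) == 2 * n:       # base case: string is complete
            result.append(current)
            return
        if open_count < n:              # still allowed to open a pair
            backtrack(current + '(', open_count + 1, close_count)
        if close_count < open_count:    # can close an already-open pair
            backtrack(current + ')', open_count, close_count + 1)

    backtrack('', 0, 0)
    return result
```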
Letter Combinations of Phone Number (LC 17): each digit maps to 2-4 letters. For each digit, try each letter, recurse for the next digit. Time O(4^n × n) — at most 4 choices per digit, n digits, O(n) to build each string.
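The digit-by-digit recursion might look like this (standard keypad mapping; '0' and '1' map to no letters):

```python
def letter_combinations(digits):
    keypad = {'2': 'abc', '3': 'def', '4': 'ghi', '5': 'jkl',
              '6': 'mno', '7': 'pqrs', '8': 'tuv', '9': 'wxyz'}
    if not digits:
        return []
    result = []

    def backtrack(i, current):
        if i == len(digits):            # base case: one letter per digit
            result.append(current)
            return
        for letter in keypad[digits[i]]:  # at most 4 branches per digit
            backtrack(i + 1, current + letter)

    backtrack(0, '')
    return result
```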
Decode Ways (LC 91): a string of digits can be decoded where '1'→'A', …, '26'→'Z'. Count ways. dp(i) = dp(i+1) (decode one digit, valid if s[i] ≠ '0') + dp(i+2) (decode two digits, valid if s[i:i+2] is in 10–26). With memoization: O(n) time. This is a key pattern: consider 1 or 2 characters at a time, validate, recurse.
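A memoized sketch of that recurrence (num_decodings is an assumed name):

```python
from functools import lru_cache

def num_decodings(s):
    @lru_cache(maxsize=None)
    def dp(i):
        if i == len(s):
            return 1              # base case: decoded the whole string
        if s[i] == '0':
            return 0              # '0' cannot start a letter code
        count = dp(i + 1)         # decode one digit (1-9)
        if i + 1 < len(s) and 10 <= int(s[i:i + 2]) <= 26:
            count += dp(i + 2)    # decode two digits (10-26)
        return count

    return dp(0)
```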
Mutual Recursion
Two functions that call each other. Example: isEven(n) = isOdd(n-1) if n > 0 else True; isOdd(n) = isEven(n-1) if n > 0 else False. Mutual recursion is uncommon in interviews but appears in certain parsing and grammar problems.
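The even/odd pair above, as a sketch (illustrative only; a parity check would obviously not be done this way in practice):

```python
def is_even(n):
    # Mutual recursion: is_even defers to is_odd on n-1, and vice versa.
    return True if n == 0 else is_odd(n - 1)

def is_odd(n):
    return False if n == 0 else is_even(n - 1)
```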
Recursion vs Iteration: When to Convert
Every recursive algorithm can be converted to an iterative one using an explicit stack. Reasons to convert: (1) Stack overflow risk on deep inputs — Python’s default recursion limit is 1,000 frames. (2) Performance — function call overhead adds constant factors. (3) Interviewer asks for iterative solution. Conversion pattern: replace the call stack with an explicit stack (a Python list or collections.deque). Push the initial state; while the stack is not empty: pop state, process, push next states. For DFS, the stack order matters — push children in reverse order of desired processing.
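The conversion pattern can be sketched as an iterative preorder DFS (the dict-of-children tree representation is an assumption for illustration):

```python
def preorder_iterative(tree, root):
    # tree: dict mapping node -> list of children.
    # A plain list serves as the explicit stack replacing the call stack.
    order, stack = [], [root]
    while stack:
        node = stack.pop()
        order.append(node)
        # Push children in reverse so the leftmost child is popped
        # (and thus processed) first -- LIFO order.
        for child in reversed(tree.get(node, [])):
            stack.append(child)
    return order
```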
Recursion Complexity Analysis
The recurrence relation directly gives complexity via the Master Theorem or drawing the recursion tree:
- Linear recursion T(n) = T(n-1) + O(1): depth n, O(n) total
- Binary recursion T(n) = 2T(n-1) + O(1): depth n, O(2^n) total — exponential
- Divide-and-conquer T(n) = 2T(n/2) + O(n): depth log n, O(n log n) total (merge sort)
- With memoization, the number of unique subproblems × work per subproblem gives total complexity
Interview Tips
- Always identify the base case(s) first — missing or incorrect base cases cause infinite recursion.
- “Assume the recursive call works correctly” — trust it to solve the smaller problem; focus on the combination step.
- Draw a small recursion tree for the example input to verify correctness before coding.
- When the recursion tree has repeated subproblems → add memoization (lru_cache) → becomes DP.
- Call stack space: O(recursion depth). For divide-and-conquer (depth = log n), space is O(log n). For linear recursion (depth = n), space is O(n) — may need to convert to iterative for large n.