Recursion Interview Patterns: Memoization, Tree Recursion, Classic Problems

The Recursion Mindset

Recursion solves a problem by defining it in terms of a smaller version of the same problem. The three requirements: (1) Base case — the simplest input that can be solved directly (no recursion needed). (2) Recursive case — reduce the problem to a smaller instance and combine results. (3) Progress toward base case — each recursive call must move toward the base case (termination guarantee). Most dynamic programming problems are recursive problems with memoization added.

Fibonacci and Memoization

The classic recursion example: fib(n) = fib(n-1) + fib(n-2), fib(0) = 0, fib(1) = 1. Naive recursion is O(2^n) — exponential — because it recomputes the same subproblems. Memoization caches results:

from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    if n <= 1:
        return n
    return fib(n-1) + fib(n-2)

With memoization, each unique input is computed once → O(n) time. This is top-down dynamic programming. The bottom-up DP equivalent: compute fib(0), fib(1), …, fib(n) iteratively, storing only the last two values for O(1) space.
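The bottom-up version described above can be sketched as follows (the function name is illustrative, not from the original text):

```python
def fib_iter(n: int) -> int:
    """Bottom-up Fibonacci: O(n) time, O(1) space by keeping only the last two values."""
    if n <= 1:
        return n
    prev, curr = 0, 1  # fib(0), fib(1)
    for _ in range(2, n + 1):
        prev, curr = curr, prev + curr
    return curr
```

No cache is needed because each value is computed exactly once, in order.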

Tree Recursion Pattern

Fibonacci has two recursive calls — a binary tree of recursion. Many problems have this structure. The key is identifying the subproblem and the recombination:

Count ways to climb stairs (1 or 2 steps at a time): ways(n) = ways(n-1) + ways(n-2). Same as Fibonacci structure. Base cases: ways(0) = 1, ways(1) = 1.
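A minimal memoized sketch of the staircase recurrence, mirroring the Fibonacci code above:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def ways(n: int) -> int:
    # Base cases: one way to climb 0 stairs (do nothing), one way to climb 1.
    if n <= 1:
        return 1
    return ways(n - 1) + ways(n - 2)  # last move was a 1-step or a 2-step
```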

Generate all binary strings of length n: at each position, choose 0 or 1. Recurse with n-1 remaining positions. The recursion tree has 2^n leaves = 2^n strings. No memoization (all 2^n results are distinct).
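A short sketch of the binary-string generation (the recursion tree's two branches appear as the two choices for `bit`):

```python
def binary_strings(n: int) -> list[str]:
    if n == 0:
        return [""]  # base case: exactly one string of length 0
    # For each leading bit, prepend it to every string of length n-1.
    return [bit + rest for bit in "01" for rest in binary_strings(n - 1)]
```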

Coin change (count ways): ways(amount) = sum of ways(amount – coin) for each coin ≤ amount. Note this recurrence counts ordered sequences of coins; to count unordered combinations, add a coin index to the state so each coin is only considered from its index onward. With memoization: O(amount × |coins|).
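A memoized sketch of the recurrence as stated (it counts ordered sequences; the comment notes the change needed for unordered combinations):

```python
from functools import lru_cache

def count_ways(amount: int, coins: tuple[int, ...]) -> int:
    # Counts ordered sequences of coins summing to amount. To count unordered
    # combinations instead, memoize on (amount, coin_index) and only use coins
    # from that index onward.
    @lru_cache(maxsize=None)
    def ways(rem: int) -> int:
        if rem == 0:
            return 1  # one way to make 0: use no coins
        return sum(ways(rem - c) for c in coins if c <= rem)
    return ways(amount)
```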

Classic Recursive Algorithms

Tower of Hanoi: move n disks from source to target using auxiliary peg. Move top n-1 disks from source to aux (recursive), move the nth disk from source to target (base step), move n-1 disks from aux to target (recursive). T(n) = 2T(n-1) + O(1) = O(2^n) — the minimum number of moves is exactly 2^n – 1 (provably optimal).
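The three steps above translate directly into code; this sketch records moves as (source, target) pairs:

```python
def hanoi(n: int, source: str, target: str, aux: str, moves: list) -> None:
    if n == 0:
        return                                  # base case: nothing to move
    hanoi(n - 1, source, aux, target, moves)    # move top n-1 disks out of the way
    moves.append((source, target))              # move the largest disk
    hanoi(n - 1, aux, target, source, moves)    # move the n-1 disks onto it
```

For n = 3 this produces exactly 2^3 – 1 = 7 moves.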

Generate Parentheses (LC 22): generate all valid combinations of n pairs of parentheses. At each step, track open_count and close_count. Can add '(' if open_count < n. Can add ')' if close_count < open_count. Base case: total length = 2n. This is backtracking with state — 2^(2n) branches pruned to Catalan(n) valid strings.
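The two pruning conditions above become the two guarded recursive calls in this sketch:

```python
def generate_parens(n: int) -> list[str]:
    result = []

    def backtrack(current: str, open_count: int, close_count: int) -> None:
        if len(current) == 2 * n:        # base case: string complete
            result.append(current)
            return
        if open_count < n:               # still allowed to open
            backtrack(current + "(", open_count + 1, close_count)
        if close_count < open_count:     # can close only if an open is pending
            backtrack(current + ")", open_count, close_count + 1)

    backtrack("", 0, 0)
    return result
```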

Letter Combinations of Phone Number (LC 17): each digit maps to 2-4 letters. For each digit, try each letter, recurse for the next digit. Time O(4^n × n) — at most 4 choices per digit, n digits, O(n) to build each string.
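A minimal backtracking sketch for the digit-to-letters recursion:

```python
def letter_combinations(digits: str) -> list[str]:
    if not digits:
        return []
    mapping = {"2": "abc", "3": "def", "4": "ghi", "5": "jkl",
               "6": "mno", "7": "pqrs", "8": "tuv", "9": "wxyz"}
    result = []

    def backtrack(i: int, path: str) -> None:
        if i == len(digits):             # one letter chosen for every digit
            result.append(path)
            return
        for letter in mapping[digits[i]]:
            backtrack(i + 1, path + letter)

    backtrack(0, "")
    return result
```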

Decode Ways (LC 91): a string of digits can be decoded where '1'→'A', …, '26'→'Z'. Count the decodings. dp(i) = dp(i+1) (decode one digit, valid only if s[i] != '0') + dp(i+2) (decode two digits, valid only if s[i:i+2] is in '10'..'26'). With memoization: O(n) time. This is a key pattern: consider 1 or 2 characters at a time, validate, recurse.
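A memoized sketch of the dp(i) recurrence, including the leading-zero check:

```python
from functools import lru_cache

def num_decodings(s: str) -> int:
    @lru_cache(maxsize=None)
    def dp(i: int) -> int:
        if i == len(s):
            return 1                    # reached the end: one complete decoding
        if s[i] == "0":
            return 0                    # no letter maps to a leading '0'
        ways = dp(i + 1)                # decode one digit
        if i + 1 < len(s) and 10 <= int(s[i:i + 2]) <= 26:
            ways += dp(i + 2)           # decode two digits ('10'..'26')
        return ways
    return dp(0)
```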

Mutual Recursion

Two functions that call each other. Example: isEven(n) = isOdd(n-1) if n > 0 else True; isOdd(n) = isEven(n-1) if n > 0 else False. Mutual recursion is uncommon in interviews but appears in certain parsing and grammar problems.
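The even/odd example above, written out (purely illustrative, and only sensible for small non-negative n):

```python
def is_even(n: int) -> bool:
    return True if n == 0 else is_odd(n - 1)   # delegate to the other function

def is_odd(n: int) -> bool:
    return False if n == 0 else is_even(n - 1)
```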

Recursion vs Iteration: When to Convert

Every recursive algorithm can be converted to an iterative one using an explicit stack. Reasons to convert: (1) Stack overflow risk on deep inputs — Python’s default recursion limit is 1,000 frames. (2) Performance — function call overhead adds constant factors. (3) Interviewer asks for iterative solution. Conversion pattern: replace the call stack with an explicit stack (deque). Push the initial state; while the stack is not empty: pop state, process, push next states. For DFS, the stack order matters — push in reverse order of desired processing.
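The conversion pattern above can be sketched with an iterative preorder DFS; the `tree` adjacency dict is an assumed representation (node → list of children), not from the original text:

```python
from collections import deque

def preorder_iterative(tree: dict, root):
    """Preorder DFS using an explicit stack in place of the call stack."""
    order = []
    stack = deque([root])
    while stack:
        node = stack.pop()
        order.append(node)
        # Push children in reverse so the leftmost child is popped (processed) first.
        for child in reversed(tree.get(node, [])):
            stack.append(child)
    return order
```

Because the stack lives on the heap, depth is limited only by memory, not by the interpreter's recursion limit.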

Recursion Complexity Analysis

The recurrence relation directly gives complexity via the Master Theorem or drawing the recursion tree:

  • Linear recursion T(n) = T(n-1) + O(1): depth n, O(n) total
  • Binary recursion T(n) = 2T(n-1) + O(1): depth n, O(2^n) total — exponential
  • Divide-and-conquer T(n) = 2T(n/2) + O(n): depth log n, O(n log n) total (merge sort)
  • With memoization, the number of unique subproblems × work per subproblem gives total complexity

Interview Tips

  • Always identify the base case(s) first — missing or incorrect base cases cause infinite recursion.
  • “Assume the recursive call works correctly” — trust it to solve the smaller problem; focus on the combination step.
  • Draw a small recursion tree for the example input to verify correctness before coding.
  • When the recursion tree has repeated subproblems → add memoization (lru_cache) → becomes DP.
  • Call stack space: O(recursion depth). For divide-and-conquer (depth = log n), space is O(log n). For linear recursion (depth = n), space is O(n) — may need to convert to iterative for large n.

