Advanced Greedy Algorithm Interview Patterns: Jump Game, Task Scheduling, and Huffman Coding (2025)

When Does Greedy Work?

A greedy algorithm makes the locally optimal choice at each step and hopes it leads to a global optimum. It works when the problem has two properties: the greedy choice property (a locally optimal choice can be extended to a globally optimal solution) and optimal substructure (an optimal solution contains optimal solutions to its subproblems). Greedy fails when an early local choice forecloses a globally better path. The key skill is recognizing when greedy applies versus when DP is needed. Greedy is often simpler and faster (O(n log n) vs. O(n^2) for a typical DP). When in doubt, try greedy, then verify that the local choice is globally safe.
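To see the greedy choice property fail concretely, consider coin change with denominations {1, 3, 4}. A minimal sketch (the function names here are illustrative, not from this article): greedy takes the largest coin that fits, while DP finds the true minimum.

```python
def greedy_coin_change(coins, amount):
    # Greedy: always take the largest coin that still fits.
    count = 0
    for coin in sorted(coins, reverse=True):
        count += amount // coin
        amount %= coin
    return count if amount == 0 else -1

def dp_coin_change(coins, amount):
    # DP: true minimum number of coins for each sub-amount.
    INF = float('inf')
    best = [0] + [INF] * amount
    for a in range(1, amount + 1):
        for coin in coins:
            if coin <= a and best[a - coin] + 1 < best[a]:
                best[a] = best[a - coin] + 1
    return best[amount] if best[amount] != INF else -1

# For amount 6, greedy takes 4+1+1 (3 coins) but the optimum is 3+3
# (2 coins): the greedy choice property does not hold here.
```

This is exactly the "early local choice forecloses a better path" failure mode: grabbing the 4 leaves a remainder (2) that only small coins can cover.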

Pattern 1: Jump Game I and II (LC 55, 45)

Jump Game I: can you reach the last index? Track the farthest reachable position. At each index, update farthest = max(farthest, i + nums[i]). If i > farthest at any point: impossible. If farthest >= last index: possible.

def can_jump(nums):
    farthest = 0  # farthest index reachable so far
    for i, jump in enumerate(nums):
        if i > farthest:
            return False  # index i itself is unreachable
        farthest = max(farthest, i + jump)
    return True

def jump(nums):
    # Jump Game II: minimum jumps to reach end
    jumps = 0
    current_end = 0  # end of current jump range
    farthest = 0     # farthest reachable from current range
    for i in range(len(nums) - 1):
        farthest = max(farthest, i + nums[i])
        if i == current_end:  # must jump
            jumps += 1
            current_end = farthest
    return jumps

Jump II greedy: at each position, choose the jump that extends the reachable range the most. When you reach the end of the current jump range, you must jump — count it and set the new range to farthest. This is optimal because, by the time you reach the end of the current window, you have already considered every position reachable with the current jump count, so taking the window's farthest reach can never hurt any later choice.
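One practical way to trust a greedy like this is to cross-check it against brute force on small random inputs. A sketch (the BFS helper and the random check are illustrative; `jump_greedy` restates the article's `jump` so the snippet is self-contained):

```python
from collections import deque
import random

def min_jumps_bfs(nums):
    # Exhaustive minimum-jump count via BFS over indices,
    # used here only as a ground-truth check for the greedy.
    dist = {0: 0}
    q = deque([0])
    while q:
        i = q.popleft()
        if i == len(nums) - 1:
            return dist[i]
        for j in range(i + 1, min(i + nums[i], len(nums) - 1) + 1):
            if j not in dist:
                dist[j] = dist[i] + 1
                q.append(j)
    return None  # end unreachable

def jump_greedy(nums):
    jumps = current_end = farthest = 0
    for i in range(len(nums) - 1):
        farthest = max(farthest, i + nums[i])
        if i == current_end:  # end of current window: must jump
            jumps += 1
            current_end = farthest
    return jumps

random.seed(0)
for _ in range(200):
    nums = [random.randint(1, 4) for _ in range(random.randint(1, 12))]
    assert jump_greedy(nums) == min_jumps_bfs(nums)
```

If the greedy ever disagreed with BFS, the random loop would surface a small counterexample to study.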

Pattern 2: Task Scheduler (LC 621)

Schedule tasks with a cooldown n between the same task type. Minimize total time. Greedy insight: the most frequent task is the bottleneck. Arrange tasks around the most frequent task.

from collections import Counter

def least_interval(tasks, n):
    counts = sorted(Counter(tasks).values(), reverse=True)
    max_count = counts[0]
    # Number of tasks with max_count
    max_count_tasks = counts.count(max_count)
    # Minimum time: (max_count - 1) * (n + 1) + max_count_tasks
    # But might need more time to fit all tasks
    return max(len(tasks), (max_count - 1) * (n + 1) + max_count_tasks)

Intuition: think of (max_count - 1) frames of size (n + 1), placing one copy of the most frequent task at the start of every frame, followed by a final partial frame holding the max_count_tasks tasks that tie for the highest count. The remaining tasks fill the open slots. If there are enough tasks to fill every slot, the answer is len(tasks); if not, idle slots pad the time.
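The closed-form answer can be sanity-checked by actually simulating the schedule with a max-heap, always running the available task with the highest remaining count. An illustrative sketch (`least_interval_sim` is not the article's code):

```python
from collections import Counter
import heapq

def least_interval_sim(tasks, n):
    # Simulate: in each cooldown window of n+1 slots, run up to n+1
    # distinct tasks, most-frequent first; idle if fewer are available.
    heap = [-c for c in Counter(tasks).values()]  # negated for max-heap
    heapq.heapify(heap)
    time = 0
    while heap:
        window = []
        for _ in range(n + 1):
            if heap:
                window.append(heapq.heappop(heap))
        for c in window:
            if c + 1 < 0:  # task still has runs left
                heapq.heappush(heap, c + 1)
        # A full window costs n+1 slots unless everything is finished.
        time += (n + 1) if heap else len(window)
    return time

assert least_interval_sim(list("AAABBB"), 2) == 8  # A B _ A B _ A B
```

For any small input you care about, the simulation and the formula `max(len(tasks), (max_count - 1) * (n + 1) + max_count_tasks)` should agree.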

Pattern 3: Huffman Coding

Build an optimal prefix-free encoding where frequent characters get shorter codes. Greedy: always merge the two least-frequent nodes. Uses a min-heap.

import heapq

def huffman(freq):
    # Each heap entry: [weight, [symbol, code], [symbol, code], ...]
    heap = [[f, [c, ""]] for c, f in freq.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        lo = heapq.heappop(heap)  # two least-frequent subtrees
        hi = heapq.heappop(heap)
        for pair in lo[1:]:
            pair[1] = '0' + pair[1]  # prepend bit for the left branch
        for pair in hi[1:]:
            pair[1] = '1' + pair[1]  # prepend bit for the right branch
        heapq.heappush(heap, [lo[0] + hi[0]] + lo[1:] + hi[1:])
    return sorted(heap[0][1:], key=lambda x: len(x[1]))

Optimality sketch (exchange argument): in some optimal tree, the two least-frequent characters are sibling leaves at maximum depth (longest codes). If either sat shallower than a more frequent character, swapping the two could not increase the total cost. So merging the two smallest first is always safe, and the remaining problem has the same structure.
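A useful way to make the cost concrete: each merge adds one bit to every symbol under the merged node, so the total encoded length equals the sum of all merge weights. A sketch using the classic CLRS frequency set (the `huffman_cost` helper is illustrative, not the article's code):

```python
import heapq

def huffman_cost(freqs):
    # Total encoded length in bits: sum of the weight of every merge,
    # since each merge deepens every symbol it covers by one bit.
    heap = list(freqs)
    heapq.heapify(heap)
    total = 0
    while len(heap) > 1:
        lo = heapq.heappop(heap)
        hi = heapq.heappop(heap)
        total += lo + hi
        heapq.heappush(heap, lo + hi)
    return total

# CLRS example frequencies a:45 b:13 c:12 d:16 e:9 f:5 -> 224 bits total
assert huffman_cost([45, 13, 12, 16, 9, 5]) == 224
```

This cost-only view is handy in interviews when you need the optimal compressed size but not the actual codes.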

Pattern 4: Gas Station, Candy, and Burst Balloons

Gas Station (LC 134): can you complete the circular route? If total gas >= total cost, a solution always exists (and is unique). Greedy: track the running surplus gas[i] - cost[i]; if it goes negative at station i, no station from the current start through i can be the answer, so reset the start to i + 1 and the surplus to 0.

Candy (LC 135): every child gets at least 1 candy, and a child rated higher than a neighbor must get more candy than that neighbor. Two passes: (1) left-to-right: if ratings[i] > ratings[i-1], candy[i] = candy[i-1] + 1; (2) right-to-left: if ratings[i] > ratings[i+1], candy[i] = max(candy[i], candy[i+1] + 1). O(n) time. The two passes satisfy the left-neighbor and right-neighbor constraints independently, and taking the max satisfies both with the minimum total.

Minimum Number of Arrows to Burst Balloons (LC 452): sort by end coordinate. Greedily shoot at the end of the first balloon; this bursts every overlapping balloon. Move to the next un-burst balloon and repeat. It is exactly the non-overlapping-intervals greedy in disguise.
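The three greedies above can be sketched as follows (standard solutions, written out here for illustration):

```python
def can_complete_circuit(gas, cost):
    # LC 134: return the valid start index, or -1 if none exists.
    total = tank = start = 0
    for i in range(len(gas)):
        surplus = gas[i] - cost[i]
        total += surplus
        tank += surplus
        if tank < 0:      # no start in [start, i] can work
            start = i + 1
            tank = 0
    return start if total >= 0 else -1

def candy(ratings):
    # LC 135: minimum candies satisfying both neighbor constraints.
    n = len(ratings)
    candies = [1] * n
    for i in range(1, n):              # left-to-right pass
        if ratings[i] > ratings[i - 1]:
            candies[i] = candies[i - 1] + 1
    for i in range(n - 2, -1, -1):     # right-to-left pass
        if ratings[i] > ratings[i + 1]:
            candies[i] = max(candies[i], candies[i + 1] + 1)
    return sum(candies)

def find_min_arrows(points):
    # LC 452: sort by end; shoot one arrow at each chosen end.
    if not points:
        return 0
    points.sort(key=lambda p: p[1])
    arrows, arrow_pos = 1, points[0][1]
    for start, end in points[1:]:
        if start > arrow_pos:          # not hit by the last arrow
            arrows += 1
            arrow_pos = end
    return arrows
```

Typical checks: `can_complete_circuit([1,2,3,4,5], [3,4,5,1,2])` returns start index 3, `candy([1,0,2])` returns 5, and `find_min_arrows([[10,16],[2,8],[1,6],[7,12]])` returns 2.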



