Why Greedy and Interval Problems
Greedy algorithms appear frequently in FAANG interviews because they test whether you can identify when a locally optimal choice leads to a globally optimal solution. Interval problems — merging, scheduling, covering — are the most common greedy category. Master the patterns and you can solve a wide family of problems by recognizing the underlying structure.
Pattern 1: Merge Intervals
Given a list of intervals, merge all overlapping intervals. Sort by start time. Iterate: if the current interval overlaps the last merged interval (current.start <= merged.end), merge by extending the end. Otherwise, add the current interval as a new entry.
def merge(intervals):
    if not intervals:
        return []
    intervals.sort(key=lambda x: x[0])
    merged = [intervals[0]]
    for start, end in intervals[1:]:
        if start <= merged[-1][1]:  # overlaps the last merged interval
            merged[-1][1] = max(merged[-1][1], end)
        else:
            merged.append([start, end])
    return merged
# LeetCode 56: Merge Intervals
# LeetCode 57: Insert Interval (insert + merge)
# LeetCode 986: Interval List Intersections (two-pointer merge)
Time: O(N log N) for sorting, O(N) for merging. The key insight: after sorting by start, intervals can only overlap with the most recently merged interval — no need to look backward.
Pattern 2: Meeting Rooms — Minimum Rooms Required
Given meeting time intervals, find the minimum number of conference rooms required. Classic greedy approach: sort meetings by start time. Use a min-heap storing end times of ongoing meetings. For each meeting: if the earliest-ending meeting ends before this one starts (heap.top <= current.start), reuse that room (pop and push new end time). Otherwise, add a new room (push to heap). The heap size at the end is the minimum rooms needed.
import heapq

def min_meeting_rooms(intervals):
    if not intervals:
        return 0
    intervals.sort(key=lambda x: x[0])
    heap = []  # end times of ongoing meetings
    for start, end in intervals:
        if heap and heap[0] <= start:
            heapq.heapreplace(heap, end)  # reuse the room that frees up earliest
        else:
            heapq.heappush(heap, end)  # allocate a new room
    return len(heap)
# LeetCode 253: Meeting Rooms II
# Variant: LeetCode 1094 (Car Pooling) — same pattern with capacity
Alternative: chronological events scan. Create events: (start, +1) and (end, -1). Sort all events. Scan and track running sum; the maximum is the answer. This handles ties correctly (sort end before start if same time).
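The events scan can be sketched as follows (the function name is illustrative, not from LeetCode):

```python
def min_meeting_rooms_events(intervals):
    # Build (time, delta) events: +1 when a meeting starts, -1 when one ends.
    events = []
    for start, end in intervals:
        events.append((start, 1))
        events.append((end, -1))
    # Tuple sort places -1 before +1 at equal times, so a room freed
    # at time t can be reused by a meeting that starts at t.
    events.sort()
    rooms = best = 0
    for _, delta in events:
        rooms += delta
        best = max(best, rooms)
    return best
```

On [[0, 30], [5, 10], [15, 20]] the running sum peaks at 2, matching the heap-based answer.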
Pattern 3: Activity Selection / Maximum Non-Overlapping Intervals
Given intervals, select the maximum number of non-overlapping intervals. Greedy choice: always select the interval that ends earliest. Sort by end time. Greedily select an interval if it starts after (or at) the last selected interval’s end.
def erase_overlap_intervals(intervals):
    # Returns the minimum number of intervals to REMOVE
    # (equivalently: N - max non-overlapping)
    intervals.sort(key=lambda x: x[1])  # sort by end time
    count = 0
    last_end = float('-inf')
    for start, end in intervals:
        if start >= last_end:
            count += 1  # select this interval
            last_end = end
    return len(intervals) - count
# LeetCode 435: Non-overlapping Intervals
# LeetCode 452: Minimum Arrows to Burst Balloons (same greedy)
Why sort by end? If we sort by start, we might greedily pick a long interval that blocks many future short ones. Picking the earliest-ending interval always leaves maximum room for future selections — this is the exchange argument proof: any solution that picks a later-ending interval can be improved by swapping to the earlier-ending one without reducing the count.
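A small counterexample makes this concrete. The helper below is illustrative (not a LeetCode solution): it runs the same greedy selection under two different sort keys.

```python
def greedy_select(intervals, key):
    # Count intervals selected by the greedy scan under the given sort order.
    chosen, last_end = 0, float('-inf')
    for start, end in sorted(intervals, key=key):
        if start >= last_end:
            chosen += 1
            last_end = end
    return chosen

# [1, 10] starts first but blocks both short intervals;
# sorting by end picks [2, 3] and [4, 5] instead.
intervals = [[1, 10], [2, 3], [4, 5]]
greedy_select(intervals, key=lambda x: x[0])  # sort by start -> 1 selected
greedy_select(intervals, key=lambda x: x[1])  # sort by end   -> 2 selected
```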
Pattern 4: Task Scheduler with Cooldown
Given tasks with a cooldown period n (same task cannot repeat within n intervals), find the minimum time to complete all tasks. Greedy: always execute the most frequent remaining task. Use a max-heap of (count, task) pairs. After executing a task, it enters a cooldown queue. Each round, execute one task from the heap (if available), advance time by 1, and release tasks from the cooldown queue.
import heapq
from collections import deque

def least_interval(tasks, n):
    counts = {}
    for t in tasks:
        counts[t] = counts.get(t, 0) + 1
    heap = [-c for c in counts.values()]  # max-heap via negation (Python heaps are min-heaps)
    heapq.heapify(heap)
    queue = deque()  # (remaining_count, available_at_time)
    time = 0
    while heap or queue:
        time += 1
        if heap:
            cnt = heapq.heappop(heap) + 1  # execute once (negated count moves toward 0)
            if cnt < 0:
                queue.append((cnt, time + n))
        if queue and queue[0][1] == time:
            heapq.heappush(heap, queue.popleft()[0])
    return time
# LeetCode 621: Task Scheduler
Mathematical shortcut: answer = max(len(tasks), (max_count - 1) * (n + 1) + num_tasks_with_max_count). The formula gives an O(N) solution, but the heap-based approach generalizes better to variants.
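The formula translates directly to a few lines (function name is illustrative):

```python
from collections import Counter

def least_interval_formula(tasks, n):
    counts = Counter(tasks)
    max_count = max(counts.values())
    num_max = sum(1 for c in counts.values() if c == max_count)
    # Frame the schedule around the most frequent task: (max_count - 1)
    # full chunks of size n + 1, plus one slot per max-frequency task.
    # Other tasks either fill the idle slots or there is no idle time,
    # in which case len(tasks) dominates.
    return max(len(tasks), (max_count - 1) * (n + 1) + num_max)
```

For tasks = [A, A, A, B, B, B] with n = 2 this yields max(6, 2 * 3 + 2) = 8, the classic LeetCode 621 answer.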
Pattern 5: Jump Game
Can you reach the last index? At each position, you can jump up to nums[i] steps. Greedy: track the furthest reachable index. For each position, if it is within reach, update the furthest. If you ever reach a position beyond the current furthest, you’re stuck.
def can_jump(nums):
    reach = 0
    for i, jump in enumerate(nums):
        if i > reach:
            return False  # position i is beyond the furthest reachable index
        reach = max(reach, i + jump)
    return True
def jump_game_ii(nums):
    # Minimum jumps to reach the last index (LeetCode 45)
    jumps = 0
    current_end = 0  # end of the range reachable with `jumps` jumps
    farthest = 0
    for i in range(len(nums) - 1):
        farthest = max(farthest, i + nums[i])
        if i == current_end:
            jumps += 1
            current_end = farthest
    return jumps
Pattern 6: Interval Scheduling Maximization (Weighted)
When intervals have weights (profit), greedy doesn’t work — you need DP. Sort by end time. For each interval, binary search for the last non-overlapping interval (latest end <= current start). dp[i] = max(dp[i-1], weight[i] + dp[last_non_overlap]). LeetCode 1235: Maximum Profit in Job Scheduling.
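A sketch of this DP against LeetCode 1235's signature (parameter names follow the problem statement; the `ends`/`best` bookkeeping is one of several equivalent formulations):

```python
import bisect

def job_scheduling(start_time, end_time, profit):
    # Sort jobs by end time. Maintain parallel lists where best[i] is the
    # maximum profit achievable using jobs that end at ends[i] or earlier;
    # both lists stay sorted/non-decreasing, enabling binary search.
    jobs = sorted(zip(start_time, end_time, profit), key=lambda j: j[1])
    ends = [0]   # sentinel: "no job" ends at time 0
    best = [0]
    for s, e, p in jobs:
        # Latest state whose end time <= this job's start time.
        i = bisect.bisect_right(ends, s) - 1
        take = best[i] + p
        if take > best[-1]:  # only record states that improve the profit
            ends.append(e)
            best.append(take)
    return best[-1]
```

On the problem's first example (starts [1, 2, 3, 3], ends [3, 4, 5, 6], profits [50, 10, 40, 70]) this returns 120: take the first and last jobs.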
Greedy Recognition Guide
- Merge intervals: sort by start, merge if overlap — O(N log N)
- Minimum rooms / resources: min-heap of end times — O(N log N)
- Maximum non-overlapping: sort by end, greedy select — O(N log N)
- Task cooldown: max-heap by frequency, advance time — O(N log N)
- Jump game: track furthest reach, greedy scan — O(N)
- Weighted intervals: greedy fails → binary search + DP — O(N log N)
Key Problems by Pattern
- LeetCode 56: Merge Intervals
- LeetCode 57: Insert Interval
- LeetCode 252/253: Meeting Rooms I and II
- LeetCode 435: Non-Overlapping Intervals
- LeetCode 452: Minimum Arrows to Burst Balloons
- LeetCode 621: Task Scheduler
- LeetCode 55/45: Jump Game I and II
- LeetCode 1235: Maximum Profit in Job Scheduling
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [
    {
      "@type": "Question",
      "name": "What is the greedy strategy for interval scheduling maximization?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "The interval scheduling maximization problem asks: given N intervals, select the maximum number of non-overlapping intervals. The greedy strategy is to sort intervals by end time and repeatedly select the interval that ends earliest, skipping any interval that overlaps with the last selected. This works because selecting an earlier-ending interval never makes things worse — it leaves at least as much room for future intervals as any other choice. Formally, prove by exchange argument: if an optimal solution has a different first interval, swapping it with the earliest-ending interval yields a solution at least as good. This greedy runs in O(n log n) for sorting, then O(n) for selection."
      }
    },
    {
      "@type": "Question",
      "name": "How do you find the minimum number of meeting rooms required for a set of meetings?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "Sort meetings by start time. Use a min-heap containing the end times of meetings currently in progress. For each meeting, if the earliest-ending meeting in the heap ends at or before the current meeting's start, that room is free — pop it from the heap and replace with the current meeting's end time (reuse the room). Otherwise, push the current meeting's end time (allocate a new room). The heap size at any point equals the number of rooms currently in use; the maximum heap size across all meetings is the answer. Time complexity O(n log n): sorting + heap operations. Alternative: sort start and end times separately and use two pointers to count overlapping meetings at each moment."
      }
    },
    {
      "@type": "Question",
      "name": "When does a greedy algorithm fail and dynamic programming is needed instead?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "Greedy fails when the locally optimal choice at each step can preclude a globally optimal solution — specifically when a decision now affects which future choices are available in a way that requires looking ahead. Classic example: 0/1 Knapsack — taking the highest-value item per unit weight now might prevent fitting items that together have higher total value. Another: coin change with arbitrary denominations (e.g., coins [1, 3, 4], amount 6: greedy takes 4+1+1 = 3 coins, optimal is 3+3 = 2 coins). The key test: can you prove by exchange argument that the greedy choice is always safe? If yes, greedy works. If counterexamples exist (even for small inputs), use DP to consider all possibilities."
      }
    }
  ]
}