Graph Algorithm Selection Guide
Choosing the right graph algorithm is the first step in any graph interview question.
- BFS: unweighted shortest path, level-order traversal, connected components.
- DFS: cycle detection, topological sort, SCCs, path enumeration.
- Dijkstra: single-source shortest path with non-negative weights.
- Bellman-Ford: single-source shortest path with negative weights; detects negative cycles.
- Floyd-Warshall: all-pairs shortest paths, O(V^3).
- Prim/Kruskal: minimum spanning tree.
- Topological sort: dependency ordering in a DAG.
- Tarjan/Kosaraju: strongly connected components.
The key rule: if all edge weights are equal (or the graph is unweighted), BFS is sufficient and faster. Only use Dijkstra when weights differ.
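The unweighted case from the rule above can be sketched with a plain BFS (the helper name and the sample graph are illustrative):

```python
from collections import deque

def bfs_shortest(adj, src, n):
    # adj[u] = list of neighbors; returns hop counts from src (-1 = unreachable)
    dist = [-1] * n
    dist[src] = 0
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if dist[v] == -1:  # first visit = fewest hops
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

# Path graph 0-1-2-3 plus a 0-3 shortcut
adj = [[1, 3], [0, 2], [1, 3], [0, 2]]
print(bfs_shortest(adj, 0, 4))  # [0, 1, 2, 1]
```

Because every edge costs the same, the first time BFS reaches a node is along a shortest path, with no heap needed.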
Dijkstra’s Algorithm: O((V + E) log V)
Single-source shortest path for non-negative weights. Use a min-heap (priority queue). Start: add (0, source) to heap. On each pop: if distance is outdated (already found a shorter path), skip. Otherwise: for each neighbor, compute tentative distance; if shorter than known, update and push to heap.
import heapq

def dijkstra(graph, src, n):
    # graph[u] = [(weight, v), ...]
    dist = [float('inf')] * n
    dist[src] = 0
    heap = [(0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist[u]:  # stale entry
            continue
        for w, v in graph[u]:
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
                heapq.heappush(heap, (dist[v], v))
    return dist
Why skip stale entries: the heap may contain outdated (larger) distances for a node, because each improvement pushes a new entry without removing the old one. When we pop a stale entry, the node has already been settled at a shorter distance, so we skip it. This is correct because of Dijkstra’s guarantee: the first time a node is popped from the heap, its distance is final. Key applications: LC 743 (Network Delay Time), LC 1631 (Path With Minimum Effort — Dijkstra where a path’s cost is the maximum absolute height difference along it), LC 778 (Swim in Rising Water — binary search + BFS, or Dijkstra).
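To see the stale-entry mechanics concretely, here is the routine restated so the snippet runs standalone, on a tiny made-up graph:

```python
import heapq

def dijkstra(graph, src, n):
    # graph[u] = [(weight, v), ...]
    dist = [float('inf')] * n
    dist[src] = 0
    heap = [(0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist[u]:  # stale entry
            continue
        for w, v in graph[u]:
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
                heapq.heappush(heap, (dist[v], v))
    return dist

# 0 -> 1 costs 4, 0 -> 2 costs 1, 2 -> 1 costs 2
graph = [[(4, 1), (1, 2)], [], [(2, 1)]]
print(dijkstra(graph, 0, 3))  # [0, 3, 1]
```

Node 1 is first pushed at distance 4, then re-pushed at 3 via node 2; when the old (4, 1) entry is finally popped, 4 > dist[1] == 3, so it is skipped.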
Bellman-Ford: Negative Weights and Cycle Detection
Relaxes all edges V-1 times. After V-1 rounds, all shortest paths are found (a shortest path has at most V-1 edges if no negative cycle). Run one more round: if any distance still decreases, a negative cycle is reachable. O(VE) time. Use when: edge weights can be negative, or you need to detect negative cycles.
def bellman_ford(n, edges, src):
    # edges = [(u, v, w), ...]
    dist = [float('inf')] * n
    dist[src] = 0
    for _ in range(n - 1):
        for u, v, w in edges:
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
    # Check for negative cycles: one extra round of relaxation
    for u, v, w in edges:
        if dist[u] + w < dist[v]:
            return None  # negative cycle reachable from src
    return dist
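Both behaviors can be checked with small made-up edge lists (the function is restated here so the snippet runs standalone):

```python
def bellman_ford(n, edges, src):
    # edges = [(u, v, w), ...]
    dist = [float('inf')] * n
    dist[src] = 0
    for _ in range(n - 1):
        for u, v, w in edges:
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
    # One extra round: any improvement implies a reachable negative cycle
    for u, v, w in edges:
        if dist[u] + w < dist[v]:
            return None
    return dist

# Negative edge but no cycle: 0 -> 1 (4), 0 -> 2 (5), 2 -> 1 (-3)
print(bellman_ford(3, [(0, 1, 4), (0, 2, 5), (2, 1, -3)], 0))  # [0, 2, 5]
# Negative cycle 1 -> 2 -> 1 with total weight -1
print(bellman_ford(3, [(0, 1, 1), (1, 2, -2), (2, 1, 1)], 0))  # None
```

In the first graph Dijkstra would settle node 1 at distance 4 too early; Bellman-Ford’s repeated relaxation finds the cheaper route through node 2.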
Topological Sort: Kahn’s Algorithm
Topological order of a DAG: process nodes in an order where every dependency comes before its dependents. Kahn’s BFS: compute in-degrees; add all 0-in-degree nodes to a queue; repeatedly remove a node, decrement its neighbors’ in-degrees, and enqueue any node whose in-degree drops to 0. If all nodes are processed, the result is a valid topological order; if not, a cycle exists.
from collections import deque

def topo_sort(n, prerequisites):
    adj = [[] for _ in range(n)]
    in_degree = [0] * n
    for a, b in prerequisites:  # b must come before a
        adj[b].append(a)
        in_degree[a] += 1
    q = deque(i for i in range(n) if in_degree[i] == 0)
    order = []
    while q:
        node = q.popleft()
        order.append(node)
        for nei in adj[node]:
            in_degree[nei] -= 1
            if in_degree[nei] == 0:
                q.append(nei)
    return order if len(order) == n else []  # empty = cycle
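A quick standalone check with an LC 210-style prerequisite list, where each pair is (course, prerequisite) (the sample data is made up):

```python
from collections import deque

def topo_sort(n, prerequisites):
    adj = [[] for _ in range(n)]
    in_degree = [0] * n
    for a, b in prerequisites:  # b must come before a
        adj[b].append(a)
        in_degree[a] += 1
    q = deque(i for i in range(n) if in_degree[i] == 0)
    order = []
    while q:
        node = q.popleft()
        order.append(node)
        for nei in adj[node]:
            in_degree[nei] -= 1
            if in_degree[nei] == 0:
                q.append(nei)
    return order if len(order) == n else []  # empty = cycle

# Course 3 needs 1 and 2; courses 1 and 2 both need 0
print(topo_sort(4, [(1, 0), (2, 0), (3, 1), (3, 2)]))  # [0, 1, 2, 3]
# Mutual dependency = cycle, no valid order
print(topo_sort(2, [(0, 1), (1, 0)]))  # []
```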
Strongly Connected Components: Kosaraju’s Algorithm
An SCC is a maximal set of nodes where every node is reachable from every other. Applications: finding circular dependencies, condensation graph (DAG of SCCs). Kosaraju’s two-pass DFS: (1) DFS on original graph, push nodes to stack in finish-time order. (2) Transpose the graph (reverse all edges). (3) DFS on transposed graph in reverse finish-time order (pop from stack). Each DFS tree in pass 2 is one SCC.
def kosaraju(n, edges):
    adj = [[] for _ in range(n)]
    radj = [[] for _ in range(n)]
    for u, v in edges:
        adj[u].append(v)
        radj[v].append(u)
    visited = [False] * n
    order = []
    def dfs1(u):
        visited[u] = True
        for v in adj[u]:
            if not visited[v]:
                dfs1(v)
        order.append(u)  # post-order: u finishes after all descendants
    for i in range(n):
        if not visited[i]:
            dfs1(i)
    visited = [False] * n
    sccs = []
    def dfs2(u, comp):
        visited[u] = True
        comp.append(u)
        for v in radj[u]:
            if not visited[v]:
                dfs2(v, comp)
    while order:
        u = order.pop()  # reverse finish-time order
        if not visited[u]:
            comp = []
            dfs2(u, comp)
            sccs.append(comp)
    return sccs
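A standalone sanity check of the two-pass routine on a four-node graph containing one 3-cycle (the edge list is illustrative; components are sorted before printing since DFS order within an SCC is arbitrary):

```python
def kosaraju(n, edges):
    adj = [[] for _ in range(n)]
    radj = [[] for _ in range(n)]
    for u, v in edges:
        adj[u].append(v)
        radj[v].append(u)
    visited = [False] * n
    order = []
    def dfs1(u):
        visited[u] = True
        for v in adj[u]:
            if not visited[v]:
                dfs1(v)
        order.append(u)  # post-order finish times
    for i in range(n):
        if not visited[i]:
            dfs1(i)
    visited = [False] * n
    sccs = []
    def dfs2(u, comp):
        visited[u] = True
        comp.append(u)
        for v in radj[u]:
            if not visited[v]:
                dfs2(v, comp)
    while order:
        u = order.pop()  # reverse finish-time order
        if not visited[u]:
            comp = []
            dfs2(u, comp)
            sccs.append(comp)
    return sccs

# Cycle 0 -> 1 -> 2 -> 0, plus a one-way edge 2 -> 3
edges = [(0, 1), (1, 2), (2, 0), (2, 3)]
print(sorted(sorted(c) for c in kosaraju(4, edges)))  # [[0, 1, 2], [3]]
```

Node 3 is reachable from the cycle but cannot reach back, so it forms its own SCC; the condensation of this graph is the two-node DAG {0,1,2} -> {3}.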
See also: Meta Interview Prep
See also: Atlassian Interview Prep
See also: Databricks Interview Prep