Shortest Path Algorithms in Interviews
Shortest path problems ask: given a graph with weighted or unweighted edges, what is the minimum cost to travel from source to destination? Different algorithms apply based on graph characteristics: negative edges, all-pairs vs. single-source, dense vs. sparse graphs. Choosing the right algorithm is itself part of the interview answer.
Algorithm 1: BFS for Unweighted Graphs
For unweighted graphs (or graphs where all edges have equal weight), BFS finds the shortest path in O(V + E) — faster than any weighted algorithm. BFS explores level by level, so the first time you reach the destination, you’ve found the shortest path.
```python
from collections import deque

def bfs_shortest_path(graph, start, end):
    queue = deque([(start, [start])])  # (node, path taken so far)
    visited = {start}
    while queue:
        node, path = queue.popleft()
        if node == end:
            return path
        for neighbor in graph[node]:
            if neighbor not in visited:
                visited.add(neighbor)
                queue.append((neighbor, path + [neighbor]))
    return []  # no path
```
For grid problems (0/1 matrix, shortest path in maze): treat each cell as a node with 4 neighbors. BFS level = minimum steps from source.
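The grid formulation can be sketched directly; a minimal version assuming 0 = open cell and 1 = wall, counting minimum steps from the top-left to the bottom-right corner (the function name and conventions are illustrative):

```python
from collections import deque

def grid_shortest_path(grid):
    """Minimum steps from top-left to bottom-right in a 0/1 matrix.
    Assumes 0 = open cell, 1 = wall; returns -1 if unreachable."""
    n, m = len(grid), len(grid[0])
    if grid[0][0] == 1 or grid[n - 1][m - 1] == 1:
        return -1
    queue = deque([(0, 0, 0)])  # (row, col, steps from source)
    visited = {(0, 0)}
    while queue:
        r, c, steps = queue.popleft()
        if (r, c) == (n - 1, m - 1):
            return steps
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):  # 4 neighbors
            nr, nc = r + dr, c + dc
            if 0 <= nr < n and 0 <= nc < m and grid[nr][nc] == 0 \
                    and (nr, nc) not in visited:
                visited.add((nr, nc))
                queue.append((nr, nc, steps + 1))
    return -1
```

Marking cells visited at enqueue time (not at dequeue) keeps each cell in the queue at most once.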
Algorithm 2: Dijkstra’s Algorithm (Non-negative Weights)
Dijkstra finds the shortest path from a single source to all other vertices in a graph with non-negative edge weights. Uses a min-heap to always process the nearest unvisited vertex.
```python
import heapq

def dijkstra(graph, start):
    dist = {node: float('inf') for node in graph}
    dist[start] = 0
    heap = [(0, start)]  # (distance, node)
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist[u]:
            continue  # stale entry
        for v, weight in graph[u]:
            if dist[u] + weight < dist[v]:
                dist[v] = dist[u] + weight
                heapq.heappush(heap, (dist[v], v))
    return dist
```
Time: O((V + E) log V) with a binary heap. The "stale entry" check matters: without it, you reprocess nodes using outdated distances and redo their relaxations (with non-negative weights the answer is still correct, just slower). Once a node is popped with its final shortest distance, any later heap entries for it carry a larger distance and are skipped.
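A common follow-up is recovering the path itself, not just the distances. One way to sketch it, assuming the same adjacency-list format (graph[u] is a list of (neighbor, weight) pairs); the prev map and function name are illustrative:

```python
import heapq

def dijkstra_with_path(graph, start, end):
    """Returns (distance, path) from start to end; (inf, []) if unreachable.
    graph[u] is a list of (neighbor, weight) pairs, weights non-negative."""
    dist = {node: float('inf') for node in graph}
    prev = {}  # prev[v] = vertex before v on the best known path
    dist[start] = 0
    heap = [(0, start)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist[u]:
            continue  # stale entry
        for v, w in graph[u]:
            if d + w < dist[v]:
                dist[v] = d + w
                prev[v] = u
                heapq.heappush(heap, (dist[v], v))
    if dist[end] == float('inf'):
        return float('inf'), []
    path = [end]  # walk predecessors back to the source
    while path[-1] != start:
        path.append(prev[path[-1]])
    return dist[end], path[::-1]
```

Each relaxation records its predecessor, so the path is rebuilt backwards from the destination in O(path length).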
Algorithm 3: Bellman-Ford (Negative Weights)
Dijkstra fails with negative edges (can produce incorrect results). Bellman-Ford handles negative weights and also detects negative cycles.
```python
def bellman_ford(vertices, edges, start):
    dist = {v: float('inf') for v in vertices}
    dist[start] = 0
    for _ in range(len(vertices) - 1):  # relax V-1 times
        for u, v, w in edges:
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
    # Check for negative cycles: if any edge still relaxes, a cycle exists
    for u, v, w in edges:
        if dist[u] + w < dist[v]:
            return None  # negative cycle detected
    return dist
```
Time: O(V * E). Much slower than Dijkstra — only use when negative edges exist or negative cycle detection is needed. Classic application: arbitrage detection in currency exchange (negative log-weights).
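The arbitrage application can be sketched as follows. A cycle of exchange rates is profitable iff the product of its rates exceeds 1, which is exactly a negative cycle in the -log(rate) graph. The function name, rate-table format, and floating-point tolerance are illustrative:

```python
import math

def has_arbitrage(currencies, rates):
    """rates[(a, b)] = units of b per unit of a. Arbitrage exists iff the
    product of rates around some cycle exceeds 1, i.e. the -log(rate)
    graph contains a negative cycle (Bellman-Ford detection)."""
    edges = [(a, b, -math.log(r)) for (a, b), r in rates.items()]
    # Start all distances at 0 so cycles anywhere in the graph are caught
    dist = {c: 0.0 for c in currencies}
    for _ in range(len(currencies) - 1):
        for u, v, w in edges:
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
    # One more pass: any further relaxation implies a negative cycle
    return any(dist[u] + w < dist[v] - 1e-12 for u, v, w in edges)
```

Initializing every distance to 0 (instead of a single source) detects a negative cycle reachable from anywhere.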
Algorithm 4: Floyd-Warshall (All-Pairs Shortest Path)
Finds shortest paths between ALL pairs of vertices in O(V³). Works with negative edges (but not negative cycles). Use when V is small (≤ 500) and you need all-pairs distances.
```python
def floyd_warshall(n, edges):
    INF = float('inf')
    dist = [[INF] * n for _ in range(n)]
    for i in range(n):
        dist[i][i] = 0
    for u, v, w in edges:
        dist[u][v] = w  # directed; add dist[v][u] = w for undirected
    for k in range(n):          # intermediate vertex
        for i in range(n):      # source
            for j in range(n):  # destination
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
    return dist
```
The key insight: dist[i][j] via intermediate k = min(dist[i][j] directly, dist[i][k] + dist[k][j]). Try all V vertices as intermediates in sequence.
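A frequent follow-up is reconstructing the actual path, which a first-hop matrix handles; a sketch under the same edge-list conventions (the nxt matrix and helper names are illustrative):

```python
def floyd_warshall_with_paths(n, edges):
    """All-pairs distances plus nxt[i][j] = first hop on a shortest i->j
    path, for path reconstruction. Vertices are 0..n-1, edges directed."""
    INF = float('inf')
    dist = [[INF] * n for _ in range(n)]
    nxt = [[None] * n for _ in range(n)]
    for i in range(n):
        dist[i][i] = 0
        nxt[i][i] = i
    for u, v, w in edges:
        dist[u][v] = w
        nxt[u][v] = v
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
                    nxt[i][j] = nxt[i][k]  # go toward k first
    return dist, nxt

def reconstruct(nxt, i, j):
    """Rebuild the vertex sequence from i to j; [] if unreachable."""
    if nxt[i][j] is None:
        return []
    path = [i]
    while i != j:
        i = nxt[i][j]
        path.append(i)
    return path
```

Whenever a route through k improves dist[i][j], the first hop of i→j becomes the first hop of i→k, so following nxt hops replays the whole path.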
Algorithm 5: 0-1 BFS (Mixed 0/1 Weights)
When edges have weight 0 or 1, use a deque instead of a priority queue: add weight-0 neighbors to the front, weight-1 neighbors to the back. O(V + E) — faster than Dijkstra’s O((V+E) log V).
```python
from collections import deque

def bfs_01(graph, start, end):
    dist = {node: float('inf') for node in graph}
    dist[start] = 0
    dq = deque([start])
    while dq:
        u = dq.popleft()
        for v, w in graph[u]:
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
                if w == 0:
                    dq.appendleft(v)  # weight-0 edge: front of deque
                else:
                    dq.append(v)      # weight-1 edge: back of deque
    return dist[end]
```
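A classic problem in this family is minimizing the walls broken on the way through a grid; a sketch assuming cells are 0 (open) or 1 (wall) and each move costs the value of the cell entered (function name is illustrative):

```python
from collections import deque

def min_walls_to_break(grid):
    """Minimum number of 1-cells to pass through going from top-left to
    bottom-right, moving in 4 directions. Edge weight = value of the cell
    entered (0 or 1), so 0-1 BFS applies directly."""
    n, m = len(grid), len(grid[0])
    INF = float('inf')
    dist = [[INF] * m for _ in range(n)]
    dist[0][0] = grid[0][0]
    dq = deque([(0, 0)])
    while dq:
        r, c = dq.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < n and 0 <= nc < m:
                w = grid[nr][nc]  # cost of stepping into the cell
                if dist[r][c] + w < dist[nr][nc]:
                    dist[nr][nc] = dist[r][c] + w
                    if w == 0:
                        dq.appendleft((nr, nc))
                    else:
                        dq.append((nr, nc))
    return dist[n - 1][m - 1]
```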
Choosing the Right Algorithm
| Graph type | Algorithm | Complexity |
|---|---|---|
| Unweighted | BFS | O(V + E) |
| Non-negative weights, single source | Dijkstra | O((V+E) log V) |
| Negative weights, single source | Bellman-Ford | O(V * E) |
| All-pairs, small graph | Floyd-Warshall | O(V³) |
| 0/1 weights | 0-1 BFS | O(V + E) |
| DAG | Topological sort + relax | O(V + E) |
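The DAG row has no code elsewhere in this section; a sketch of topological sort plus relaxation, using Kahn's algorithm for the ordering (names are illustrative). It also handles negative weights, since a DAG has no cycles at all:

```python
from collections import deque

def dag_shortest_path(n, edges, start):
    """Single-source shortest paths in a DAG: topologically sort (Kahn's
    algorithm), then relax each vertex's outgoing edges once in that
    order. Vertices are 0..n-1; edges are (u, v, w) and may be negative."""
    adj = [[] for _ in range(n)]
    indeg = [0] * n
    for u, v, w in edges:
        adj[u].append((v, w))
        indeg[v] += 1
    order = []
    dq = deque(v for v in range(n) if indeg[v] == 0)
    while dq:
        u = dq.popleft()
        order.append(u)
        for v, _ in adj[u]:
            indeg[v] -= 1
            if indeg[v] == 0:
                dq.append(v)
    dist = [float('inf')] * n
    dist[start] = 0
    for u in order:  # relax in topological order: each edge exactly once
        if dist[u] == float('inf'):
            continue  # not reachable from start
        for v, w in adj[u]:
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
    return dist
```

By the time a vertex is relaxed, every path into it has already been considered, which is why one pass in topological order suffices.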
Common Interview Problem Patterns
- Shortest path in a grid: BFS (unweighted) or Dijkstra (weighted cells)
- Network delay time: Dijkstra from source, return max distance
- Cheapest flights within K stops: modified Dijkstra or Bellman-Ford with stop count
- Word ladder: BFS on implicit graph (each word = node, edges between words differing by 1 char)
- Path with minimum effort: Dijkstra where edge weight = |height difference|
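The Bellman-Ford variant for the K-stops pattern above can be sketched like this; the snapshot copy is the key detail, since it keeps each round from chaining more than one new edge onto a path:

```python
def cheapest_flight(n, flights, src, dst, k):
    """Cheapest price from src to dst using at most k intermediate stops
    (i.e. at most k + 1 edges); returns -1 if unreachable. flights is a
    list of (u, v, price) directed edges over vertices 0..n-1."""
    INF = float('inf')
    dist = [INF] * n
    dist[src] = 0
    for _ in range(k + 1):  # each round allows one more edge per path
        prev = dist[:]  # relax against last round's snapshot only
        for u, v, w in flights:
            if prev[u] + w < dist[v]:
                dist[v] = prev[u] + w
    return dist[dst] if dist[dst] < INF else -1
```

After round i, dist[v] is the cheapest cost to reach v using at most i edges, giving O(K * E) total.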
Interview Tips
- Always clarify: weighted or unweighted? Negative weights? Single-source or all-pairs? This determines your algorithm choice.
- The stale entry check in Dijkstra (if d > dist[u]: continue) is frequently forgotten — mention it explicitly.
- For grid shortest path: start with BFS, upgrade to Dijkstra only if cells have different weights.
- Bellman-Ford is O(VE) not O(V²E) — V-1 relaxation passes each touching E edges = O(VE).
Frequently Asked Questions

Why does Dijkstra fail with negative edge weights?
Dijkstra relies on a greedy invariant: once a vertex is popped from the min-heap, its distance is final. This holds only when all edge weights are non-negative; a negative edge lets a later path undercut an already-finalized distance. Counter-example: nodes A, B, C with edges A→C=2, A→B=3, B→C=-2. A version that marks each popped vertex as finalized pops C at distance 2 and never revisits it, then pops B at distance 3; the relaxation 3 + (-2) = 1 would improve C, but C is already done, so the algorithm reports 2 while the true shortest path A→B→C costs 1. (The lazy-deletion variant shown earlier happens to re-relax and recover here, but with negative edges it loses its runtime guarantee and can degrade badly.) Bellman-Ford handles negative weights by relaxing all edges V-1 times with no greedy finalization step.

What is the time complexity of Dijkstra and when does it matter?
With a binary heap (Python's heapq): O((V + E) log V). Under lazy deletion (the "stale entry" pattern), edges can push duplicate heap entries, giving O(E log E) in the worst case, which equals O(E log V) since E ≤ V². A Fibonacci heap achieves O(E + V log V), asymptotically better for dense graphs but rarely implemented in practice. When it matters: for sparse graphs (E ≈ V) the log factor is minor; for dense graphs (E ≈ V²) with all-pairs queries, Floyd-Warshall's O(V³) can be competitive. In interviews, use the binary heap version and state O((V + E) log V).

How do you find the shortest path with at most K stops (Cheapest Flights)?
The problem asks for the cheapest path from source to destination using at most K intermediate stops (K + 1 edges). Standard Dijkstra does not track stop count and may commit to a cheap path that violates the constraint. Two approaches: (1) Modified Bellman-Ford: run K + 1 rounds of relaxation (instead of V-1), relaxing each round against a snapshot of the previous round's distances so that after round i, dist[v] is the minimum cost using at most i edges. This is O(K * E). (2) Modified Dijkstra on states (cost, node, stops_remaining), expanding neighbors only while stops remain, roughly O(K * E log(K * V)). The Bellman-Ford variant is simpler and the usual choice for this pattern.