Sorting Algorithms Interview Patterns: Merge Sort, Quick Sort, Heap Sort, Counting Sort (2025)

Why Sorting Matters in Interviews

Sorting is foundational: dozens of problems become O(n log n) after sorting (two-sum, 3-sum, interval merge, meeting rooms, closest pairs). Know the trade-offs: stability, in-place vs extra space, best/average/worst case. Interviewers often ask why you chose a specific sort or expect you to implement one.
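As a quick illustration of the "sort first, then go linear" pattern mentioned above, here is a minimal two-pointer pair-sum sketch (the function name and return convention are illustrative, not from the original):

```python
def pair_sum_sorted(nums, target):
    """Find two values summing to target: O(n log n) sort + O(n) two-pointer scan."""
    nums = sorted(nums)
    lo, hi = 0, len(nums) - 1
    while lo < hi:
        s = nums[lo] + nums[hi]
        if s == target:
            return (nums[lo], nums[hi])
        if s < target:
            lo += 1      # need a bigger sum: advance the low pointer
        else:
            hi -= 1      # need a smaller sum: retreat the high pointer
    return None
```

The same sorted-plus-pointers skeleton underlies 3-sum, closest pair, and interval problems.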

Merge Sort

Divide the array in half recursively until single elements; merge sorted halves. Merge: two-pointer merge of two sorted arrays into a third. Time: O(n log n) guaranteed (all cases). Space: O(n) auxiliary. Stable. Use merge sort when: stability is required, you are sorting linked lists (no random access needed), external sorting (data does not fit in memory — merge sorted chunks from disk).

def merge_sort(arr):
    # Base case: a list of 0 or 1 elements is already sorted.
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    return merge(left, right)

def merge(left, right):
    # Two-pointer merge of two sorted lists; <= keeps the sort stable.
    result, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    return result + left[i:] + right[j:]

Quick Sort

Choose a pivot; partition the array into elements less than the pivot, the pivot, and elements greater; recurse on both partitions. Time: O(n log n) average, O(n^2) worst case (e.g., an already-sorted array with a naive first-element pivot). Space: O(log n) stack on average. Not stable. In practice it is often faster than merge sort due to cache locality and low constant factors. Mitigate the worst case with random pivot selection or median-of-three. Java uses dual-pivot quicksort for primitive arrays; Python's built-in sort is Timsort (merge sort + insertion sort), not quicksort.
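A minimal in-place sketch using a random pivot and Lomuto partitioning (one of several valid partition schemes):

```python
import random

def quick_sort(arr, lo=0, hi=None):
    """In-place quicksort; random pivot guards against the O(n^2) sorted-input case."""
    if hi is None:
        hi = len(arr) - 1
    if lo >= hi:
        return arr
    # Swap a random pivot to the end, then Lomuto partition around it.
    p = random.randint(lo, hi)
    arr[p], arr[hi] = arr[hi], arr[p]
    pivot, i = arr[hi], lo
    for j in range(lo, hi):
        if arr[j] < pivot:
            arr[i], arr[j] = arr[j], arr[i]
            i += 1
    arr[i], arr[hi] = arr[hi], arr[i]   # pivot lands at its final sorted index i
    quick_sort(arr, lo, i - 1)
    quick_sort(arr, i + 1, hi)
    return arr
```

With many duplicate keys, a three-way (Dutch national flag) partition avoids quadratic behavior that Lomuto can still exhibit.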

Heap Sort

Build a max-heap from the array (O(n) via heapify). Repeatedly extract the max (O(log n)) and place at the end. Time: O(n log n) guaranteed. Space: O(1) in-place. Not stable. Less used in practice due to poor cache performance (non-sequential memory access). Good when guaranteed O(n log n) and O(1) space are both required.
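The two phases above (bottom-up heapify, then repeated extract-max) can be sketched in place like so:

```python
def heap_sort(arr):
    """In-place heap sort: O(n) heapify, then n extract-max steps, O(1) extra space."""
    n = len(arr)

    def sift_down(root, end):
        # Push arr[root] down until the max-heap property holds within arr[:end].
        while (child := 2 * root + 1) < end:
            if child + 1 < end and arr[child + 1] > arr[child]:
                child += 1                      # pick the larger child
            if arr[root] >= arr[child]:
                break
            arr[root], arr[child] = arr[child], arr[root]
            root = child

    for i in range(n // 2 - 1, -1, -1):         # build max-heap bottom-up: O(n)
        sift_down(i, n)
    for end in range(n - 1, 0, -1):             # move max to the back, shrink heap
        arr[0], arr[end] = arr[end], arr[0]
        sift_down(0, end)
    return arr
```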

Counting Sort and Radix Sort

Counting Sort: for integers in range [0, k]. Count occurrences, compute prefix sums, place elements. Time O(n+k), space O(k). Stable. Use when k is small (e.g., sorting by age, grade). Not comparison-based — bypasses the O(n log n) comparison lower bound.
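The count / prefix-sum / place steps look like this (note the output array adds O(n) space on top of the O(k) count array):

```python
def counting_sort(arr, k):
    """Stable counting sort for integers in [0, k]: O(n + k) time."""
    count = [0] * (k + 1)
    for x in arr:
        count[x] += 1
    for i in range(1, k + 1):        # prefix sums: count[v] = # of elements <= v
        count[i] += count[i - 1]
    out = [0] * len(arr)
    for x in reversed(arr):          # reversed scan keeps equal keys in input order
        count[x] -= 1
        out[count[x]] = x
    return out
```

The reversed placement pass is what makes the sort stable, which matters when it is used as the per-digit subroutine in radix sort below.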

Radix Sort: sort by each digit from least significant to most significant (LSD) using counting sort as the stable subroutine. Time O(d*(n+k)) where d = number of digits, k = base (10 for decimal). Used for sorting large integers or strings of fixed length. O(n) for fixed-size integers.
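A minimal LSD sketch for non-negative integers, with the stable counting pass inlined per digit:

```python
def radix_sort(arr, base=10):
    """LSD radix sort for non-negative ints: d passes of a stable counting sort."""
    if not arr:
        return arr
    exp = 1
    while max(arr) // exp > 0:        # one pass per digit, least significant first
        count = [0] * base
        for x in arr:
            count[(x // exp) % base] += 1
        for i in range(1, base):
            count[i] += count[i - 1]
        out = [0] * len(arr)
        for x in reversed(arr):       # stability across passes is what makes LSD work
            d = (x // exp) % base
            count[d] -= 1
            out[count[d]] = x
        arr = out
        exp *= base
    return arr
```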

Timsort (Python / Java default)

Hybrid of merge sort and insertion sort. Identifies naturally occurring sorted runs in the input (many real-world arrays are partially sorted) and merges them using a stack-based strategy. O(n) on already-sorted inputs, O(n log n) worst case. Stable. This is why Python's sorted()/list.sort() and Java's Arrays.sort(Object[]) are O(n) in the best case.

When to Use Which Sort

Sort          | Time            | Space    | Stable | Use When
Merge Sort    | O(n log n)      | O(n)     | Yes    | Stability needed, linked lists, external sort
Quick Sort    | O(n log n) avg  | O(log n) | No     | General purpose, primitives
Heap Sort     | O(n log n)      | O(1)     | No     | Guaranteed O(n log n) + O(1) space
Counting Sort | O(n + k)        | O(k)     | Yes    | Small integer range
Radix Sort    | O(d * (n + k))  | O(n + k) | Yes    | Fixed-length integers/strings

Key Interview Points

  • Sorting a list of objects by multiple keys: use Python's key=lambda x: (x.a, x.b) — and because Timsort is stable, multi-pass sorting (sort by the secondary key, then the primary) also works
  • Finding the kth largest element: QuickSelect — O(n) average, O(n^2) worst — or a min-heap of size k
  • External sort: merge sort chunks that fit in memory; merge sorted files using a k-way merge with a min-heap
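The QuickSelect idea from the bullets above can be sketched as follows (random pivot, Lomuto partition; names are illustrative):

```python
import random

def kth_largest(nums, k):
    """QuickSelect: average O(n) kth-largest without fully sorting."""
    nums = list(nums)
    target = len(nums) - k              # index of the kth largest in sorted order
    lo, hi = 0, len(nums) - 1
    while True:
        p = random.randint(lo, hi)      # random pivot avoids the O(n^2) worst case on adversarial input
        nums[p], nums[hi] = nums[hi], nums[p]
        pivot, i = nums[hi], lo
        for j in range(lo, hi):         # Lomuto partition around the pivot
            if nums[j] < pivot:
                nums[i], nums[j] = nums[j], nums[i]
                i += 1
        nums[i], nums[hi] = nums[hi], nums[i]
        if i == target:
            return nums[i]
        if i < target:                  # recurse into one side only, unlike quicksort
            lo = i + 1
        else:
            hi = i - 1
```

The heap alternative is a one-liner in Python: heapq.nlargest(k, nums)[-1], which runs in O(n log k).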

Asked at: Meta, Apple, LinkedIn, Coinbase.
