What you'll learn:
  • Understand how merge sort divides, conquers recursively, and merges efficiently
  • See why merge sort is O(n log n) guaranteed with O(n) space
  • Learn when to use merge sort: stability, guaranteed performance, and linked lists

Merge Sort Algorithm

You have a pile of 1000 unsorted numbers. Sorting them by comparing every pair would take roughly 500,000 comparisons. That's bubble sort.

Merge sort does it in about 10,000 comparisons. How?

Split the pile in half. Sort each half (recursively). Merge the two sorted halves.

That's it. The magic is in the merging.

The Algorithm

def merge_sort(arr):
    # Base case: arrays of 0 or 1 elements are already sorted
    if len(arr) <= 1:
        return arr
    
    # Divide: split in half
    mid = len(arr) // 2
    left = arr[:mid]
    right = arr[mid:]
    
    # Conquer: sort each half recursively
    left = merge_sort(left)
    right = merge_sort(right)
    
    # Combine: merge the sorted halves
    return merge(left, right)

def merge(left, right):
    result = []
    i = j = 0
    
    # Compare elements from left and right, take the smaller
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    
    # Add remaining elements (one array is exhausted)
    result.extend(left[i:])
    result.extend(right[j:])
    
    return result
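
Before walking through the trace by hand, it can help to watch the recursion run. The sketch below is not part of the implementation above: merge_sort_traced is a hypothetical name, and Python's heapq.merge (a stable merge of sorted iterables) stands in for the merge function so the block stays self-contained. It prints each merge as it happens, indented by recursion depth.

```python
import heapq

def merge_sort_traced(arr, depth=0):
    # Same divide-and-conquer shape as merge_sort above; heapq.merge
    # stands in for the hand-written merge to keep this sketch short
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort_traced(arr[:mid], depth + 1)
    right = merge_sort_traced(arr[mid:], depth + 1)
    merged = list(heapq.merge(left, right))
    # Indent by depth so the shape of the recursion tree is visible
    print("  " * depth + f"merge {left} + {right} -> {merged}")
    return merged

merge_sort_traced([38, 27, 43, 3, 9, 82, 10])
```

The printed merges correspond exactly to the trace that follows.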

Step-by-Step Trace

Sort [38, 27, 43, 3, 9, 82, 10].

Level 0 (original): [38, 27, 43, 3, 9, 82, 10]

Split at mid=3:

  • Left: [38, 27, 43]
  • Right: [3, 9, 82, 10]

Level 1 - Sort left [38, 27, 43]:

Split at mid=1:

  • Left: [38]
  • Right: [27, 43]

[38] is size 1, already sorted.

Level 2 - Sort [27, 43]:

Split at mid=1:

  • Left: [27]
  • Right: [43]

Both size 1, already sorted.

Merge [27] and [43]:

Compare 27 and 43 → take 27
Result: [27, 43]

Back to Level 1: Merge [38] and [27, 43]:

Compare 38 and 27 → take 27
Compare 38 and 43 → take 38
Take 43
Result: [27, 38, 43]

Left half is sorted: [27, 38, 43]

Level 1 - Sort right [3, 9, 82, 10]:

Split at mid=2:

  • Left: [3, 9]
  • Right: [82, 10]

Level 2 - Sort [3, 9]:

Split at mid=1:

  • Left: [3]
  • Right: [9]

Both size 1, already sorted.

Merge [3] and [9]:

Compare 3 and 9 → take 3
Take 9
Result: [3, 9]

Level 2 - Sort [82, 10]:

Split at mid=1:

  • Left: [82]
  • Right: [10]

Both size 1, already sorted.

Merge [82] and [10]:

Compare 82 and 10 → take 10
Take 82
Result: [10, 82]

Back to Level 1: Merge [3, 9] and [10, 82]:

Compare 3 and 10 → take 3
Compare 9 and 10 → take 9
Compare 10 and 82 → take 10
Take 82
Result: [3, 9, 10, 82]

Right half is sorted: [3, 9, 10, 82]

Level 0 - Final merge: Merge [27, 38, 43] and [3, 9, 10, 82]:

Compare 27 and 3 → take 3
Compare 27 and 9 → take 9
Compare 27 and 10 → take 10
Compare 27 and 82 → take 27
Compare 38 and 82 → take 38
Compare 43 and 82 → take 43
Take 82
Result: [3, 9, 10, 27, 38, 43, 82]

Done. Sorted.

The Merge Process in Detail

Merging [27, 38, 43] and [3, 9, 10, 82]:

Left:  [27, 38, 43]    i=0
Right: [3, 9, 10, 82]  j=0
Result: []

Step 1: left[0]=27, right[0]=3. 3 < 27 → take 3, j++
Result: [3]

Step 2: left[0]=27, right[1]=9. 9 < 27 → take 9, j++
Result: [3, 9]

Step 3: left[0]=27, right[2]=10. 10 < 27 → take 10, j++
Result: [3, 9, 10]

Step 4: left[0]=27, right[3]=82. 27 < 82 → take 27, i++
Result: [3, 9, 10, 27]

Step 5: left[1]=38, right[3]=82. 38 < 82 → take 38, i++
Result: [3, 9, 10, 27, 38]

Step 6: left[2]=43, right[3]=82. 43 < 82 → take 43, i++
Result: [3, 9, 10, 27, 38, 43]

i=3, left exhausted. Add remaining from right: [82]
Result: [3, 9, 10, 27, 38, 43, 82]

Each comparison takes the smaller element. When one array runs out, dump the rest of the other array.

Time Complexity

Divide: O(1) to find the midpoint.

Conquer: Two recursive calls on arrays of size n/2.

Combine: Merging takes O(n)—you touch each element once.

Recurrence: T(n) = 2T(n/2) + O(n)

This resolves to O(n log n).

Why? The recursion tree has log n levels (halving each time). Each level does O(n) work (merging all elements). Total: O(n) × log n = O(n log n).
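
You can check the bound empirically. The sketch below (merge_sort_count is an illustrative name, not part of the implementation above) counts element comparisons; for n = 1000 shuffled values the count stays below n·log₂(n) ≈ 9966.

```python
import math
import random

def merge_sort_count(arr):
    # Returns (sorted list, number of element comparisons performed)
    if len(arr) <= 1:
        return arr, 0
    mid = len(arr) // 2
    left, left_comps = merge_sort_count(arr[:mid])
    right, right_comps = merge_sort_count(arr[mid:])
    merged, i, j, comps = [], 0, 0, 0
    while i < len(left) and j < len(right):
        comps += 1  # exactly one comparison per loop iteration
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged, left_comps + right_comps + comps

n = 1000
data = random.sample(range(n), n)
result, comps = merge_sort_count(data)
print(comps, "comparisons; n*log2(n) =", round(n * math.log2(n)))
```

The exact count varies with the input order, but it never exceeds n·log₂(n).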

For n=1,000,000:

  • Merge sort: ~20 million operations
  • Bubble sort: ~500 billion operations

That's a 25,000x speedup.

Space Complexity

Merge sort uses O(n) extra space for temporary arrays during merging.

Each merge allocates a new result list, and the slices arr[:mid] and arr[mid:] allocate copies as well. But the recursion finishes one branch before starting the next, so the arrays alive at any moment sum to O(n) elements.

A common refinement allocates one auxiliary array of size n up front and reuses it for every merge.

This is merge sort's downside—it's not in-place like quicksort or heapsort.

Stability

Merge sort is stable: equal elements maintain their original relative order.

In the merge, when left[i] == right[j], we take from left first. This preserves order for equal elements.

Example: Sort [(3, a), (1, b), (3, c)] by the number.

Result: [(1, b), (3, a), (3, c)]

The two 3's maintain their original order (a before c). Stable.

Quicksort and heapsort are not stable (without extra work).
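
The tuple example can be reproduced with a key-aware variant of the merge sort above. This is a sketch (merge_sort_by and its key parameter are illustrative names); the one load-bearing detail is the <= on tie-break, which always takes from the left half first.

```python
def merge_sort_by(arr, key):
    # Stable merge sort ordering by key(x); ties keep original order
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort_by(arr[:mid], key)
    right = merge_sort_by(arr[mid:], key)
    result, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        # <= is what makes the sort stable: on a tie, the element from
        # the left half (earlier in the original array) goes first
        if key(left[i]) <= key(right[j]):
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    result.extend(left[i:])
    result.extend(right[j:])
    return result

pairs = [(3, "a"), (1, "b"), (3, "c")]
print(merge_sort_by(pairs, key=lambda p: p[0]))
```

Changing <= to < would still sort correctly, but equal keys could swap places, losing stability.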

Iterative Merge Sort

You can implement merge sort iteratively (bottom-up):

def merge_sort_iterative(arr):
    n = len(arr)
    size = 1  # Start with subarrays of size 1
    
    while size < n:
        for start in range(0, n, size * 2):
            mid = min(start + size, n)
            end = min(start + size * 2, n)
            
            # Merge arr[start:mid] and arr[mid:end]
            arr[start:end] = merge(arr[start:mid], arr[mid:end])
        
        size *= 2
    
    return arr

Start with size 1 (individual elements). Merge pairs to get size 2. Merge those to get size 4. Double size each iteration until you've merged everything.

Same O(n log n) time, but avoids recursion overhead.
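
Here's a self-contained sketch of the bottom-up idea (merge_sort_bottom_up is an illustrative name, and heapq.merge stands in for the merge function above). It prints the array after each pass so you can watch the sorted runs double in size.

```python
import heapq

def merge_sort_bottom_up(arr):
    # Bottom-up merge sort that prints the array after each pass
    arr = list(arr)  # work on a copy
    n, size = len(arr), 1
    while size < n:
        for start in range(0, n, size * 2):
            mid = min(start + size, n)
            end = min(start + size * 2, n)
            # The slices copy eagerly, so assigning back is safe
            arr[start:end] = heapq.merge(arr[start:mid], arr[mid:end])
        print(f"after merging runs of size {size}: {arr}")
        size *= 2
    return arr

merge_sort_bottom_up([38, 27, 43, 3, 9, 82, 10])
```

For the example array, the passes show runs of size 2, then 4, then the fully sorted result.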

When to Use Merge Sort

Guaranteed O(n log n): Unlike quicksort (which can degrade to O(n²)), merge sort is always O(n log n). Good for worst-case guarantees.

Stable sorting: When you need to preserve the order of equal elements.

Linked lists: Merge sort works well on linked lists because merging doesn't require random access. Quicksort struggles with linked lists.

External sorting: When data doesn't fit in memory. Sort chunks in memory with merge sort, then merge the sorted chunks from disk.
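
A minimal sketch of that two-phase idea, simulating disk-resident "runs" with in-memory lists (external_sort and chunk_size are illustrative names; a real implementation writes each sorted run to a file and streams the final merge from disk):

```python
import heapq

def external_sort(data, chunk_size=4):
    # Phase 1: sort each fixed-size chunk -- in real external sorting,
    # each chunk fits in RAM and is written back to disk as a sorted "run"
    runs = [sorted(data[i:i + chunk_size]) for i in range(0, len(data), chunk_size)]
    # Phase 2: k-way merge of the runs -- heapq.merge streams lazily,
    # holding only one element per run in memory at a time
    return list(heapq.merge(*runs))

print(external_sort([38, 27, 43, 3, 9, 82, 10, 5]))
```

The key property is that phase 2 never needs a whole run in memory, only the front element of each.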

When NOT to Use Merge Sort

Limited memory: O(n) extra space is a problem for constrained environments. Use in-place sorts like quicksort or heapsort.

Small arrays: For small n, the overhead of recursion and array copying makes merge sort slower than insertion sort.

Average case performance: Quicksort is often faster in practice (better cache locality, in-place). Merge sort's guarantee comes at a cost.

Optimizations

Switch to insertion sort for small subarrays: When a subarray gets small (< 10-20 elements), insertion sort is faster. Hybrid approach.

def merge_sort_optimized(arr):
    if len(arr) <= 15:
        return insertion_sort(arr)
    
    # ...rest of merge sort
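
Filled out, the hybrid might look like the sketch below (merge_sort_hybrid and the cutoff value are illustrative choices, not a canonical implementation):

```python
def insertion_sort(arr):
    # Fast for tiny arrays: low constant factors, no extra allocation
    for i in range(1, len(arr)):
        current = arr[i]
        j = i - 1
        while j >= 0 and arr[j] > current:
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = current
    return arr

def merge_sort_hybrid(arr, cutoff=15):
    # Below the cutoff, insertion sort's low overhead beats recursion
    if len(arr) <= cutoff:
        return insertion_sort(list(arr))
    mid = len(arr) // 2
    left = merge_sort_hybrid(arr[:mid], cutoff)
    right = merge_sort_hybrid(arr[mid:], cutoff)
    result, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    result.extend(left[i:])
    result.extend(right[j:])
    return result
```

The best cutoff depends on the language and hardware; values in the 10-20 range are typical, and the right number is found by benchmarking.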

In-place merge: It's possible to merge in-place with clever techniques, reducing space to O(log n) for the recursion stack. But it's complex and slower in practice.

Parallel merge sort: Sorting left and right halves can happen in parallel. Merge sort parallelizes well—each subproblem is independent.
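
A sketch of that task structure (parallel_merge_sort and the depth cutoff are illustrative). One caveat: in CPython, the GIL prevents threads from speeding up CPU-bound work, so a real speedup needs ProcessPoolExecutor or a GIL-free runtime; the block below only demonstrates how the halves fork and join.

```python
import heapq
from concurrent.futures import ThreadPoolExecutor

def parallel_merge_sort(arr, executor, depth=2):
    # Fork tasks only near the top of the recursion tree; below the
    # depth cutoff, plain sequential sorting avoids per-task overhead
    if depth == 0 or len(arr) <= 1:
        return sorted(arr)
    mid = len(arr) // 2
    # The left half sorts on a worker thread while this thread sorts
    # the right half -- the two subproblems are fully independent
    left_future = executor.submit(parallel_merge_sort, arr[:mid], executor, depth - 1)
    right = parallel_merge_sort(arr[mid:], executor, depth - 1)
    left = left_future.result()  # join before merging
    return list(heapq.merge(left, right))

with ThreadPoolExecutor(max_workers=4) as pool:
    result = parallel_merge_sort([38, 27, 43, 3, 9, 82, 10], pool)
print(result)
```

Keep max_workers at least as large as the number of forked tasks (here at most 3 with depth=2), since each forked task blocks waiting on its child.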

Merge Sort vs Quicksort

Aspect            Merge Sort    Quicksort
Time (worst)      O(n log n)    O(n²)
Time (average)    O(n log n)    O(n log n)
Space             O(n)          O(log n)
Stable            Yes           No
In-place          No            Yes
Cache locality    Poor          Good

Merge sort: predictable, stable, but uses extra space.
Quicksort: fast average case, in-place, but no guarantees.

Choose based on your constraints.

The Beauty

Merge sort is elegant. The code is simple. The analysis is clean. The performance is guaranteed.

It demonstrates divide and conquer perfectly: split the problem, solve recursively, merge efficiently.

Once you understand merge sort, you understand the paradigm. And you'll recognize where else to apply it—problems that split naturally and combine efficiently.

That's the power of divide and conquer.

© 2026 forEach. All rights reserved.
