- Learn the 4-step process for analyzing algorithm complexity
- Analyze common patterns: single loops, nested loops, and recursive functions
- Distinguish between best-case, worst-case, and average-case complexity
- Understand both time and space complexity in algorithm analysis
Analyzing Simple Algorithms
You know the notation. Now let's actually analyze some algorithms and see how this works in practice.
The process
Here's how you approach it:
- Identify the input size (usually n)
- Count operations in terms of n
- Find the dominant term
- Drop constants and lower-order terms
Let's walk through some examples.
Finding the maximum in an array
def find_max(arr):
    max_val = arr[0]                  # O(1)
    for i in range(1, len(arr)):      # loop runs n-1 times
        if arr[i] > max_val:          # O(1) comparison
            max_val = arr[i]          # O(1) assignment
    return max_val
The loop executes n-1 times (we start at index 1, not 0). Each iteration does O(1) work — a comparison and maybe an assignment. Total: O(n-1), which simplifies to O(n) since we drop constants.
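You can check that count empirically. Here's a small instrumented sketch (the `find_max_counted` helper is hypothetical, added just for counting):

```python
def find_max_counted(arr):
    """find_max with a comparison counter, to verify the n-1 estimate."""
    comparisons = 0
    max_val = arr[0]
    for i in range(1, len(arr)):
        comparisons += 1          # one comparison per loop iteration
        if arr[i] > max_val:
            max_val = arr[i]
    return max_val, comparisons

# A 10-element array triggers exactly 9 comparisons: n - 1.
val, ops = find_max_counted([3, 7, 1, 9, 4, 2, 8, 5, 6, 0])
print(val, ops)  # 9 9
```

The count grows linearly with the input, which is exactly what O(n) predicts.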
Linear search
def linear_search(arr, target):
    for i in range(len(arr)):         # n iterations
        if arr[i] == target:          # O(1) comparison
            return i                  # O(1)
    return -1                         # O(1)
This one's interesting because it depends on where the target is. Best case: you find it immediately — O(1). Worst case: it's not there or it's at the end — O(n). Average case: somewhere in the middle, still O(n). Big O typically describes worst case, so we say O(n).
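A counting version makes the best/worst gap concrete (again, the `_counted` helper is just an illustrative sketch):

```python
def linear_search_counted(arr, target):
    """linear_search with a comparison counter (illustrative helper)."""
    comparisons = 0
    for i in range(len(arr)):
        comparisons += 1
        if arr[i] == target:
            return i, comparisons
    return -1, comparisons

data = list(range(100))                  # [0, 1, ..., 99]
print(linear_search_counted(data, 0))    # (0, 1)    best case: found immediately
print(linear_search_counted(data, 99))   # (99, 100) worst case: found at the end
print(linear_search_counted(data, -5))   # (-1, 100) worst case: not present at all
```

One comparison in the best case, all n in the worst — same code, very different behavior depending on the input.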
Nested loops - printing a matrix
def print_matrix(matrix):
    for i in range(len(matrix)):          # n rows
        for j in range(len(matrix[i])):   # n columns
            print(matrix[i][j])           # O(1)
Assuming a square n × n matrix, the outer loop runs n times, and for each outer iteration the inner loop runs n times. That's n × n = n² total operations. This is O(n²). (For a rectangular r × c matrix, the same reasoning gives O(r × c).)
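A quick sketch that counts the inner-loop operations instead of printing (the `print_matrix_counted` helper is hypothetical, for counting only):

```python
def print_matrix_counted(matrix):
    """Count the O(1) inner-loop operations instead of printing them."""
    ops = 0
    for i in range(len(matrix)):
        for j in range(len(matrix[i])):
            ops += 1
    return ops

# A 4x4 matrix does 4 * 4 = 16 unit operations: n * n.
square = [[0] * 4 for _ in range(4)]
print(print_matrix_counted(square))  # 16

# A rectangular 3x5 matrix does 3 * 5 = 15: r * c in general.
rect = [[0] * 5 for _ in range(3)]
print(print_matrix_counted(rect))    # 15
```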
Binary search
def binary_search(arr, target):
    left, right = 0, len(arr) - 1
    while left <= right:              # log n iterations
        mid = (left + right) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            left = mid + 1
        else:
            right = mid - 1
    return -1
Each iteration cuts the search space in half. Start with n elements, then n/2, then n/4, and so on. How many times can you halve n before you get to 1? That's log₂(n). So this is O(log n).
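You can verify the log₂(n) claim by counting iterations on a failed search, which forces the loop to run until the search space is empty (the `_counted` helper is an illustrative sketch):

```python
import math

def binary_search_counted(arr, target):
    """binary_search with an iteration counter (illustrative helper)."""
    left, right = 0, len(arr) - 1
    iterations = 0
    while left <= right:
        iterations += 1
        mid = (left + right) // 2
        if arr[mid] == target:
            return mid, iterations
        elif arr[mid] < target:
            left = mid + 1
        else:
            right = mid - 1
    return -1, iterations

data = list(range(1024))                    # n = 1024, log2(n) = 10
_, iters = binary_search_counted(data, -1)  # miss: full halving sequence
print(iters, math.log2(len(data)))          # 10 10.0
```

Ten iterations for 1024 elements. Doubling the array to 2048 would add just one more iteration — that's the power of O(log n).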
Patterns you'll see everywhere
Single loop:
for i in range(n):        # O(n)
    do_something()        # O(1)
This is O(n) — you're doing constant work n times.
Nested loops (same size):
for i in range(n):        # O(n)
    for j in range(n):    # O(n)
        do_something()    # O(1)
This is O(n²) — n times n.
Nested loops (different sizes):
for i in range(n):        # O(n)
    for j in range(i):    # runs i times; totals sum to O(n²)
        do_something()    # O(1)
Even though the inner loop varies (0, 1, 2, ..., n-1 iterations), the total across all outer iterations is 0 + 1 + ... + (n-1) = n(n-1)/2, roughly n²/2, which simplifies to O(n²) once we drop the constant factor.
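Here's a quick sketch that counts the inner-loop executions and checks them against the n(n-1)/2 formula (the `triangular_ops` helper is hypothetical, for counting only):

```python
def triangular_ops(n):
    """Count inner-loop executions for the varying nested loop."""
    ops = 0
    for i in range(n):
        for j in range(i):    # runs i times on outer iteration i
            ops += 1
    return ops

n = 100
print(triangular_ops(n))     # 4950
print(n * (n - 1) // 2)      # 4950 — matches n(n-1)/2, roughly n²/2
```

Quadrupling when n doubles is the signature of O(n²), even with the halved constant.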
Divide and conquer:
def recursive_function(n):
    if n <= 1:
        return
    recursive_function(n // 2)    # halving each time (integer division)
    do_work()                     # O(1)
Halving repeatedly gives you O(log n) depth.
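You can measure that depth directly by threading a counter through the recursion (the `halving_depth` helper is an illustrative sketch):

```python
def halving_depth(n, depth=0):
    """Count how many halvings it takes for n to reach the base case."""
    if n <= 1:
        return depth
    return halving_depth(n // 2, depth + 1)   # integer halving each call

print(halving_depth(1024))   # 10 — 1024 halves to 1 in log2(1024) steps
```

The call stack is only 10 frames deep for n = 1024, and would be about 20 for n around a million: logarithmic depth.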
When complexity depends on input
def complex_function(arr):
    if len(arr) > 100:            # O(1)
        return sort_large(arr)    # O(n log n)
    else:
        return sort_small(arr)    # O(n²)
Strictly speaking, Big O describes behavior as n grows, and every input above the threshold takes the fast path — so asymptotically this is O(n log n). But the single class hides real behavior: for small arrays the code actually runs the O(n²) branch. It's often clearer to describe it piecewise: "O(n²) for n ≤ 100, O(n log n) otherwise."
Don't forget space complexity
Memory matters too:
def create_copy(arr):
    copy = []                 # O(1)
    for item in arr:          # O(n) iterations
        copy.append(item)     # O(1) amortized
    return copy               # we've created an n-sized array
Time: O(n) — one pass through the array.
Space: O(n) — we're creating a new array the same size as the input.
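The same task can have very different space costs. As a sketch, compare reversing a list by copying versus reversing it in place (both helpers are hypothetical examples):

```python
def reverse_copy(arr):
    """O(n) time, O(n) extra space: builds a new reversed list."""
    result = []
    for item in reversed(arr):
        result.append(item)
    return result

def reverse_in_place(arr):
    """O(n) time, O(1) extra space: swaps elements inside the input."""
    left, right = 0, len(arr) - 1
    while left < right:
        arr[left], arr[right] = arr[right], arr[left]
        left += 1
        right -= 1
    return arr

data = [1, 2, 3, 4]
print(reverse_copy(data))      # [4, 3, 2, 1] — data unchanged
print(reverse_in_place(data))  # [4, 3, 2, 1] — data itself mutated
```

Both are O(n) in time, but only the second uses constant extra memory — a trade-off worth noting whenever mutation of the input is acceptable.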
When you're analyzing algorithms, focus on loops — they dominate complexity. Count how many times work repeats. Nested loops multiply complexity. Consider best, worst, and average cases. With recursion, think about how many calls happen and how deep the stack gets.
The more you practice this, the more automatic it becomes. You'll start seeing O(n) and O(n²) patterns just by glancing at code.
