- Understand FIFO (First In, First Out) with enqueue/dequeue in O(1)
- Avoid list.pop(0) which is O(n); use deque.popleft() instead
- Apply queues to BFS, task scheduling, buffering, level-order traversal
Introduction to Queues
You're building a print queue. Users submit documents. The printer processes them in order—first submitted, first printed.
A stack won't work. LIFO means the last document submitted prints first. Unfair.
You need FIFO: First In, First Out. That's a queue.
What's a Queue?
A queue is like waiting in line at a coffee shop. First person in line is first served.
FIFO: First thing you put in is the first thing you take out.
```python
from collections import deque

queue = deque()
queue.append('Person 1')  # Enqueue (add to back)
queue.append('Person 2')
queue.append('Person 3')
# Queue now: ['Person 1', 'Person 2', 'Person 3']
# Front is Person 1, Back is Person 3

person = queue.popleft()  # Dequeue (remove from front)
print(person)  # 'Person 1'
# Queue now: ['Person 2', 'Person 3']
```
Two operations:
- enqueue(x): Add x to back
- dequeue(): Remove and return front element
Both O(1) with proper implementation.
Implementation: Array (Naive)
Use a list. Add to back, remove from front.
```python
class Queue:
    def __init__(self):
        self.items = []

    def enqueue(self, item):
        self.items.append(item)  # O(1)

    def dequeue(self):
        if self.is_empty():
            raise Exception("Queue is empty")
        return self.items.pop(0)  # O(n) - shifts all elements!

    def is_empty(self):
        return len(self.items) == 0
```
Problem: pop(0) is O(n). Removing the first element shifts every remaining element one slot left, so dequeue gets slower as the queue grows.
For a queue, whose whole job is removing from the front, that's bad.
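A quick benchmark sketch makes the gap visible. Exact timings depend on your machine, but the shape of the result does not: draining a list from the front does quadratic total work, draining a deque does linear work.

```python
import timeit
from collections import deque

n = 20_000

def drain_list():
    items = list(range(n))
    while items:
        items.pop(0)  # O(n) per call: shifts all remaining elements left

def drain_deque():
    items = deque(range(n))
    while items:
        items.popleft()  # O(1) per call

list_time = timeit.timeit(drain_list, number=1)
deque_time = timeit.timeit(drain_deque, number=1)
print(f"list.pop(0):     {list_time:.4f}s")
print(f"deque.popleft(): {deque_time:.4f}s")
```

On a typical machine the list version is orders of magnitude slower, and the gap widens as n grows.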
Implementation: Deque (Better)
Use collections.deque, a double-ended queue (in CPython, a doubly linked list of fixed-size blocks) with O(1) appends and pops at both ends.
```python
from collections import deque

class Queue:
    def __init__(self):
        self.items = deque()

    def enqueue(self, item):
        self.items.append(item)  # O(1)

    def dequeue(self):
        if self.is_empty():
            raise Exception("Queue is empty")
        return self.items.popleft()  # O(1)

    def peek(self):
        if self.is_empty():
            return None
        return self.items[0]

    def is_empty(self):
        return len(self.items) == 0

    def size(self):
        return len(self.items)
```
Both enqueue and dequeue are O(1). Perfect.
Implementation: Circular Array
Use a fixed-size array with head and tail pointers.
```python
class Queue:
    def __init__(self, capacity):
        self.buffer = [None] * capacity
        self.capacity = capacity
        self.head = 0
        self.tail = 0
        self.size = 0

    def enqueue(self, item):
        if self.size == self.capacity:
            raise Exception("Queue is full")
        self.buffer[self.tail] = item
        self.tail = (self.tail + 1) % self.capacity
        self.size += 1

    def dequeue(self):
        if self.is_empty():
            raise Exception("Queue is empty")
        item = self.buffer[self.head]
        self.head = (self.head + 1) % self.capacity
        self.size -= 1
        return item

    def is_empty(self):
        return self.size == 0
```
O(1) for both. But fixed capacity.
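To watch the wraparound happen, here is a condensed, self-contained copy of the circular queue above (renamed `CircularQueue` so the snippet stands alone), driven past its first slot:

```python
# Condensed copy of the circular queue above, to watch indices wrap.
class CircularQueue:
    def __init__(self, capacity):
        self.buffer = [None] * capacity
        self.capacity = capacity
        self.head = self.tail = self.size = 0

    def enqueue(self, item):
        if self.size == self.capacity:
            raise Exception("Queue is full")
        self.buffer[self.tail] = item
        self.tail = (self.tail + 1) % self.capacity
        self.size += 1

    def dequeue(self):
        if self.size == 0:
            raise Exception("Queue is empty")
        item = self.buffer[self.head]
        self.head = (self.head + 1) % self.capacity
        self.size -= 1
        return item

q = CircularQueue(3)
for x in ('a', 'b', 'c'):
    q.enqueue(x)
print(q.dequeue())   # 'a'
q.enqueue('d')       # tail wraps from index 2 back to index 0
print(q.buffer)      # ['d', 'b', 'c'] - 'd' reused the freed slot
print(q.dequeue(), q.dequeue(), q.dequeue())  # b c d - FIFO order preserved
```

The modulo arithmetic is what lets a fixed array act like an endless queue: indices chase each other around the ring instead of marching off the end.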
Implementation: Linked List
Use a singly linked list. Head is front, tail is back.
```python
class Node:
    def __init__(self, data):
        self.data = data
        self.next = None

class Queue:
    def __init__(self):
        self.head = None
        self.tail = None
        self.size = 0

    def enqueue(self, item):
        new_node = Node(item)
        if self.tail is None:
            self.head = self.tail = new_node
        else:
            self.tail.next = new_node
            self.tail = new_node
        self.size += 1

    def dequeue(self):
        if self.is_empty():
            raise Exception("Queue is empty")
        data = self.head.data
        self.head = self.head.next
        if self.head is None:
            self.tail = None
        self.size -= 1
        return data

    def is_empty(self):
        return self.head is None
```
O(1) for both. Dynamic size.
Why Queues?
Task scheduling: Process tasks in order. CPU scheduler, print queue, job queue.
```python
task_queue = Queue()

def submit_task(task):
    task_queue.enqueue(task)

def process_tasks():
    while not task_queue.is_empty():
        task = task_queue.dequeue()
        execute(task)
```
Breadth-First Search (BFS): Explore a graph level by level.
```python
def bfs(graph, start):
    visited = set()
    queue = Queue()
    queue.enqueue(start)
    visited.add(start)
    while not queue.is_empty():
        node = queue.dequeue()
        print(node)
        for neighbor in graph[node]:
            if neighbor not in visited:
                visited.add(neighbor)
                queue.enqueue(neighbor)

graph = {
    'A': ['B', 'C'],
    'B': ['D', 'E'],
    'C': ['F'],
    'D': [],
    'E': [],
    'F': []
}

bfs(graph, 'A')  # A, B, C, D, E, F (level order)
```
BFS uses a queue. DFS uses a stack. Different traversal order.
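To see that claim concretely, here is a sketch with one traversal loop and the same graph as above, parameterized only by which end of a deque it pops from. Popping the front gives BFS order; popping the back gives DFS order.

```python
from collections import deque

graph = {
    'A': ['B', 'C'],
    'B': ['D', 'E'],
    'C': ['F'],
    'D': [], 'E': [], 'F': []
}

def traverse(graph, start, fifo):
    """One loop, two orders: popleft (queue -> BFS) or pop (stack -> DFS)."""
    frontier = deque([start])
    visited = {start}
    order = []
    while frontier:
        node = frontier.popleft() if fifo else frontier.pop()
        order.append(node)
        for neighbor in graph[node]:
            if neighbor not in visited:
                visited.add(neighbor)
                frontier.append(neighbor)
    return order

print(traverse(graph, 'A', fifo=True))   # ['A', 'B', 'C', 'D', 'E', 'F'] - level by level
print(traverse(graph, 'A', fifo=False))  # ['A', 'C', 'F', 'B', 'E', 'D'] - depth first
```

Swapping one pop changes the entire exploration pattern; the data structure is the algorithm.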
Buffering: Buffer data between producer and consumer. Producer adds to queue. Consumer removes from queue.
```python
buffer = Queue()

# Producer thread
def producer():
    while producing:
        data = produce()
        buffer.enqueue(data)

# Consumer thread
def consumer():
    while consuming:
        if not buffer.is_empty():
            data = buffer.dequeue()
            process(data)
```
Decouples producer and consumer. They run at their own pace.
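The sketch above is not thread-safe: two threads touching the same plain queue can race. For real threaded code, Python's standard library provides queue.Queue, a thread-safe FIFO whose get() blocks until an item is available. A minimal sketch:

```python
import queue
import threading

buffer = queue.Queue()  # thread-safe FIFO from the standard library
results = []

def producer():
    for i in range(5):
        buffer.put(f"item-{i}")  # blocks only if a maxsize is set and reached

def consumer():
    for _ in range(5):
        item = buffer.get()      # blocks until an item is available
        results.append(item)
        buffer.task_done()

t1 = threading.Thread(target=producer)
t2 = threading.Thread(target=consumer)
t1.start(); t2.start()
t1.join(); t2.join()
print(results)  # ['item-0', 'item-1', 'item-2', 'item-3', 'item-4'] in FIFO order
```

Because get() blocks, the consumer never needs the busy-wait `is_empty()` check from the sketch above.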
Level Order Traversal (Trees)
Traverse a binary tree level by level.
```python
def level_order(root):
    if root is None:
        return
    queue = Queue()
    queue.enqueue(root)
    while not queue.is_empty():
        node = queue.dequeue()
        print(node.value)
        if node.left:
            queue.enqueue(node.left)
        if node.right:
            queue.enqueue(node.right)
```
Process root, enqueue children. Process children, enqueue their children. FIFO gives level order.
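A common variant groups the output by level instead of printing a flat sequence. The trick: at the start of each loop pass, the queue holds exactly one level, so draining `len(queue)` items collects that level before its children arrive. A self-contained sketch (with a minimal `Node` class for the demo):

```python
from collections import deque

class Node:
    def __init__(self, value, left=None, right=None):
        self.value = value
        self.left = left
        self.right = right

def level_order_groups(root):
    """Return node values grouped by depth."""
    if root is None:
        return []
    levels = []
    queue = deque([root])
    while queue:
        level = []
        for _ in range(len(queue)):  # drain exactly one level
            node = queue.popleft()
            level.append(node.value)
            if node.left:
                queue.append(node.left)
            if node.right:
                queue.append(node.right)
        levels.append(level)
    return levels

#       1
#      / \
#     2   3
#    / \
#   4   5
root = Node(1, Node(2, Node(4), Node(5)), Node(3))
print(level_order_groups(root))  # [[1], [2, 3], [4, 5]]
```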
Sliding Window Maximum (Deque)
Find maximum in each sliding window of size k.
Array: [1, 3, -1, -3, 5, 3, 6, 7], k=3
Use a deque to store indices of potential maximums in decreasing order.
```python
from collections import deque

def max_sliding_window(arr, k):
    dq = deque()
    result = []
    for i in range(len(arr)):
        # Remove indices outside window
        while dq and dq[0] < i - k + 1:
            dq.popleft()
        # Remove indices of smaller elements
        while dq and arr[dq[-1]] < arr[i]:
            dq.pop()
        dq.append(i)
        # Add max of current window
        if i >= k - 1:
            result.append(arr[dq[0]])
    return result

print(max_sliding_window([1, 3, -1, -3, 5, 3, 6, 7], 3))
# [3, 3, 5, 5, 6, 7]
```
O(n) overall: each index is appended to and removed from the deque at most once.
When to Use Queues
Order matters: Process items in the order they arrived.
Fairness: First-come, first-served.
Level-by-level processing: BFS, level order traversal.
Buffering: Decouple producer from consumer.
Stacks vs Queues
| Feature | Stack | Queue |
|---|---|---|
| Order | LIFO | FIFO |
| Add | Push to top | Enqueue to back |
| Remove | Pop from top | Dequeue from front |
| Use case | Undo, DFS, backtracking | BFS, scheduling, buffering |
Priority Queue
What if some items are more important? Regular queue doesn't help.
Use a priority queue (a heap). The most important item is dequeued first, regardless of arrival order. Python's heapq is a min-heap, so smaller values come out first: lower number means higher priority.
```python
import heapq

pq = []
heapq.heappush(pq, (2, 'Task B'))  # (priority, task)
heapq.heappush(pq, (1, 'Task A'))
heapq.heappush(pq, (3, 'Task C'))

while pq:
    priority, task = heapq.heappop(pq)
    print(task)
# Output: Task A, Task B, Task C (by priority)
```
Not FIFO. Priority order.
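One wrinkle worth knowing: heapq compares tuples element by element, so two entries with equal priority fall through to comparing the tasks themselves, which may not be comparable at all. A common fix is an insertion counter as a tie-breaker, which also restores FIFO order among equal priorities:

```python
import heapq
import itertools

counter = itertools.count()  # monotonically increasing insertion index
pq = []

def push(priority, task):
    # (priority, count, task): count breaks ties, so equal-priority
    # items come out in insertion (FIFO) order, and heapq never has
    # to compare the task objects themselves.
    heapq.heappush(pq, (priority, next(counter), task))

push(1, 'first urgent')
push(2, 'background')
push(1, 'second urgent')

order = []
while pq:
    _, _, task = heapq.heappop(pq)
    order.append(task)
print(order)  # ['first urgent', 'second urgent', 'background']
```

So a priority queue with a counter is a hybrid: priority order between classes, FIFO within a class.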
Common Mistakes
Using list.pop(0): O(n). Use deque.popleft() instead.
Forgetting to check empty: Always check before dequeue.
```python
# Wrong
item = queue.dequeue()  # Might crash if empty

# Right
if not queue.is_empty():
    item = queue.dequeue()
```
The Beauty
Queues are simple. Enqueue at back, dequeue from front. FIFO.
But they solve real problems:
- Scheduling tasks fairly
- BFS traversal
- Buffering data streams
- Managing resources
Understanding queues means understanding when you need to process items in order.
Master queues, and you'll recognize FIFO patterns everywhere.
