Recursion & Backtracking Fundamentals
Recursion breaks a problem into smaller instances of itself. Backtracking extends recursion by systematically exploring all candidates, abandoning a path as soon as it violates constraints. Together, they solve problems where you must search through an exponential space of possibilities.
How Recursion Works
Every recursive function has two parts: a base case that stops the recursion, and a recursive case that reduces the problem. The call stack stores each function's state until the base case is reached.
```python
def factorial(n: int) -> int:
    """Classic recursion example.

    Base case: n <= 1 returns 1
    Recursive case: n * factorial(n - 1)

    Call stack for factorial(4):
        factorial(4) -> 4 * factorial(3)
            factorial(3) -> 3 * factorial(2)
                factorial(2) -> 2 * factorial(1)
                    factorial(1) -> 1 (base case)
                returns 2 * 1 = 2
            returns 3 * 2 = 6
        returns 4 * 6 = 24

    Time: O(n), Space: O(n) call stack
    """
    if n <= 1:
        return 1
    return n * factorial(n - 1)
```
Recursion vs Iteration
| Aspect | Recursion | Iteration |
|---|---|---|
| State | Stored on call stack implicitly | Stored in variables explicitly |
| Readability | Natural for tree/graph problems | Simpler for linear problems |
| Space | O(depth) stack space | O(1) typically |
| Risk | Stack overflow for deep recursion | No stack overflow risk |
| Python limit | Default 1000 frames (sys.setrecursionlimit()) | No inherent limit |
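To make the table's space row concrete, the same factorial can be written iteratively; a minimal sketch, where the state that recursion would keep on the call stack lives in one explicit variable:

```python
def factorial_iter(n: int) -> int:
    """Iterative factorial: state lives in explicit variables,
    so auxiliary space is O(1) instead of O(n) stack frames."""
    result = 1
    for k in range(2, n + 1):
        result *= k
    return result
```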
The Backtracking Template
Backtracking is a disciplined form of recursion that chooses, explores, and un-chooses at each step. The following template structure fits nearly every backtracking problem:
```python
def backtrack(candidates, path, result, start=0):
    """Universal backtracking template.

    1. Check if current path is a valid solution (base case)
    2. For each candidate starting from 'start':
       a. CHOOSE: add candidate to path
       b. EXPLORE: recurse with updated state
       c. UN-CHOOSE: remove candidate from path (backtrack)

    This template handles:
    - Permutations (no start index, use visited set)
    - Combinations (use start index to avoid duplicates)
    - Subsets (collect path at every node, not just leaves)
    - Constraint satisfaction (add validity check before choosing)

    Note: is_solution and is_valid are problem-specific predicates
    you define for each concrete problem.
    """
    # Base case: found a valid solution
    if is_solution(path):
        result.append(path[:])  # append a copy
        return
    for i in range(start, len(candidates)):
        # Pruning: skip invalid choices early
        if not is_valid(candidates[i], path):
            continue
        # CHOOSE
        path.append(candidates[i])
        # EXPLORE
        backtrack(candidates, path, result, i + 1)  # i + 1 for combinations
        # backtrack(candidates, path, result, i)    # i for reuse allowed
        # backtrack(candidates, path, result, 0)    # 0 for permutations
        # UN-CHOOSE (backtrack)
        path.pop()
```
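As one concrete instantiation of the template, here is a sketch that generates all size-k combinations: `is_solution` becomes a length check, and the start index prevents duplicate selections.

```python
def combinations_k(candidates, k):
    """All size-k combinations via the choose/explore/un-choose template."""
    result = []

    def backtrack(path, start):
        if len(path) == k:          # base case: valid solution
            result.append(path[:])  # append a copy
            return
        for i in range(start, len(candidates)):
            path.append(candidates[i])  # CHOOSE
            backtrack(path, i + 1)      # EXPLORE: start index avoids duplicates
            path.pop()                  # UN-CHOOSE

    backtrack([], 0)
    return result
```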
Visualizing the Decision Tree
For candidates [1, 2, 3] generating subsets, the decision tree looks like this. Each node is the subset built so far; each branch chooses the next element from the remaining candidates.

```
                  []
           /      |     \
        [1]      [2]    [3]
       /    \      |
   [1,2]  [1,3]  [2,3]
     |
  [1,2,3]
```

Subsets collected at EVERY node:
[], [1], [2], [3], [1,2], [1,3], [2,3], [1,2,3]

Total subsets of n elements = 2^n
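The tree above corresponds directly to this subset generator, a sketch that follows the template but appends the path at every node rather than only at leaves:

```python
def subsets(candidates):
    """Generate all subsets: collect the path at EVERY node."""
    result = []

    def backtrack(path, start):
        result.append(path[:])  # every node is a valid subset
        for i in range(start, len(candidates)):
            path.append(candidates[i])  # CHOOSE
            backtrack(path, i + 1)      # EXPLORE
            path.pop()                  # UN-CHOOSE

    backtrack([], 0)
    return result
```

Note the results come out in depth-first order ([], [1], [1,2], ...), a different order than the level-by-level listing above, but the same 2^n subsets.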
Recursion Patterns You Must Know
Linear Recursion
Single recursive call per frame. Examples: factorial, linked list traversal, binary search. Easily converted to iteration.
Tree Recursion
Multiple recursive calls per frame. Examples: Fibonacci, merge sort, tree traversals. Creates an exponential call tree without memoization.
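Tree recursion's exponential blow-up, and how memoization collapses it to linear work, can be sketched with Fibonacci:

```python
from functools import lru_cache

def fib_naive(n: int) -> int:
    """Tree recursion: two calls per frame -> O(2^n) call tree."""
    if n < 2:
        return n
    return fib_naive(n - 1) + fib_naive(n - 2)

@lru_cache(maxsize=None)
def fib_memo(n: int) -> int:
    """Same recursion, but each subproblem is solved once -> O(n) time."""
    if n < 2:
        return n
    return fib_memo(n - 1) + fib_memo(n - 2)
```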
Tail Recursion
Recursive call is the last operation. Can be optimized to O(1) space by compilers (not Python). Rewrite as iteration for Python.
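Since CPython performs no tail-call optimization, the standard move is a mechanical rewrite: the accumulator parameter becomes a loop variable. A minimal sketch:

```python
def sum_tail(nums, acc=0):
    """Tail recursive: the recursive call is the last operation.
    CPython still allocates one stack frame per element."""
    if not nums:
        return acc
    return sum_tail(nums[1:], acc + nums[0])

def sum_loop(nums):
    """Iterative rewrite: the accumulator becomes a loop variable,
    so space drops to O(1)."""
    acc = 0
    for x in nums:
        acc += x
    return acc
```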
Backtracking
Tree recursion with choose/explore/un-choose. Systematically searches all possibilities while pruning invalid branches early.
Complexity Analysis for Backtracking
| Problem Type | Time Complexity | Space Complexity | Why |
|---|---|---|---|
| Subsets | O(2^n) | O(n) | Each element: include or exclude |
| Permutations | O(n!) | O(n) | n choices, then n-1, then n-2... |
| Combinations C(n,k) | O(C(n,k)) | O(k) | Number of ways to choose k from n |
| N-Queens | O(n!) | O(n) | At most n choices per row, decreasing |
| Sudoku | O(9^m) | O(m) | m = empty cells, 9 choices each |
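To make the N-Queens row concrete, here is a minimal counting solver sketch: one queen per row, with columns and both diagonals tracked in sets so attacked squares are pruned before recursing.

```python
def n_queens_count(n: int) -> int:
    """Count N-Queens placements. At most n choices in row 0, fewer in
    each later row as pruning removes attacked squares -> O(n!) bound."""
    count = 0
    cols, diag1, diag2 = set(), set(), set()

    def place(row):
        nonlocal count
        if row == n:              # base case: all rows filled
            count += 1
            return
        for col in range(n):
            if col in cols or (row - col) in diag1 or (row + col) in diag2:
                continue          # prune attacked squares early
            # CHOOSE
            cols.add(col); diag1.add(row - col); diag2.add(row + col)
            place(row + 1)        # EXPLORE
            # UN-CHOOSE
            cols.discard(col); diag1.discard(row - col); diag2.discard(row + col)

    place(0)
    return count
```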
ML & AI Applications
Backtracking and recursive search are foundational to many ML and AI systems:
Hyperparameter Search
Grid search and random search explore a tree of hyperparameter combinations. Samplers such as Tree-structured Parzen Estimators (TPE) in Optuna guide that search with a probabilistic model and prune unpromising trials early.
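Grid search itself is plain recursive enumeration over a tree of parameter choices. A minimal sketch, where the grid contents and the `score` callable are hypothetical placeholders for a real evaluation:

```python
def grid_search(grid, score, combo=None):
    """Recursively enumerate every hyperparameter combination.
    grid: dict mapping parameter name -> list of candidate values.
    score: callable evaluating one full combination (placeholder)."""
    combo = combo or {}
    remaining = [name for name in grid if name not in combo]
    if not remaining:                   # base case: full combination
        return score(combo), dict(combo)
    name = remaining[0]
    best = (float("-inf"), None)
    for value in grid[name]:
        combo[name] = value             # CHOOSE
        best = max(best, grid_search(grid, score, combo),
                   key=lambda t: t[0])  # EXPLORE, keep best score
        del combo[name]                 # UN-CHOOSE
    return best
```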
Neural Architecture Search
NAS explores a search space of network architectures (layers, connections, operations). Early methods used recursive enumeration with performance-based pruning.
Constraint Satisfaction
CSP solvers (used in scheduling, planning, and SAT) use backtracking with arc consistency and constraint propagation to prune the search tree.
Game Tree Search
Minimax with alpha-beta pruning (chess, Go) is backtracking on game states. Monte Carlo Tree Search (AlphaGo) combines backtracking with random sampling.