Big O Analysis Master Sheet
Complete reference for asymptotic notation, complexity analysis techniques, and time/space complexity bounds for algorithms and data structures.
Master asymptotic notation and complexity analysis with this comprehensive reference guide.
Quick Reference Table
| Notation | Name | Definition | Example |
|---|---|---|---|
| Big O | Upper bound | f(n) ≤ c · g(n) for all n ≥ n₀ | 5n² + 3n = O(n²) |
| Big Omega (Ω) | Lower bound | f(n) ≥ c · g(n) for all n ≥ n₀ | 5n² + 3n = Ω(n²) |
| Big Theta (Θ) | Tight bound | c₁ · g(n) ≤ f(n) ≤ c₂ · g(n) for all n ≥ n₀ | 5n² + 3n = Θ(n²) |
| Little o | Strict upper bound | f(n) < c · g(n) for every c > 0, for all n ≥ n₀ | n = o(n²) |
| Little omega (ω) | Strict lower bound | f(n) > c · g(n) for every c > 0, for all n ≥ n₀ | n² = ω(n) |
Common Complexity Classes
Time Complexities (Best to Worst)
O(1) - Constant
O(log n) - Logarithmic
O(n) - Linear
O(n log n) - Linearithmic
O(n²) - Quadratic
O(n³) - Cubic
O(2^n) - Exponential
O(n!) - Factorial
Growth Rate Comparison
For large n, the following inequality holds:

O(1) < O(log n) < O(n) < O(n log n) < O(n²) < O(n³) < O(2^n) < O(n!)
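To make the ordering concrete, a small illustrative script (not part of the original sheet) can tabulate a few of these functions at increasing n:

```python
import math

# Illustrative only: compare growth rates at a few input sizes
for n in [10, 100, 1000]:
    digits_2n = int(n * math.log10(2)) + 1   # number of decimal digits of 2^n
    print(f"n={n:>4}  log n={math.log2(n):5.1f}  "
          f"n log n={n * math.log2(n):9.0f}  "
          f"n²={n**2:<8}  2^n: {digits_2n} digits")
```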
Big O Notation Deep Dive
Definition
f(n) = O(g(n)) if there exist positive constants c and n₀ such that:

f(n) ≤ c · g(n) for all n ≥ n₀
Key Properties
- Transitivity: If f(n) = O(g(n)) and g(n) = O(h(n)), then f(n) = O(h(n))
- Scaling: c · f(n) = O(f(n)) for any constant c > 0
- Addition: O(f(n)) + O(g(n)) = O(max(f(n), g(n)))
- Multiplication: O(f(n)) · O(g(n)) = O(f(n) · g(n))
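For example, combining the scaling and addition rules: 3n² + 5n + 2 = O(n²) + O(n) + O(1) = O(n²).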
Algorithm Analysis Techniques
1. Counting Operations
```python
def linear_search(arr, target):
    for i in range(len(arr)):   # O(n) iterations
        if arr[i] == target:    # O(1) comparison
            return i            # O(1) return
    return -1                   # O(1) return

# Total: O(n)
```
2. Recurrence Relations
Master Theorem: For recurrences of the form T(n) = a · T(n/b) + f(n), with a ≥ 1 and b > 1:

| Case | Condition | Solution |
|---|---|---|
| 1 | f(n) = O(n^(log_b a − ε)) for some ε > 0 | T(n) = Θ(n^(log_b a)) |
| 2 | f(n) = Θ(n^(log_b a)) | T(n) = Θ(n^(log_b a) · log n) |
| 3 | f(n) = Ω(n^(log_b a + ε)) for some ε > 0, with a · f(n/b) ≤ c · f(n) for some c < 1 | T(n) = Θ(f(n)) |
Example: Merge Sort
- T(n) = 2T(n/2) + O(n), so a = 2, b = 2, and n^(log₂ 2) = n
- f(n) = Θ(n) = Θ(n^(log₂ 2)), so Case 2 applies: T(n) = Θ(n log n)
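For reference, a minimal merge sort sketch (helper names are illustrative) that realizes this recurrence: two recursive calls on halves plus a linear-time merge.

```python
def merge_sort(arr):
    if len(arr) <= 1:               # base case: O(1)
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])    # T(n/2)
    right = merge_sort(arr[mid:])   # T(n/2)
    return merge(left, right)       # O(n) merge

def merge(left, right):
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):   # each element visited once
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])
    out.extend(right[j:])
    return out
```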
3. Loop Analysis
Nested Loops:
```python
for i in range(n):       # O(n)
    for j in range(n):   # O(n) for each i
        pass             # O(1) operation

# Total: O(n²)
```
Dependent Loops:
```python
for i in range(n):       # O(n)
    for j in range(i):   # O(i) for each i
        pass             # O(1) operation

# Total: O(1 + 2 + ... + n) = O(n²)
```
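Why the dependent version is still quadratic: the inner body executes 0 + 1 + ... + (n - 1) = n(n - 1)/2 times in total, and n(n - 1)/2 grows like n²/2, which is O(n²) after dropping the constant.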
Data Structure Complexities
Arrays and Lists
| Operation | Array | Dynamic Array | Linked List |
|---|---|---|---|
| Access | O(1) | O(1) | O(n) |
| Search | O(n) | O(n) | O(n) |
| Insertion | O(n) | O(1) amortized (append) | O(1) (given the node) |
| Deletion | O(n) | O(n) | O(1) (given the node) |
Trees
| Tree Type | Search | Insert | Delete | Notes |
|---|---|---|---|---|
| Binary Search Tree | O(log n) | O(log n) | O(log n) | O(n) can be worst case (unbalanced) |
| AVL Tree | O(log n) | O(log n) | O(log n) | Self-balancing |
| Red-Black Tree | O(log n) | O(log n) | O(log n) | Self-balancing |
| B-Tree | O(log n) | O(log n) | O(log n) | Optimized for disk I/O |
Hash Tables
| Operation | Average | Worst Case |
|---|---|---|
| Search | O(1) | O(n) |
| Insert | O(1) | O(n) |
| Delete | O(1) | O(n) |
Load Factor: α = n/m, where n = number of elements and m = table size
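For example, storing n = 15 elements in a table with m = 20 slots gives α = 15/20 = 0.75; many implementations resize once α crosses a fixed threshold so the average-case O(1) bounds continue to hold.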
Sorting Algorithm Complexities
| Algorithm | Best Case | Average Case | Worst Case | Space | Stable |
|---|---|---|---|---|---|
| Bubble Sort | O(n) | O(n²) | O(n²) | O(1) | Yes |
| Selection Sort | O(n²) | O(n²) | O(n²) | O(1) | No |
| Insertion Sort | O(n) | O(n²) | O(n²) | O(1) | Yes |
| Merge Sort | O(n log n) | O(n log n) | O(n log n) | O(n) | Yes |
| Quick Sort | O(n log n) | O(n log n) | O(n²) | O(log n) | No |
| Heap Sort | O(n log n) | O(n log n) | O(n log n) | O(1) | No |
Graph Algorithm Complexities
| Algorithm | Time Complexity | Space Complexity | Use Case |
|---|---|---|---|
| BFS | O(V + E) | O(V) | Shortest path (unweighted) |
| DFS | O(V + E) | O(V) | Topological sort, cycles |
| Dijkstra (binary heap) | O((V + E) log V) | O(V) | Shortest path (weighted) |
| Floyd-Warshall | O(V³) | O(V²) | All-pairs shortest path |
| Kruskal's MST | O(E log E) | O(V) | Minimum spanning tree |
| Prim's MST | O(E log V) | O(V) | Minimum spanning tree |
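As one concrete instance of the O(V + E) bound, here is a minimal BFS sketch; the dict-of-lists graph layout and names are illustrative assumptions, not from the original sheet:

```python
from collections import deque

def bfs(graph, start):
    """graph: dict mapping each vertex to a list of its neighbors."""
    dist = {start: 0}          # O(V) space in the worst case
    queue = deque([start])
    while queue:               # each vertex is dequeued at most once: O(V)
        u = queue.popleft()
        for v in graph[u]:     # each edge is scanned O(1) times: O(E) total
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

# Total: O(V + E) time, O(V) space
```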
Space Complexity Analysis
Stack Space in Recursion
```python
def factorial(n):
    if n <= 1:
        return 1
    return n * factorial(n - 1)

# Space: O(n) due to call stack
```
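For contrast, a minimal iterative sketch (illustrative, not from the original) does the same work with no call-stack growth:

```python
def factorial_iterative(n):
    result = 1
    for i in range(2, n + 1):   # O(n) time, no recursive calls
        result *= i
    return result

# Space: O(1) auxiliary (ignoring the size of the integer result)
```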
Auxiliary Space vs Total Space
- Auxiliary Space: extra space the algorithm uses beyond its input
- Total Space: auxiliary space plus the space occupied by the input itself
Common Space Patterns
- In-place algorithms: O(1) extra space
- Recursive algorithms: O(d), where d is the recursion depth
- Dynamic programming: often O(n) or O(n²) for memoization (see the sketch below)
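As a sketch of the memoization pattern (the function and cache choice are illustrative, not from the original):

```python
from functools import lru_cache

@lru_cache(maxsize=None)    # memo table grows to O(n) entries
def fib(n):
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

# Time: O(n), since each subproblem is computed once
# Space: O(n) for the cache, plus O(n) call-stack depth
```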
Analysis Cheat Sheet
Quick Rules
- Drop constants: O(2n) = O(n)
- Drop lower-order terms: O(n² + n) = O(n²)
- Different inputs use different variables: O(a + b), not O(n) (see the sketch after this list)
- Nested loops multiply: O(n · m)
- Consecutive loops add: O(n + m)
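A quick sketch of the last three rules (function and variable names are illustrative):

```python
def consecutive(a, b):
    for x in a:          # O(len(a))
        pass
    for y in b:          # O(len(b))
        pass
    # Total: O(a + b); the sizes are independent, so keep separate variables

def nested(a, b):
    for x in a:          # O(len(a)) iterations
        for y in b:      # O(len(b)) per outer iteration
            pass
    # Total: O(a · b)
```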
Common Pitfalls
- Confusing average vs worst case
- Ignoring space complexity
- Not considering different input sizes
- Assuming best-case scenarios
Interview Tips
- Always clarify: What are we optimizing for? Time or space?
- State assumptions: Input size, data distribution, constraints
- Analyze all cases: Best, average, worst
- Consider trade-offs: Time vs space complexity
- Verify with examples: Test your analysis with small inputs
Practical Examples
Example 1: Two Sum
```python
def two_sum_naive(nums, target):
    for i in range(len(nums)):              # O(n)
        for j in range(i + 1, len(nums)):   # O(n) for each i
            if nums[i] + nums[j] == target:
                return [i, j]
    return []

# Time: O(n²), Space: O(1)
```

```python
def two_sum_optimal(nums, target):
    seen = {}                          # O(n) space
    for i, num in enumerate(nums):     # O(n) time
        complement = target - num
        if complement in seen:         # O(1) average
            return [seen[complement], i]
        seen[num] = i                  # O(1) average
    return []

# Time: O(n), Space: O(n)
```
Example 2: Finding Duplicates
```python
def has_duplicate_sort(nums):
    nums.sort()                     # O(n log n); sorts in place, mutating the input
    for i in range(1, len(nums)):   # O(n)
        if nums[i] == nums[i - 1]:
            return True
    return False

# Time: O(n log n), Space: O(1) auxiliary
```

```python
def has_duplicate_set(nums):
    seen = set()          # up to O(n) space
    for num in nums:      # O(n) time
        if num in seen:   # O(1) average
            return True
        seen.add(num)     # O(1) average
    return False

# Time: O(n), Space: O(n)
```
Mathematical Foundations
Logarithm Properties
- In CS, log n usually means log₂ n, the binary logarithm
- Change of base: log_a n = log_b n / log_b a, so the base is only a constant factor and doesn't matter for Big O
Summation Formulas
- 1 + 2 + ... + n = n(n + 1)/2 = O(n²)
- 1² + 2² + ... + n² = n(n + 1)(2n + 1)/6 = O(n³)
- 1 + 2 + 4 + ... + 2^n = 2^(n+1) - 1 = O(2^n)
Quick Reference Summary
Remember: Big O analysis helps us understand how algorithms scale. Focus on the growth rate for large inputs, not exact operation counts.
Golden Rule: When in doubt, trace through your algorithm with increasing input sizes and observe the pattern!