Big O notation is a way to describe how fast or slow an algorithm grows as the input gets larger.
It doesn’t measure time in seconds — instead, it measures how the number of operations increases as data grows.
Big O helps you answer questions like:
💡 Big O is about patterns, not about measuring exact speed.
⚠️ You do not need math skills to understand Big O — just logic.
Real-world programs often handle huge amounts of data:
Inefficient algorithms become slow as data grows.
Big O helps developers:
These patterns appear all the time in software development:
| Big O | Name | Meaning |
|---|---|---|
| O(1) | Constant | Same time, no matter how big input is |
| O(n) | Linear | Time grows in direct proportion to input |
| O(n²) | Quadratic | Time grows with the square of the input → nested loops |
| O(log n) | Logarithmic | Very efficient; input size shrinks each step |
| O(n log n) | Linearithmic | Fast sorting algorithms |
Let’s break these down.
The algorithm does the same amount of work no matter the input size.
```javascript
const first = array[0]; // Always one operation.
```
It doesn’t matter if the array has 5 items or 5 million — accessing an index is always O(1).
| Input size | 1 | 10 | 1000 | 1,000,000 |
|---|---|---|---|---|
| Work done | 1 | 1 | 1 | 1 |
💡 O(1) is the fastest time complexity.
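To make this concrete, here is a small sketch (the `scores` map and the function name are made-up examples): index access, `.length`, and `Map.get` all take the same few steps whether the collection holds 5 items or 5 million.

```javascript
// A few common O(1) operations - each takes the same number of
// steps no matter how much data the collection holds.
// (Map.get is average-case O(1) for hash-based lookups.)
const scores = new Map([["alice", 90], ["bob", 75]]);

function constantTimeExamples(array) {
  const first = array[0];               // index access: O(1)
  const last = array[array.length - 1]; // length lookup + index: O(1)
  const bobScore = scores.get("bob");   // hash lookup: O(1) on average
  return { first, last, bobScore };
}
```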
Work grows directly with input size — usually caused by a single loop.
```javascript
for (let i = 0; i < array.length; i++) {
  console.log(array[i]);
}
```
If the array doubles in size, the work doubles.
| Input size | 1 | 10 | 100 | 1000 |
|---|---|---|---|---|
| Work done | 1 | 10 | 100 | 1000 |
Examples of O(n):
💡 If you loop through the whole array once → O(n).
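A typical O(n) task is finding the largest value: there is no way to know the maximum without looking at every element once. Here is a sketch (`findMax` is an illustrative name):

```javascript
// Finding the largest value visits each element exactly once → O(n).
function findMax(array) {
  let max = array[0];
  for (let i = 1; i < array.length; i++) {
    if (array[i] > max) max = array[i]; // one comparison per element
  }
  return max;
}
```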
Usually caused by nested loops.
```javascript
for (let i = 0; i < array.length; i++) {
  for (let j = 0; j < array.length; j++) {
    console.log(i, j);
  }
}
```
If the input doubles, the work becomes four times larger.
| Input size | Operations |
|---|---|
| 10 | 100 |
| 20 | 400 |
| 40 | 1600 |
Examples:
⚠️ O(n²) becomes slow very quickly. Avoid if possible.
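As one illustration of why avoiding O(n²) matters, here is a duplicate check written two ways — a nested-loop O(n²) version and an O(n) rewrite using a `Set`. Both function names are made up for this sketch:

```javascript
// Checking every pair: the inner loop runs for each outer
// iteration, so total work is roughly n * n → O(n²).
function hasDuplicateQuadratic(array) {
  for (let i = 0; i < array.length; i++) {
    for (let j = i + 1; j < array.length; j++) {
      if (array[i] === array[j]) return true;
    }
  }
  return false;
}

// The same check with a Set visits each element once → O(n).
function hasDuplicateLinear(array) {
  const seen = new Set();
  for (const value of array) {
    if (seen.has(value)) return true;
    seen.add(value);
  }
  return false;
}
```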
One of the fastest patterns.
The algorithm reduces the problem size dramatically at each step.
Example: Binary Search
Each step cuts the remaining data in half.
100 → 50 → 25 → 12 → 6 → 3 → 1
JavaScript example (simplified):
```javascript
function binarySearch(array, target) {
  let start = 0;
  let end = array.length - 1;
  while (start <= end) {
    const middle = Math.floor((start + end) / 2);
    if (target === array[middle]) return middle; // found it
    if (target < array[middle]) {
      end = middle - 1;   // discard the upper half
    } else {
      start = middle + 1; // discard the lower half
    }
  }
  return -1; // not found (the array must be sorted)
}
```
💡 O(log n) algorithms are extremely efficient and scale well.
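You can see the logarithmic pattern directly by counting how many halvings it takes to shrink n down to 1 (`halvingSteps` is a toy helper for this sketch):

```javascript
// How many times can n be halved before reaching 1?
// That count is roughly log2(n) - the step count of binary search.
function halvingSteps(n) {
  let steps = 0;
  while (n > 1) {
    n = Math.floor(n / 2); // each step discards half the data
    steps++;
  }
  return steps;
}
// halvingSteps(100) is 6, matching 100 → 50 → 25 → 12 → 6 → 3 → 1.
// halvingSteps(1000000) is only 19 - a million items, ~20 steps.
```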
Common in efficient sorting algorithms, such as JavaScript's built-in `.sort()`. These split the data (log n levels) and process each piece (n work per level).
You don’t need to fully understand sorting yet — just remember:
💡 O(n log n) is faster than O(n²) and used in most practical sorting.
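If you want to peek ahead anyway, here is a minimal merge sort sketch — not the exact algorithm behind `.sort()` (engines vary), just a standard example of the split-then-merge pattern:

```javascript
// Merge sort: split in half (log n levels of recursion),
// then merge n elements at each level → O(n log n) overall.
function mergeSort(array) {
  if (array.length <= 1) return array; // base case: already sorted
  const middle = Math.floor(array.length / 2);
  const left = mergeSort(array.slice(0, middle));
  const right = mergeSort(array.slice(middle));
  // Merge the two sorted halves in linear time.
  const merged = [];
  let i = 0, j = 0;
  while (i < left.length && j < right.length) {
    merged.push(left[i] <= right[j] ? left[i++] : right[j++]);
  }
  return merged.concat(left.slice(i), right.slice(j));
}
```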
Fastest → Slowest

```
O(1)        █
O(log n)    ██
O(n)        ███████
O(n log n)  ████████████
O(n²)       ████████████████████████████
```
```javascript
for (...) {}             // → O(n)

for (...) {
  for (...) {}
}                        // → O(n²)

arr[999];                // → O(1)

for (...) doSmallWork(); // → O(n) (assuming the function is O(1))

arr.sort();              // O(n log n)
for (...) {}             // O(n)
// Total → O(n log n) (the dominant term)
```
💡 Big O only cares about the dominant term, not small details.
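For example, a function that sorts and then loops once is still described as O(n log n), because the sort dominates as n grows (`sortedTotal` is an illustrative name for this sketch):

```javascript
// O(n log n) sort + O(n) loop → O(n log n) overall: keep the dominant term.
function sortedTotal(values) {
  const sorted = [...values].sort((a, b) => a - b); // O(n log n)
  let total = 0;
  for (const v of sorted) total += v;               // O(n)
  return total;
}
```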