Big O Notation
/bɪɡ əʊ nəʊˈteɪʃən/
Definition
A mathematical notation describing how an algorithm's running time or memory use grows as the input size grows, ignoring constant factors and lower-order terms.
Example in context
"Nested loops are O(n²): for 1,000 items, that's 1,000,000 operations. Binary search is O(log n): about 10 steps for the same 1,000 items."
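The contrast above can be checked by counting operations directly. This is a minimal sketch (the function names `pair_comparisons` and `binary_search_steps` are illustrative, not from the original): one counter for a nested loop over n items, and one for the worst-case number of probes a binary search makes over n sorted items.

```python
def pair_comparisons(n):
    """Count iterations of a nested loop over n items: O(n^2)."""
    count = 0
    for _ in range(n):
        for _ in range(n):
            count += 1
    return count

def binary_search_steps(n):
    """Count worst-case probes of a binary search over n sorted items: O(log n)."""
    steps = 0
    low, high = 0, n - 1
    while low <= high:
        steps += 1
        mid = (low + high) // 2
        low = mid + 1  # always narrow to the upper half (a worst-case walk)
    return steps

print(pair_comparisons(1_000))     # 1000000 -- matches the quote above
print(binary_search_steps(1_000))  # 10 -- since 2^10 = 1024 >= 1000
```

Doubling n quadruples `pair_comparisons` but adds only one step to `binary_search_steps`, which is the growth-rate difference Big O captures.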
Practice this term
Master Big O Notation in context by working through exercises in the Data Structures & Algorithms module. You'll see the term used in real engineering scenarios with multiple-choice, fill-in-the-blank, and matching drills.