Time Complexity Basics
Time complexity describes how an algorithm's runtime grows as its input size increases. We use Big O notation to classify algorithms by their worst-case scaling behavior.

O(1) - Constant time: runtime stays the same regardless of input size.
O(n) - Linear time: runtime grows in proportion to input size.
O(n²) - Quadratic time: runtime grows with the square of input size.

The difference between these classes becomes dramatic as input size grows: doubling the input leaves an O(1) operation unchanged, doubles an O(n) one, and quadruples an O(n²) one. These numbers have real consequences when you run code on large datasets.
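As a minimal sketch of the three classes above (function names are illustrative, not from any particular library):

```python
def constant_lookup(items):
    # O(1): one index access, no matter how long the list is
    return items[0]

def linear_sum(items):
    # O(n): visits each element exactly once
    total = 0
    for x in items:
        total += x
    return total

def quadratic_pairs(items):
    # O(n^2): pairs every element with every element, n * n steps
    pairs = []
    for a in items:
        for b in items:
            pairs.append((a, b))
    return pairs

data = list(range(1000))
print(len(data))                   # 1000 items
print(len(quadratic_pairs(data)))  # 1000000 pairs: the quadratic blow-up
```

Running the quadratic function on 1,000 items already produces a million steps; on a million items it would be a trillion, which is where the scaling classes stop being abstract.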