
Big O Notation
Big O notation is a way to describe how the time or space an algorithm needs grows as the input size grows. It gives a high-level view of an algorithm’s efficiency by expressing its worst-case cost as a function of the input size n (for example, n = 10, 100, or 1,000). An algorithm with O(n) time complexity grows linearly with input size, while one with O(n²) grows quadratically: doubling the input roughly doubles the work in the first case but quadruples it in the second. This makes it possible to compare algorithms and choose the most efficient one for large datasets.
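As an illustration, here is a minimal Python sketch contrasting linear and quadratic growth. The function names (contains, has_duplicate) and the sample data are hypothetical, chosen only to show how the number of basic operations scales with input size.

```python
def contains(items, target):
    # O(n): in the worst case, every element is examined once.
    for item in items:
        if item == target:
            return True
    return False

def has_duplicate(items):
    # O(n^2): the nested loops compare every pair of elements,
    # so the work grows quadratically as the list gets longer.
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

if __name__ == "__main__":
    data = list(range(1_000))
    print(contains(data, 999))   # scans at most 1,000 elements
    print(has_duplicate(data))   # makes up to ~500,000 comparisons
```

If the list grows from 1,000 to 10,000 elements, contains does at most 10x more work, while has_duplicate does roughly 100x more, which is exactly the difference between O(n) and O(n²).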