
Landau notation

Landau notation, best known in the form of Big O notation, is a way to describe how the running time or space requirements of an algorithm grow as the input size increases. It provides a high-level picture of an algorithm's efficiency by focusing on the dominant growth term rather than exact operation counts. For example, if an algorithm's running time grows proportionally to the input size n, we say it is O(n); if it grows with the square of the input size, it is O(n²). This makes it possible to compare algorithms and predict performance on large inputs, emphasizing the factors that matter most as the problem scales while ignoring constant factors and lower-order terms.
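To make the idea concrete, here is a minimal Python sketch, with illustrative function names not taken from the text above, contrasting an O(n²) and an O(n) solution to the same problem:

```python
# Hypothetical example: two ways to check whether a list contains duplicates.

def has_duplicates_quadratic(items):
    """O(n^2): compares every pair of elements."""
    n = len(items)
    for i in range(n):
        for j in range(i + 1, n):
            if items[i] == items[j]:
                return True
    return False

def has_duplicates_linear(items):
    """O(n): a single pass, using a set for average constant-time lookups."""
    seen = set()
    for item in items:
        if item in seen:
            return True
        seen.add(item)
    return False
```

Doubling the input size roughly doubles the work done by the O(n) version but roughly quadruples it for the O(n²) version, which is exactly the kind of scaling behavior Landau notation is designed to capture.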