Big-O Notation

Time Complexity is a measure of the amount of time taken by an algorithm to run as a function of the length of the input. It is the core mechanism used in Data Structures & Algorithms to analyze and compare the efficiency and scalability of different solutions.

Crucially, Time Complexity does not measure wall-clock execution time (which depends on many variables such as hardware, programming language, and OS). Instead, it counts the number of operations the algorithm performs, and how that count scales with the amount of data.
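As a quick illustration, the sketch below (hypothetical instrumentation, not from the source) counts comparisons in a linear search. The comparison count grows with n on every machine, even though the wall-clock time for the same code differs between machines.

#include <stddef.h>

size_t comparisons = 0; /* operation counter: machine-independent */

/* Linear search that tallies how many comparisons it performs.
 * The tally depends only on the input, never on the hardware. */
int linear_search(const int arr[], size_t n, int target) {
    comparisons = 0;
    for (size_t i = 0; i < n; i++) {
        comparisons++;          /* one comparison per element visited */
        if (arr[i] == target)
            return (int)i;      /* found: return its index */
    }
    return -1;                  /* not found after n comparisons */
}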


Asymptotic Notations

Asymptotic notation describes the limiting behavior of a function as its argument tends towards a particular value (usually infinity). We use it to describe how an algorithm’s running time scales with the size of the input (n). The formal definitions below rely on Universal and Existential Quantifiers (“for all”, “there exists”) to establish these bounds.

Big-O (O): “Upper Bound” (The Ceiling) Big-O describes the maximum growth rate. It guarantees the algorithm will never be slower than this.

  • Note: Usually used for the Worst Case because we care about the safety limit.

Omega (Ω): “Lower Bound” (The Floor) Omega describes the minimum growth rate. It guarantees the algorithm will never be faster than this.

  • Note: Usually used for the Best Case.

Theta (Θ): “Tight Bound” (The Exact Fit) Theta is the most precise. It is used when the Upper Bound (O) and Lower Bound (Ω) are the same.

  • It is never faster, never slower. So it is exactly Θ(g(n)).
Notation | Name | Formal Definition | What it Represents
O (Big-Oh) | Upper Bound | f(n) ≤ c·g(n) for all n ≥ n₀ | Worst-case or maximum time required. E.g., a function that is O(g(n)) is no worse than g(n).
Ω (Big-Omega) | Lower Bound | f(n) ≥ c·g(n) for all n ≥ n₀ | Best-case or minimum time required. E.g., a function that is Ω(g(n)) is no better than g(n).
Θ (Big-Theta) | Tight Bound | c₁·g(n) ≤ f(n) ≤ c₂·g(n) for all n ≥ n₀ | Average-case running time. The algorithm is exactly proportional to g(n).
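Written out in full with the quantifiers mentioned above, the standard formal definitions are:

\[
\begin{aligned}
f(n) \in O(g(n)) &\iff \exists\, c > 0\ \exists\, n_0\ \forall n \ge n_0:\ f(n) \le c \cdot g(n) \\
f(n) \in \Omega(g(n)) &\iff \exists\, c > 0\ \exists\, n_0\ \forall n \ge n_0:\ f(n) \ge c \cdot g(n) \\
f(n) \in \Theta(g(n)) &\iff \exists\, c_1, c_2 > 0\ \exists\, n_0\ \forall n \ge n_0:\ c_1 \cdot g(n) \le f(n) \le c_2 \cdot g(n)
\end{aligned}
\]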

Common Complexities (Scalability Hierarchy)

The efficiency of algorithms is ranked by their growth rate. When n is large, slower-growing functions are preferred.

Name | Complexity | Description
Constant | O(1) | Time is independent of the input size (n). E.g., array index lookup, insertion at the head or tail of a Linked List.
Logarithmic | O(log n) | Time grows by a constant amount each time the input size doubles. E.g., Binary Search in a sorted array, searching a balanced Binary Search Tree.
Linear | O(n) | Time is directly proportional to the input size (n). E.g., linear search, traversal of a Linked List.
Linearithmic | O(n log n) | Time grows slightly faster than linear. E.g., efficient sorting algorithms like Merge Sort and (on average) Quick Sort.
Quadratic | O(n²) | Time is proportional to the square of the input size. E.g., inefficient sorting like Bubble Sort, nested loops iterating over an array.
Exponential | O(2ⁿ) | Time doubles for every single element added to the input. E.g., naive recursive Fibonacci; brute-force Traveling Salesperson is even worse, at O(n!).
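To make the logarithmic row concrete, here is a minimal sketch of iterative binary search in C (an illustration, not code from the source). Each iteration discards half of the remaining range, so the loop runs at most about log₂(n) times.

#include <stddef.h>

/* Iterative binary search over a sorted array using a half-open
 * range [lo, hi). Halving the range each iteration gives O(log n). */
int binary_search(const int arr[], size_t n, int target) {
    size_t lo = 0, hi = n;
    while (lo < hi) {
        size_t mid = lo + (hi - lo) / 2; /* avoids overflow of lo + hi */
        if (arr[mid] == target)
            return (int)mid;             /* found */
        else if (arr[mid] < target)
            lo = mid + 1;                /* discard the left half */
        else
            hi = mid;                    /* discard the right half */
    }
    return -1;                           /* target not present */
}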

Examples: Analyzing Code

To determine the complexity, we ignore constants and lower-order terms because we are only concerned with the dominant term as n approaches infinity.
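For instance, take a hypothetical running time T(n) = 3n² + 5n + 10 (an illustrative function, not one from the source):

\[
T(n) = 3n^2 + 5n + 10 \le 3n^2 + 5n^2 + 10n^2 = 18n^2 \quad \text{for all } n \ge 1,
\]

so the definition of Big-O is satisfied with c = 18 and n₀ = 1, and T(n) = O(n²): the 5n and 10 terms are dominated by the n² term.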

Example 1: Constant Time

int get_element(int arr[], int index) {
    return arr[index]; // Single array access
}
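However large the array, this function performs a single memory access, so its running time is constant: O(1).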