Duration : 00:06:00

Big O notation is important for a variety of reasons:

- Evaluates the efficiency of algorithms.
- Describes the amount of time or space required as the input size grows.
- Allows algorithms to be compared objectively.
- Helps in understanding scalability and predicting performance as the input size increases.
- Guides code optimization to improve overall performance.

The following are the processes to determine Big O notation:

- Define the dominant term.
- Identify the order of growth.
- Express it in Big O notation.
- Simplify the notation (optional).
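
The steps above can be sketched in Python. The function `sum_and_pairs` below is a hypothetical example (not from the lesson), annotated with the reasoning for each step:

```python
def sum_and_pairs(arr):
    total = 0
    for x in arr:            # single loop: n operations
        total += x
    pairs = 0
    for x in arr:            # nested loops: n * n operations
        for y in arr:
            pairs += 1
    return total, pairs

# Step 1: total cost is n + n^2; the dominant term is n^2.
# Step 2: the order of growth is quadratic.
# Step 3: expressed in Big O notation, this is O(n^2).
# Step 4: simplify by dropping constants and lower-order terms (n + n^2 -> n^2).
```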

- Asymptotic analysis is a technique for analyzing how an algorithm behaves or performs as the input size changes.
- For example, when the input array is in reverse order, bubble sort takes the maximum amount of time (quadratic) to sort the items; this is the worst case.
- However, when the input array is already sorted, the algorithm's time complexity is linear; this is the best case.
- When the input array is neither sorted nor reversed, it takes an intermediate amount of time, known as the average case.
- These durations are represented by asymptotic notations.
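
A minimal bubble sort sketch makes the best and worst cases above concrete. The early-exit flag is what gives the linear best case on already-sorted input:

```python
def bubble_sort(arr):
    """Bubble sort: worst case O(n^2) on reversed input,
    best case O(n) on sorted input thanks to the early exit."""
    a = list(arr)
    n = len(a)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        if not swapped:   # no swaps: already sorted, stop after one pass
            break
    return a
```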

There are three basic asymptotic notations for analyzing algorithm complexity. They are as follows.

- Big-O notation
- Omega notation
- Theta notation

Big O notation represents the upper bound of an algorithm's running time or the largest amount of time it takes to perform its operations. As a result, it provides a worst-case complexity for an algorithm.

The omega notation (Ω-notation) represents the lower bound of an algorithm's running time. Thus, it provides an algorithm's best-case complexity: how fast an algorithm can run.

Theta notation (Θ-notation) represents the upper and lower bounds of an algorithm's runtime together. On real-world inputs, an algorithm rarely performs at its worst or best; its runtime varies between the two, giving the algorithm's average-case complexity.

- O(1) – Constant
- O(log n) – Logarithmic
- O(n) – Linear
- O(n log n) – Linearithmic
- O(n^c) – Polynomial, e.g. quadratic (n²), cubic (n³)
- O(c^n) – Exponential, e.g. 2^n
- O(n!) – Factorial

**O(1) – Constant**: The algorithm always takes the same amount of time to execute. The complexity remains constant regardless of the input size, for example, accessing an element of an array at a specific index. This executes once and takes constant time.
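
A one-line sketch of the array-index example (the function name is illustrative):

```python
def first_element(arr):
    # A single index operation, regardless of the array's length: O(1).
    return arr[0]
```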

**O(log n) – Logarithmic**: The complexity grows logarithmically: doubling the input size adds only a constant amount of work. As the input size increases, O(log n) outperforms O(n) and O(n²).
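
Binary search is the classic O(log n) example (it is also listed later in these notes). Each iteration halves the remaining search range:

```python
def binary_search(sorted_arr, target):
    """Each step halves the range, so at most about log2(n) steps: O(log n)."""
    lo, hi = 0, len(sorted_arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_arr[mid] == target:
            return mid
        elif sorted_arr[mid] < target:
            lo = mid + 1      # target is in the upper half
        else:
            hi = mid - 1      # target is in the lower half
    return -1                 # not found
```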

**O(n) – Linear**: The time required is directly proportional to the input size n. Complexity increases linearly as the number of inputs increases: the more inputs, the greater the running time.
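
Linear search illustrates O(n): in the worst case every element is inspected once:

```python
def linear_search(arr, target):
    # Worst case: every element is checked exactly once -> O(n).
    for i, value in enumerate(arr):
        if value == target:
            return i
    return -1
```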

**O(n log n) – Linearithmic**: The time required grows faster than linear but slower than quadratic. It typically arises from divide-and-conquer algorithms that split the input log n times and do linear work per level.
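
Merge sort, listed later in these notes as a linearithmic algorithm, shows the pattern: log n levels of splitting, with O(n) merging work per level:

```python
def merge_sort(arr):
    """log n levels of recursion, O(n) merge work per level: O(n log n)."""
    if len(arr) <= 1:
        return list(arr)
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    # Merge the two sorted halves in linear time.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```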

**O(n^c) – Polynomial**: The time required to run the algorithm is proportional to a fixed power c of the input size n. For c = 2 it is the square of the input size, i.e. quadratic.
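A quadratic example: comparing every pair of elements with nested loops (the function is illustrative, not from the lesson):

```python
def has_duplicate(arr):
    # Nested loops compare every pair: on the order of n * n steps -> O(n^2).
    for i in range(len(arr)):
        for j in range(i + 1, len(arr)):
            if arr[i] == arr[j]:
                return True
    return False
```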

**O(c^n) – Exponential**: The time required doubles (for c = 2) with each addition to the input size. These are usually recursive algorithms that solve a problem of size n by recursively solving two subproblems of size n − 1.
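
Tower of Hanoi (listed later in these notes as exponential) fits this description exactly: moving n disks requires solving two size n − 1 subproblems. Counting the moves shows the doubling:

```python
def hanoi_moves(n):
    """Moves needed for n disks: two size n-1 subproblems plus one move.
    The count doubles with each disk added -> O(2^n). Closed form: 2^n - 1."""
    if n == 0:
        return 0
    return 2 * hanoi_moves(n - 1) + 1
```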

**O(n!) – Factorial**: The runtime grows factorially with the number of inputs. For example, 5! is 120, and 10! is 3,628,800. Such algorithms grow extremely quickly and have the worst time complexity.
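
Generating all orderings of a collection is a natural O(n!) example, since n items have n! permutations (the helper name is illustrative):

```python
from itertools import permutations

def all_orderings(items):
    # n items have n! orderings, so building them all is O(n!).
    return [list(p) for p in permutations(items)]
```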

When an algorithm involves several operations of different complexities, take the worst one; that dominant term is your algorithm's overall complexity.
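
For instance, a function with two sequential phases of different cost (a hypothetical example) is classified by its worst phase:

```python
def search_then_sort(arr, target):
    # Phase 1: membership check scans the list -> O(n).
    found = target in arr
    # Phase 2: built-in sort -> O(n log n).
    ordered = sorted(arr)
    # Overall: O(n) + O(n log n); the dominant term wins -> O(n log n).
    return found, ordered
```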

- **Constant O(1)**: Accessing an element in an array by index.
- **Linear O(n)**: Linear Search.
- **Logarithmic O(log n)**: Binary Search.
- **Linearithmic O(n log n)**: Heap Sort, Merge Sort.
- **Polynomial O(n^c)**: Bubble Sort, Selection Sort, Insertion Sort, Bucket Sort.
- **Exponential O(c^n)**: Tower of Hanoi.
- **Factorial O(n!)**: Permutations of a string, brute-force search for the Traveling Salesman Problem.

- Single and Nested Loops
- Decrease and Conquer, Divide and Conquer, Transform and Conquer
- Binary Search
- Two Pointers and Sliding Window
- Breadth First Search and Depth First Search
- Data Structures: Stack, Queue, Hash Table, Trie, Heap, BST, etc.
- Dynamic Programming
- Greedy Algorithms