What is Big O?
Big O notation is a mathematical notation that describes the limiting behavior of a function when the argument tends towards a particular value or infinity. In computer science, big O notation is used to classify algorithms according to how their run time or space requirements grow as the input size grows. In analytic number theory, big O notation is often used to express a bound on the difference between an arithmetical function and a better-understood approximation; a famous example of such a difference is the remainder term in the prime number theorem. Big O notation is also used in many other fields to provide similar estimates.
The general procedure for Big-O runtime analysis is as follows:
- Figure out what the input is and what n represents.
- Express the maximum number of operations the algorithm performs in terms of n.
- Eliminate all but the highest-order term.
- Remove all the constant factors.
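As a sketch, here is the procedure above applied to a small hypothetical function (the name `count_pairs` is just an illustration, not from any particular library):

```python
def count_pairs(items):
    """Count ordered pairs of elements with different values. Here n = len(items)."""
    count = 0
    for i in items:          # outer loop: n iterations
        for j in items:      # inner loop: n iterations for each outer iteration
            if i != j:       # constant-time comparison and increment
                count += 1
    return count
```

The input is a list, so n is its length. The nested loops perform roughly n × n constant-time comparisons, plus a handful of setup operations: about n² + c in total. Dropping the lower-order terms and constant factors leaves O(n²).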
In most cases, we measure and compare the worst-case theoretical running-time complexity of algorithms for performance analysis.
The fastest possible running time for any algorithm is O(1), commonly referred to as Constant Running Time. In this case, the algorithm always takes the same amount of time to execute, regardless of the input size. This is the ideal runtime for an algorithm, but it’s rarely achievable.
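A minimal illustration of constant time (the function name is hypothetical): however long the list is, the work done is the same.

```python
def get_first(items):
    """O(1): a single indexing operation, regardless of len(items)."""
    return items[0]
```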
In practice, the performance (runtime) of an algorithm depends on n, that is, the size of the input and the number of operations required per input item.
Algorithms can be classified from best to worst as follows:
1. A logarithmic algorithm — O(log n)
Runtime grows logarithmically in proportion to n.
2. A linear algorithm — O(n)
Runtime grows directly in proportion to n.
3. A superlinear algorithm — O(n log n)
Runtime grows in proportion to n log n.
4. A polynomial algorithm — O(n^c)
Runtime grows faster than all of the above, as a polynomial in n (for c > 1).
5. An exponential algorithm — O(c^n)
Runtime grows even faster than any polynomial algorithm as n increases.
6. A factorial algorithm — O(n!)
Runtime grows the fastest and quickly becomes unusable for even small values of n.
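To make the classes above concrete, here are sketches of one representative algorithm for a few of them (standard textbook implementations, written out for illustration):

```python
def binary_search(sorted_items, target):
    """O(log n): halves the remaining search range on every step."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        if sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1  # not found

def linear_sum(items):
    """O(n): touches each element exactly once."""
    total = 0
    for x in items:
        total += x
    return total

def merge_sort(items):
    """O(n log n): log n levels of splitting, with an O(n) merge per level."""
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left, right = merge_sort(items[:mid]), merge_sort(items[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]

def subsets(items):
    """O(2^n): enumerates every subset of the input."""
    if not items:
        return [[]]
    rest = subsets(items[1:])
    return rest + [[items[0]] + s for s in rest]
```

Doubling the input barely slows `binary_search`, doubles the work of `linear_sum`, and squares the output size of `subsets`, which is why the exponential class becomes unusable so quickly.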
Thank you for reading, I hope you learned something, and have a great day!