Time complexity is a measure of the efficiency of algorithms. It tells us how the time taken by an algorithm grows as the size of its input grows.
Let's understand this concept with the help of an example:
Consider two developers, Shubham and Rohan, who independently created algorithms to sort 'n' numbers. When I ran their programs for a few input sizes n, the following results were recorded:
| No. of elements (n) | Time Taken by Shubham's Algo | Time Taken by Rohan's Algo |
|---------------------|------------------------------|----------------------------|
| 10                  | 90 ms                        | 122 ms                     |
| 70                  | 110 ms                       | 124 ms                     |
| 110                 | 180 ms                       | 131 ms                     |
| 1000                | 2 s                          | 800 ms                     |
We can see that Shubham's algorithm works well for smaller inputs; however, as the number of elements increases, Rohan's algorithm performs much better.
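Shubham's and Rohan's actual code is not shown, but the rough shape of such an experiment can be sketched as below. Here a quadratic bubble sort and Python's built-in sort stand in for the two hypothetical algorithms, and each is timed for growing values of n:

```python
import random
import time

def bubble_sort(values):
    # A simple quadratic sort: repeatedly swap adjacent out-of-order elements.
    arr = list(values)
    for i in range(len(arr)):
        for j in range(len(arr) - 1 - i):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
    return arr

for n in (10, 70, 110, 1000):
    data = [random.randint(0, 10_000) for _ in range(n)]

    start = time.perf_counter()
    bubble_sort(data)
    slow = time.perf_counter() - start

    start = time.perf_counter()
    sorted(data)  # Python's built-in sort (Timsort)
    fast = time.perf_counter() - start

    print(f"n = {n}: bubble sort took {slow:.6f} s, built-in sort took {fast:.6f} s")
```

The exact numbers will differ from machine to machine; what matters is how each timing grows as n grows.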
To calculate the order (time complexity), only the most impactful term containing n is taken into account (here, n refers to the size of the input); the rest of the smaller terms are ignored. Let us assume the following formulas for the time taken by the two algorithms in terms of the input size n:
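The original formulas are not reproduced here, so as an illustration (these exact coefficients are assumed, not taken from the measurements above), they might look something like:

Shubham's algo: T1(n) = n^2 + 3n + 4
Rohan's algo: T2(n) = c, where c is a constant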
Here, we ignored the smaller terms in algo 1 and kept only the most impactful term, which was the square of the input size; hence its time complexity became n^2. The second algorithm's formula, having no term that grows with n, followed just a constant time complexity.
Note that these are the formulas for the time taken by their programs, not the time complexities themselves.
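To see why the smaller terms can be ignored, here is a quick sketch using the hypothetical formula T1(n) = n^2 + 3n + 4 from above: as n grows, the n^2 term accounts for almost all of the total.

```python
def exact_time(n):
    # Hypothetical running-time formula: an n^2 term plus smaller terms.
    return n**2 + 3 * n + 4

for n in (10, 100, 1000, 10_000):
    dominant = n**2
    share = dominant / exact_time(n)
    print(f"n = {n}: full formula = {exact_time(n)}, n^2 alone = {dominant}, "
          f"n^2 share = {share:.4f}")
```

The share of the n^2 term approaches 1 as n grows, which is exactly why only the dominant term survives in the order.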
Putting it simply, big O stands for 'order of' in our industry, but this usage is a little different from the mathematical definition of big O. In mathematics, big O describes any upper bound on the growth of our program's running time, so a single program has many valid big O bounds. In industry, when we are asked for the time complexity, we are expected to give the tightest (smallest) of those bounds. It is a subtle difference.
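As a small illustration (a hypothetical linear search, not taken from the example above), the function below runs in time proportional to n. Mathematically it is O(n) but also O(n^2), O(n^3) and so on; in industry we would simply report the tightest bound, O(n).

```python
def contains(items, target):
    # Scans the list once, so the work grows linearly with len(items).
    for item in items:
        if item == target:
            return True
    return False

# Mathematically, O(n), O(n^2) and O(2^n) are all valid upper bounds here.
# In industry, this function's "time complexity" means the tightest one: O(n).
print(contains([3, 1, 4, 1, 5], 4))  # True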
If we were to plot O(1) and O(n) on a graph, they would look something like this.
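The original graph is not reproduced here; a minimal matplotlib sketch (assuming matplotlib is installed) produces similar curves, with the O(1) line staying flat as n grows and the O(n) line rising steadily.

```python
import matplotlib.pyplot as plt

n = list(range(1, 101))
constant = [1] * len(n)  # O(1): cost does not depend on the input size
linear = n               # O(n): cost grows in direct proportion to n

plt.plot(n, constant, label="O(1)")
plt.plot(n, linear, label="O(n)")
plt.xlabel("Input size (n)")
plt.ylabel("Operations")
plt.legend()
plt.show()
```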