Let's say we have a problem of size n. At each step of our algorithm (which we need to write), the problem shrinks to half of its previous size (n/2).
| Step | Problem size |
| ---- | ------------ |
| 1    | n/2          |
| 2    | n/4          |
| 3    | n/8          |
| 4    | n/16         |
The reduction stops when the problem cannot be halved any further, i.e. when the problem size reaches 1 and the check condition exits the loop. At that point:
problem-size = 1

After k steps, problem-size = n/2<sup>k</sup>, so:

n/2<sup>k</sup> = 1

or n = 2<sup>k</sup>

Taking logarithms on both sides:

log<sub>e</sub> n = k log<sub>e</sub> 2

or k = log<sub>e</sub> n / log<sub>e</sub> 2

k = log<sub>2</sub> n

or simply k = log n
So the algorithm runs for at most log n steps, and hence its time complexity is

O(log n)
A very simple code example that matches the analysis above:

for (int i = 1; i <= n; i = i * 2)
{
    // i doubles each iteration: 1, 2, 4, 8, ...
    // so the body runs about log2(n) times
}