Please show your work and explain how you got the answer. Thank you.
For each of the following program fragments, give an analysis of the running time (Big-Oh will do).
Give the exact value leading to the Big-Oh conclusion, and show your work.
1.)
sum = 0; //1
for( i = 0; i < n; ++i )
    ++sum;
2.)
sum = 0; //2
for( i = 0; i < n; ++i )
    for( j = 0; j < n; ++j )
        ++sum;
Solution
1. We break the code down into individual execution steps.
Then we determine how many times each step is executed:
sum = 0 -> 1
i=0 -> 1
i<n -> n + 1
++i -> n
++sum -> n
So the total number of steps executed = 1 + 1 + (n + 1) + n + n = 3n + 3.
We can drop the constant term 3, since its contribution does not grow with n. That leaves 3n.
Finally, Big-Oh ignores constant factors: 3n and n both grow linearly, so we approximate and state the complexity in standard form as O(n).
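As an optional sanity check (not part of the required answer), here is a small C++ sketch that instruments fragment 1 with a hypothetical counter variable steps and an arbitrary sample size n = 10. The for loop is rewritten as a while loop so the test i < n can be counted as well; the program prints 3n + 3 = 33 for n = 10.

#include <iostream>

int main()
{
    int n = 10;                        // arbitrary sample size (assumption)
    long steps = 0;                    // hypothetical counter, not in the original fragment
    long sum;

    sum = 0;   ++steps;                // sum = 0 executes once
    int i = 0; ++steps;                // i = 0 executes once
    while ( ++steps, i < n )           // the test i < n runs n + 1 times
    {
        ++sum; ++steps;                // ++sum runs n times
        ++i;   ++steps;                // ++i runs n times
    }

    std::cout << steps << std::endl;   // prints 33, i.e. 3n + 3 for n = 10
    return 0;
}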
2. As in part 1, we divide the code into individual execution steps.
Then we determine how many times each step is executed:
sum = 0 -> 1
i=0 -> 1
i < n -> n+1
++i -> n
j = 0 -> n
j < n -> n * (n+1)
++j -> n*n
++sum -> n*n
So the total number of execution steps = 1 + 1 + (n + 1) + n + n + n * (n + 1) + n*n + n*n = 3*n^2 + 4*n + 3.
Dropping the constant term 3 (its contribution does not grow with n) leaves 3*n^2 + 4*n.
Of the two remaining terms, as n approaches infinity 3*n^2 becomes much larger than 4*n, so we drop 4*n and are left with 3*n^2.
Finally, 3*n^2 grows at the same quadratic rate as n^2, so we approximate and state the final complexity in standard form as O(n^2).
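Again as an optional sanity check, the following C++ sketch instruments fragment 2 the same way, using a hypothetical counter variable steps and the sample value n = 10. It prints 3n^2 + 4n + 3 = 343 for n = 10, matching the count above.

#include <iostream>

int main()
{
    int n = 10;                        // arbitrary sample size (assumption)
    long steps = 0;                    // hypothetical counter, not in the original fragment
    long sum;

    sum = 0;   ++steps;                // sum = 0 executes once
    int i = 0; ++steps;                // i = 0 executes once
    while ( ++steps, i < n )           // the test i < n runs n + 1 times
    {
        int j = 0; ++steps;            // j = 0 runs n times
        while ( ++steps, j < n )       // the test j < n runs n * (n + 1) times
        {
            ++sum; ++steps;            // ++sum runs n * n times
            ++j;   ++steps;            // ++j runs n * n times
        }
        ++i; ++steps;                  // ++i runs n times
    }

    std::cout << steps << std::endl;   // prints 343, i.e. 3n^2 + 4n + 3 for n = 10
    return 0;
}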

