The two parts of Question 1 test the basics of asymptotic analysis. All logarithms in this question are to the base 2.

1) Order the following functions by increasing growth rates. Justify whenever needed.

   n^(3/2),  2n(log n)^2,  sqrt(log n),  n^0.5,  2^sqrt(n),  100

2) Compute the running time of the following algorithm in terms of n using O-notation. Show your calculations.

   Algorithm Loop(n)
       s ← 0
       for i ← 1 to 2n do
           for j ← 1 to i do
               s ← s + i

Solution

1)

n^(3/2) = n^1.5

2n(log n)^2
Compare it with n^(3/2):
    n^(3/2) vs 2n(log n)^2
    n^(1/2) vs 2(log n)^2   -> divide both sides by n
    polynomial vs polylogarithmic -> a fixed power of n eventually outgrows any power of log n, hence n^(3/2) is bigger

sqrt(log n) -> unbounded, but grows slower than any power of n
n^0.5 = n^(1/2)
2^sqrt(n) -> grows faster than any polynomial, so this is the highest of the lot
100 -> constant, so its growth rate is the smallest of the lot

Hence it becomes

100 < sqrt(log n) < n^0.5 < 2n(log n)^2 < n^(3/2) < 2^sqrt(n)
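
A quick numeric sanity check (a minimal Python sketch, not part of the required justification): compare log2 of each function at one very large n = 2^k. Taking logs preserves the order, and working with k directly keeps the numbers small enough to evaluate; the value k = 10^6 below is an arbitrary choice.

import math

k = 10 ** 6  # n = 2^k, so log2(n) = k

log2_of = {
    "100": math.log2(100),
    "sqrt(log n)": 0.5 * math.log2(k),        # log2(sqrt(log n)) = (1/2) log2(log2 n)
    "n^0.5": k / 2,                           # log2(n^0.5) = (1/2) log2 n
    "2n(log n)^2": 1 + k + 2 * math.log2(k),  # log2(2n(log n)^2) = 1 + log2 n + 2 log2(log2 n)
    "n^(3/2)": 1.5 * k,                       # log2(n^(3/2)) = (3/2) log2 n
    "2^sqrt(n)": 2 ** (k // 2),               # log2(2^sqrt(n)) = sqrt(n) = 2^(k/2)
}

ordered = sorted(log2_of, key=lambda name: log2_of[name])
print(" < ".join(ordered))
# prints: 100 < sqrt(log n) < n^0.5 < 2n(log n)^2 < n^(3/2) < 2^sqrt(n)

This only illustrates the ordering at one large n; it does not replace the asymptotic argument above.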

2) Running time

Algorithm Loop(n)
    s ← 0
    for i ← 1 to 2n do
        for j ← 1 to i do
            s ← s + i

The body s ← s + i takes constant time, so the running time is proportional to the number of times the body executes.

For a fixed value of i, the inner loop for j ← 1 to i runs exactly i times.

The outer loop takes i from 1 to 2n, so the total number of body executions is

1 + 2 + ... + 2n = 2n(2n + 1)/2 = 2n^2 + n

Hence the running time of the algorithm is

O(2n^2 + n) = O(n^2)
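
For illustration only, a minimal Python translation of the pseudocode above (not part of the required answer) can count the inner-loop executions directly and confirm the 2n^2 + n total:

def loop(n):
    # Direct translation of Algorithm Loop(n); also counts inner-loop body executions.
    s = 0
    steps = 0
    for i in range(1, 2 * n + 1):    # i <- 1 to 2n
        for j in range(1, i + 1):    # j <- 1 to i
            s = s + i                # s <- s + i  (constant-time body)
            steps += 1
    return s, steps

for n in (10, 100, 1000):
    _, steps = loop(n)
    print(n, steps, 2 * n * n + n)   # steps equals 2n^2 + n for every n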
