Suppose a machine on average takes 10^-8 seconds to execute a single algorithm step.
Solution
Hello. I will answer the two questions in turn.
1. The first question states that the machine takes an average of 10^-8 seconds to execute a single algorithm step, so it can execute 10^8 steps per second, or 2 x 10^8 steps in 2 seconds. If the algorithm takes T(n) = log n steps, then the largest input size n that the machine can handle in 2 seconds satisfies log n <= 2 x 10^8, that is, n <= 2^(2 x 10^8) (see the quick check after this list).
2. Next, we have to find the time taken for the machine to run the algorithm on an input of size 1000, again assuming the time complexity T(n) = log n.
Here, the input size is n = 1000, so the algorithm takes log 1000 (base 2) ~ 9.97 steps. Since each step takes 10^-8 seconds, the total running time is about 9.97 x 10^-8 s, roughly 10^-7 seconds. This is far less than one second: in one second the machine can execute 10^8 steps, so it could handle any input size up to 2^(10^8).
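As a quick sanity check, here is a small Python sketch that reproduces both numbers. It is only an illustration under the assumptions stated above (10^-8 seconds per step, base-2 logarithm); the names STEPS_PER_SECOND and budget_steps are just placeholders I introduced for clarity.

```python
import math

# Assumption: each step takes 10^-8 seconds, so the machine runs 10^8 steps per second.
STEPS_PER_SECOND = 10**8

# Part 1: largest n solvable in 2 seconds when T(n) = log2(n).
# log2(n) <= 2 * 10^8  =>  n <= 2^(2 * 10^8)
budget_steps = 2 * STEPS_PER_SECOND
print("largest n = 2^" + str(budget_steps))   # 2^(2 x 10^8); far too large to print in full

# Part 2: time to run the algorithm on an input of size 1000.
steps = math.log2(1000)                       # about 9.97 steps
seconds = steps / STEPS_PER_SECOND            # about 9.97e-8 s, i.e. roughly 10^-7 seconds
print("T(1000) = {:.2f} steps, time = {:.2e} s".format(steps, seconds))
```

Running this prints the symbolic bound 2^200000000 for part 1 and a running time of about 1.0e-07 seconds for part 2, matching the hand calculations above.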
Thank you.
