Suppose a machine on average takes 10^-8 seconds to execute a single algorithm step.

1. What is the largest input size for which the machine will execute the algorithm in 2 seconds, assuming the number of steps of the algorithm is:
   a. log n
   b. sqrt(n)
   c. n
   d. n^2
   e. n^3
   f. 2^n

2. For the machine in the previous part, how long will it take to run the algorithm for an input of size 1,000, assuming the time complexities from the same part?

Solution

Hello. I will answer each question in turn.

1. The machine takes an average of 10^-8 seconds to execute a single algorithm step, so in 2 seconds it can execute 2 / 10^-8 = 2 x 10^8 steps. The largest input size is therefore the largest n with T(n) <= 2 x 10^8:

   a. T(n) = log n: log2(n) <= 2 x 10^8, so n <= 2^(2 x 10^8), an astronomically large bound
   b. T(n) = sqrt(n): n <= (2 x 10^8)^2 = 4 x 10^16
   c. T(n) = n: n <= 2 x 10^8
   d. T(n) = n^2: n <= sqrt(2 x 10^8) ≈ 14,142
   e. T(n) = n^3: n <= (2 x 10^8)^(1/3) ≈ 584
   f. T(n) = 2^n: n <= log2(2 x 10^8) ≈ 27
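These bounds are easy to verify numerically. Below is a minimal Python sketch; the names STEP_TIME and STEPS are my own, not part of the question, and logarithms are taken base 2.

```python
import math

STEP_TIME = 1e-8              # seconds per algorithm step
STEPS = round(2 / STEP_TIME)  # steps available in 2 seconds = 2 x 10^8

# Largest input size n with T(n) <= STEPS, found by inverting each T.
print("log n :", f"2**{STEPS} (too large to print)")  # n <= 2^(2 x 10^8)
print("sqrt n:", STEPS ** 2)                          # 4 x 10^16
print("n     :", STEPS)                               # 2 x 10^8
print("n^2   :", math.isqrt(STEPS))                   # 14,142
print("n^3   :", int(STEPS ** (1 / 3)))               # 584
print("2^n   :", math.floor(math.log2(STEPS)))        # 27
```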

2. Next, we find how long the machine takes to run the algorithm for an input of size n = 1,000 under the same complexities. The running time is the number of steps T(1000) multiplied by 10^-8 seconds per step:

   a. T(n) = log n: log2(1000) ≈ 9.97 steps, about 10^-7 seconds
   b. T(n) = sqrt(n): sqrt(1000) ≈ 31.6 steps, about 3.2 x 10^-7 seconds
   c. T(n) = n: 1,000 steps, 10^-5 seconds
   d. T(n) = n^2: 10^6 steps, 10^-2 seconds
   e. T(n) = n^3: 10^9 steps, 10 seconds
   f. T(n) = 2^n: 2^1000 ≈ 1.07 x 10^301 steps, about 1.07 x 10^293 seconds, far longer than the age of the universe
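The same computation can be sketched in Python under the same assumed step cost; note that 2**1000 is evaluated exactly as a big integer before being scaled to seconds.

```python
import math

STEP_TIME = 1e-8  # seconds per algorithm step (from the question)
n = 1000

# Number of steps for each complexity at n = 1000.
step_counts = {
    "log n":  math.log2(n),  # ~9.97
    "sqrt n": math.sqrt(n),  # ~31.6
    "n":      n,             # 10^3
    "n^2":    n ** 2,        # 10^6
    "n^3":    n ** 3,        # 10^9
    "2^n":    2 ** n,        # ~1.07 x 10^301
}

for name, steps in step_counts.items():
    print(f"{name:6s}: {steps * STEP_TIME:.2e} seconds")
```

Running this shows that at this step cost only the sub-cubic complexities finish within the 2-second budget for n = 1,000.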

Thank you.
