Suppose three algorithms are run on a machine that can execute one instruction every microsecond (10^-6 seconds). They require, respectively, n log2(n), n log2(n) + 5n, and n^2 operations for a data set of size n. Compare the time the algorithms require for data sets of size 100, 100,000, and 100,000,000. Comment on any patterns you see in your table of results.

Solution

Table of the time taken by each algorithm, in seconds (time = number of operations × 10^-6 s):

n            n log2(n)     n log2(n)+5n   n^2
100          0.000664      0.001164       0.01
100,000      1.660964      2.160964       10,000
100,000,000  2657.542      3157.542       1E+10

1) For every input size, n^2 takes the most time and n log2(n) takes the least.

2) As the input size increases, the time grows fastest for n^2: going from n = 100 to n = 100,000,000 multiplies the n^2 time by 10^12, while the n log2(n) time grows by only about a factor of 4 × 10^6.
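As a quick check, the table can be reproduced with a short Python sketch (the function and variable names here are illustrative, not part of the original exercise), assuming one instruction per microsecond as stated in the problem:

```python
import math

MICROSECOND = 1e-6  # one instruction per microsecond => seconds per operation

def times(n):
    """Return running times in seconds for the three operation counts."""
    return (
        n * math.log2(n) * MICROSECOND,             # n log2(n)
        (n * math.log2(n) + 5 * n) * MICROSECOND,   # n log2(n) + 5n
        (n ** 2) * MICROSECOND,                     # n^2
    )

for n in (100, 100_000, 100_000_000):
    t1, t2, t3 = times(n)
    print(f"{n:>11,} {t1:14.6f} {t2:14.6f} {t3:14.3e}")
```

Running it prints the same values as the table above, confirming the arithmetic.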