(a) Algorithm A solves problems of size n (a large positive integer that is a power of 2) by dividing them into 5 subproblems of size n/2, recursively solving each subproblem, and then combining the solutions in linear time. (b) Algorithm B solves problems of size n (a large positive integer) by recursively solving 2 subproblems of size n - 1 and then combining the solutions in constant time. (c) Algorithm C solves problems of size n (a large positive integer that is a power of 3) by dividing them into 9 subproblems of size n/3, recursively solving each subproblem, and then combining the solutions in O(n^2) time. What are the running times of each of these algorithms (in big-O notation), and which would you choose? We assume T(1) = 1 for all three algorithms.
Solution
Writing the recurrence relation for each algorithm:
 Algorithm A: 5 subproblems of size n/2, combined in linear time, so
 
 A : T(n) = 5T(n/2) + n
 
 Algorithm B: 2 subproblems of size n - 1, combined in constant time, so
 
 B : T(n) = 2T(n-1) + 1
 
 Algorithm C: 9 subproblems of size n/3, combined in quadratic time, so
 
 C : T(n) = 9T(n/3) + n^2
 ==================================================
 
 Running time for A : T(n) = 5T(n/2) + n
 
 Using the Master Theorem: a = 5, b = 2, so n^(log_2 5) ≈ n^2.32. Since f(n) = n is polynomially smaller than n^(log_2 5), this falls into Case 1:
 
 T(n) = O(n^(log_2 5)) ≈ O(n^2.32)
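As a sanity check on the Case 1 result, a short Python sketch (function name t_a is my own) evaluates the recurrence directly and confirms that T(n)/n^(log_2 5) settles toward a constant, as Theta(n^(log_2 5)) predicts:

```python
import math

def t_a(n):
    # Recurrence for Algorithm A: T(n) = 5*T(n/2) + n, with T(1) = 1
    # (n is assumed to be a power of 2, as in the problem statement)
    if n == 1:
        return 1
    return 5 * t_a(n // 2) + n

# T(n) / n^(log2 5) should approach a constant for large n,
# consistent with T(n) = Theta(n^(log2 5)) ~ Theta(n^2.32)
for k in (10, 15, 20):
    n = 2 ** k
    print(n, t_a(n) / n ** math.log2(5))
```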
 
 
 
 
 
 
 B : T(n) = 2T(n-1) + 1
 
 Unrolling: T(n) = 2T(n-1) + 1 = 4T(n-2) + 2 + 1 = ... = 2^(n-1) T(1) + (2^(n-1) - 1) = 2^n - 1, so
 
 T(n) = O(2^n)
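The closed form 2^n - 1 can be checked by evaluating the recurrence for small n (the function name t_b is my own):

```python
def t_b(n):
    # Recurrence for Algorithm B: T(n) = 2*T(n-1) + 1, with T(1) = 1
    return 1 if n == 1 else 2 * t_b(n - 1) + 1

# The recurrence matches the closed form 2^n - 1 exactly
for n in range(1, 11):
    assert t_b(n) == 2 ** n - 1
```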
 
 
 
 C : T(n) = 9T(n/3) + n^2
 
 Using the Master Theorem: a = 9, b = 3, so n^(log_3 9) = n^2. Since f(n) = n^2 = Θ(n^(log_3 9)), Case 2 applies:
 
 T(n) = O(n^2 log n)
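For n a power of 3, the recurrence for C has the exact solution T(n) = n^2 (log_3 n + 1), which is Theta(n^2 log n). A small Python sketch (function name t_c is my own) verifies this:

```python
def t_c(n):
    # Recurrence for Algorithm C: T(n) = 9*T(n/3) + n^2, with T(1) = 1
    # (n is assumed to be a power of 3, as in the problem statement)
    if n == 1:
        return 1
    return 9 * t_c(n // 3) + n * n

# For n = 3^k the exact solution is T(n) = n^2 * (k + 1),
# i.e. n^2 * (log3(n) + 1) = Theta(n^2 log n)
for k in (5, 10, 15):
    n = 3 ** k
    assert t_c(n) == n * n * (k + 1)
```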
 
 
So we have all three complexities:
 
 A : T(n) = O(n^(log_2 5)) ≈ O(n^2.32),  B : T(n) = O(2^n),  C : T(n) = O(n^2 log n)
 
And which would you choose? I would choose C, with T(n) = O(n^2 log n), because n^2 log n grows asymptotically more slowly than both n^2.32 and 2^n.
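To make the comparison concrete, here is a quick numerical check at one (arbitrarily chosen) size n; B's 2^n already dwarfs both polynomial bounds for any n beyond a few dozen, so only A and C are compared:

```python
import math

# Compare the polynomial bounds for A and C at a moderate size n.
n = 2 ** 20
bound_a = n ** math.log2(5)      # ~ n^2.32
bound_c = n * n * math.log2(n)   # n^2 log n
print(bound_c < bound_a)         # n^2 log n is the smaller bound: prints True
```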
 
 
 Thanks, let me know if there is any concern.
 
 


