Determining the running time of an algorithm


I've been trying to determine the asymptotic running time of the following made-up algorithm.

Both algorithms, ALG-1 and ALG-2, take as input an array with n entries and return an array with n entries.

ALG-1 is:

ALG-1(A[1,...,n])
1: B = ALG-1(A[1,...,⌊n/2⌋])
2: C = ALG-2(A[⌊n/2⌋+1,...,n])
3: return ALG-2(B, C)
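
For concreteness, here is a minimal Python sketch of the structure I have in mind. The body of alg2 is a made-up placeholder, since ALG-2's internals aren't specified; only its Θ(√n) running time and the fact that it returns an array matter.

import math

def alg2(arr):
    # Placeholder: the real ALG-2 is unspecified; all that matters here is
    # that it does on the order of sqrt(n) work and returns an array.
    for _ in range(math.isqrt(len(arr))):
        pass  # stand-in for the Θ(√n) work
    return list(arr)

def alg1(arr):
    n = len(arr)
    if n <= 1:
        return list(arr)
    mid = n // 2
    b = alg1(arr[:mid])       # line 1: recurse on the first half
    c = alg2(arr[mid:])       # line 2: ALG-2 on the second half
    return alg2(b + c)        # line 3: ALG-2 on the concatenation of B and C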

I'd like to determine the asymptotic running time of ALG-1, given that the running time f(n) of ALG-2 is Θ(√n).

I'm trying to set up the recurrence for the algorithm and then solve it via the Master theorem, but I don't know how to start. I'd be happy if someone could help me out a bit with that.

BEST ANSWER

Let T_k denote the time it takes to run ALG-1 on an input of size n = 2^k. I'm going to assume that your ALG-2(B, C) means ALG-2 run on the concatenation of B and C, which has length n. Then it looks like you have the recurrence

T_k = T_{k - 1} + sqrt(n / 2) + sqrt(n) = T_{k - 1} + c * sqrt(2^k),

where c is a positive constant (written in lowercase to avoid confusion with the array C). If you keep expanding that out by writing T_{k - 1} in terms of T_{k - 2}, and so on, you get

T_k = c * sqrt(2^k) + c * sqrt(2^{k - 1}) + c * sqrt(2^{k - 2}) + ... + c * sqrt(2) + T_0.

This is a geometric series: each term is 1/sqrt(2) times the previous one, so the whole sum is at most c * sqrt(2^k) / (1 - 1/sqrt(2)) + T_0, i.e. a constant times the leading term (basically, the first term is about as large as the sum of all the remaining terms). So T_k is of the order sqrt(2^k), which is sqrt(n), and ALG-1 runs in Θ(√n). Equivalently, in terms of n the recurrence reads T(n) = T(n/2) + Θ(√n), and the Master theorem with a = 1, b = 2, f(n) = Θ(n^{1/2}) (so n^{log_b a} = n^0 = 1 grows strictly slower than f) gives T(n) = Θ(√n) as well.
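
If you want to sanity-check this numerically, here is a small Python sketch under the simplifying assumption that the two ALG-2 calls cost exactly sqrt(n/2) and sqrt(n); it iterates the recurrence and compares T_k to sqrt(n):

import math

def T(k):
    # Iterate T_k = T_{k-1} + sqrt(2^k / 2) + sqrt(2^k), with T_0 = 1 as an
    # arbitrary base cost.
    t = 1.0
    for i in range(1, k + 1):
        n = 2 ** i
        t += math.sqrt(n / 2) + math.sqrt(n)
    return t

for k in (10, 14, 18, 22):
    print(k, T(k) / math.sqrt(2 ** k))  # ratio should settle near a constant

The printed ratio levels off (around 5.8 with these exact costs) instead of growing, which is what T_k = Θ(sqrt(n)) predicts.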