I've been trying to determine the asymptotic running time of the following made-up algorithm.
Both algorithms, ALG-1 and ALG-2, take as input and produce as output an array with n entries.
ALG-1 would be:
ALG-1(A[1,...,n])
1: B = ALG-1(A[1,...,⌊n/2⌋])
2: C = ALG-2(A[⌊n/2⌋+1,...,n])
3: return ALG-2(B, C)
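In Python terms, the structure I have in mind is roughly the following (just a sketch to illustrate the call pattern: alg2 is a placeholder for the unspecified ALG-2, and the base case is my own assumption, since the pseudocode doesn't spell one out):

def alg2(arr):
    # placeholder for ALG-2; only its Θ(√n) running time matters here
    return arr

def alg1(arr):
    n = len(arr)
    if n <= 1:                  # assumed base case
        return arr
    b = alg1(arr[:n // 2])      # line 1: recurse on the first half
    c = alg2(arr[n // 2:])      # line 2: ALG-2 on the second half
    return alg2(b + c)          # line 3: ALG-2 on the concatenation of B and C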
I'd like to determine the asymptotic running time of ALG-1, given that the running time f(n) of ALG-2 on an input of size n is Θ(√n).
I'm trying to set up the recurrence for the algorithm and then solve it via the Master theorem, but I don't even know how to start. I'd be happy if someone could help me out a bit with that.
Let T_k denote the time it takes to run ALG-1 on input of size n = 2^k. I'm going to assume that your ALG-2(B, C) means ALG-2 run on the concatenation of B and C, which is of length n. Then, it looks like you have the recurrence

T_k = T_{k-1} + C·sqrt(2^k),

where C is a constant (the two calls to ALG-2 take time Θ(sqrt(2^{k-1})) and Θ(sqrt(2^k)), and both are absorbed into the C·sqrt(2^k) term). If you keep expanding that out by writing T_{k-1} in terms of T_{k-2}, etc., you get

T_k = C·(sqrt(2^k) + sqrt(2^{k-1}) + ... + sqrt(2)) + T_0.

This is a geometric series, so the sum is of the same order as the leading term (basically, the first term is about as large as the sum of all the remaining terms). So T_k is of the order sqrt(2^k), or sqrt(n).
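As a quick numerical sanity check (just an illustration, with an arbitrary choice of C = 1 and T_0 = 1), you can iterate the recurrence and watch T_k / sqrt(2^k) settle toward a constant, which is consistent with T_k being of order sqrt(n):

from math import sqrt

# Iterate T_k = T_{k-1} + C * sqrt(2^k) with C = 1 and T_0 = 1,
# and print the ratio T_k / sqrt(2^k); it approaches a constant,
# so T_k = Θ(sqrt(2^k)) = Θ(sqrt(n)) for these choices.
C = 1.0
T = 1.0
for k in range(1, 31):
    T += C * sqrt(2 ** k)
    print(k, T / sqrt(2 ** k))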