Asymptotic run time complexity of an expression


Can I say that:

log n + log(n-1) + log(n-2) + ... + log(n-k) = Θ(k log n)?

More formally:

Σ_{i=0}^{k} log(n-i) = Θ(k log n)?

If the above statement is right, how can I prove it?

If it is wrong, how can I express the left-hand side as an asymptotic run-time function of n and k?
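
For what it's worth, here is a quick numerical sanity check (a sketch in Python; sum_of_logs is just a helper name for the left-hand side above):

    import math

    # Compare the left-hand sum against k * log(n) for a fixed k and growing n.
    def sum_of_logs(n, k):
        # log(n) + log(n-1) + ... + log(n-k)
        return sum(math.log(n - i) for i in range(k + 1))

    k = 10
    for n in (10**3, 10**6, 10**9):
        print(f"n={n}: LHS = {sum_of_logs(n, k):.2f}, k*log(n) = {k * math.log(n):.2f}")

The two values stay within a constant factor of each other, so the claim at least looks plausible for fixed k.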

Thanks.


Answer 1:

Denote:

LHS = log(n) + log(n-1) + ... + log(n-k)

RHS = k * log n

Note that:

LHS = log(n(n-1)...(n-k)) = log(a polynomial in n of degree k+1)

It follows that this is equal to:

(k+1) * log(n * (1 + terms that vanish in the limit))

Dividing this by RHS:

(k+1) * log(n * (1 + terms that vanish in the limit)) / (k * log n)

we get, in the limit as n → ∞:

(k+1)/k = 1 + 1/k

So if k is a constant, both sides grow equally fast, so LHS = Θ(RHS).
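
A quick numerical sketch of that limit (Python 3.8+ for math.prod; natural log, though the base only changes a constant factor):

    import math

    # For fixed k, LHS / (k * log n) should approach (k+1)/k as n grows.
    def ratio(n, k):
        lhs = math.log(math.prod(n - i for i in range(k + 1)))
        return lhs / (k * math.log(n))

    k = 5
    for n in (10**2, 10**5, 10**8):
        print(f"n={n}: ratio = {ratio(n, k):.4f}, (k+1)/k = {1 + 1/k:.4f}")

The printed ratio creeps toward 1.2 = (k+1)/k, matching the limit above.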

Wolfram Alpha seems to agree.

When n is constant instead, the terms that previously vanished in the limit no longer disappear; instead you get:

((k+1) * c1) / (k * c2)

for some constants c1 and c2, which is:

(1 + 1/k) * (c1/c2). So again LHS = Θ(RHS).

Answer 2:

When proving Θ, you want to prove O and Ω.

The upper bound is easy (the product has k+1 factors, each at most n):
log(n(n-1)...(n-k)) ≤ log(n^(k+1)) = (k+1) log n = O(k log n)

For the lower bound, if k ≥ n/2, then the product contains at least n/2 terms greater than n/2:
log(n(n-1)...(n-k)) ≥ (n/2) log(n/2) = Ω(n log n) = Ω(k log n), since k ≤ n

and if k ≤ n/2, all terms are at least n/2:
log(n(n-1)...(n-k)) ≥ log((n/2)^k) = k log(n/2) = Ω(k log n)
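
A small numerical check of both bounds with these explicit constants (a sketch; it assumes k ≥ 1 and n - k ≥ 2 so every factor is at least 2):

    import math

    # Verify  lower <= sum <= upper  for the bounds derived above:
    #   upper: (k+1) * log n                    (k+1 factors, each <= n)
    #   lower: (n/2) * log(n/2)  if k >= n/2,  else  k * log(n/2)
    def sum_of_logs(n, k):
        return sum(math.log(n - i) for i in range(k + 1))

    for n in (16, 1024, 10**5):
        for k in (1, n // 4, n // 2, n - 2):
            s = sum_of_logs(n, k)
            upper = (k + 1) * math.log(n)
            lower = (n / 2) * math.log(n / 2) if k >= n / 2 else k * math.log(n / 2)
            assert lower <= s <= upper, (n, k)
            print(f"n={n:>6}, k={k:>6}: {lower:9.1f} <= {s:9.1f} <= {upper:9.1f}")

All assertions pass on this grid, consistent with the O and Ω arguments above.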