WHY do logarithms grow slower than any polynomial? What is an (understandable) proof of this?
Similarly,
WHY do exponentials always grow faster than any polynomial?
Given two (nonnegative) real-valued functions f and g, you want to compute

lim_{x -> infinity} f(x) / g(x)

This limit is:

- 0 if and only if f grows slower than g,
- infinity if and only if f grows faster than g,
- c for some constant 0 < c < infinity if and only if f and g grow at the same rate.

Now you can take any examples you like and compute the limits to see which grows faster.
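For instance, here is a quick numerical sketch in Python (the example pairs, ln x versus x^2 and x^2 versus e^x, are just illustrative choices) that evaluates f(x)/g(x) at growing x so you can watch the ratio head toward 0:

import math

def ratios(f, g, xs):
    # evaluate f(x)/g(x) at each sample point
    return [f(x) / g(x) for x in xs]

# log vs. polynomial: ln(x)/x^2 -> 0, so ln x grows slower than x^2
print(ratios(math.log, lambda x: x**2, [10, 100, 1000, 10_000, 100_000]))

# polynomial vs. exponential: x^2/e^x -> 0, so x^2 grows slower than e^x
# (x kept modest so e^x stays within floating-point range)
print(ratios(lambda x: x**2, math.exp, [10, 50, 100, 300, 500]))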
You could consider the derivatives.
d(x^n)/dx = nx^(n-1)
d(ln x)/dx = 1/x
For n >= 1, nx^(n-1) increases with x (or stays constant when n = 1), whereas 1/x decreases with x, so the polynomial grows quicker.
The logarithm of e^x is x, whereas the logarithm of x^n is n ln x, so applying the above argument to compare these two logarithms (x versus n ln x), e^x grows quicker than x^n.
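If you want to check this concretely, a short sympy sketch (with a sample exponent n = 3; any fixed n >= 1 behaves the same way) evaluates both limits symbolically:

import sympy as sp

x = sp.symbols('x', positive=True)
n = 3  # sample exponent; the limits below are 0 for any fixed n >= 1

print(sp.limit(sp.log(x) / x**n, x, sp.oo))  # 0: ln x grows slower than x^n
print(sp.limit(x**n / sp.exp(x), x, sp.oo))  # 0: x^n grows slower than e^x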
EDIT: This answer is essentially doing what PengOne said.
We take the limit

lim_{x -> infinity} log_2(x) / x^p

for constant p > 0 and show that it is zero. Since both log_2(x) and x^p go to infinity as x grows without bound, we apply l'Hopital's rule, differentiating numerator and denominator. This means our limit is the same as the limit of

(1 / (x ln 2)) / (p x^(p-1))

Using simple rules of fractions, we reduce this to

1 / (p ln(2) x^p)

Since the denominator goes to infinity while the numerator is constant, we can evaluate the limit: it's zero, which means that log_2(x) grows asymptotically more slowly than x^p, regardless of the (positive) value of p.
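As a sanity check, the same limit can be evaluated with sympy for a sample value of p (here p = 1/10; the symbolic result is 0 for any fixed p > 0):

import sympy as sp

x = sp.symbols('x', positive=True)
p = sp.Rational(1, 10)  # sample exponent; any p > 0 gives the same limit

print(sp.limit(sp.log(x, 2) / x**p, x, sp.oo))  # prints 0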
EDIT2:
By the way, if you are interested in this question and the posted answers, consider showing support for the new Computer Science StackExchange site by following this link and committing to the movement:
http://area51.stackexchange.com/proposals/35636/computer-science-non-programming?referrer=rpnXA1_2BNYzXN85c5ibxQ2