I am trying to figure out how to use Python's timeit module, but I get vastly different timings between its timeit.timeit function and timeit.Timer.autorange():
import timeit
setup = """
def f():
x = "-".join(str(n) for n in range(100))
"""
def f():
x = "-".join(str(n) for n in range(100))
t = timeit.timeit("f()", setup=setup, number=100)
print(t)
num, timing = timeit.Timer(stmt='f()', globals=globals()).autorange()
per_run = timing / num
print(per_run * 1000)  # per-run time in ms
results in numbers like
0.0025681090000944096 # timeit.timeit
0.014390230550020533 # timeit.Timer.autorange
so the two approaches differ by an order of magnitude.
I am probably doing something wrong, but I have no idea what. The autorange documentation is rather sparse.
The result of timeit.timeit is the total runtime, not the time per iteration. You need to divide it by the number! Applied to your numbers: 0.0025681 s for 100 runs is about 25.7 µs per run, which is in the same ballpark as the 14.4 µs per run (0.01439 ms) you got from autorange; the remaining gap is the usual run-to-run variation you can expect with only 100 iterations. An example that reuses the number of iterations determined by autorange should show this nicely.
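Here is a minimal sketch of that comparison (exact timings will of course vary between machines and runs):

import timeit

def f():
    x = "-".join(str(n) for n in range(100))

timer = timeit.Timer("f()", globals=globals())

# autorange() returns (number_of_loops, total_seconds_for_those_loops)
num, total = timer.autorange()

# Timer.timeit() likewise returns the *total* time for `number` iterations
t = timer.timeit(number=num)

# dividing both totals by the iteration count yields comparable per-run times
print(f"autorange: {total / num * 1e6:.2f} µs per run over {num} loops")
print(f"timeit:    {t / num * 1e6:.2f} µs per run over {num} loops")

With the division in place, both figures report per-iteration time and should agree closely, unlike the total-versus-per-run comparison in the question.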