I'm messing around with file lookups in Python on a large hard disk. I've been looking at os.walk and glob. I usually use os.walk, as I find it neater and it seems to be quicker (for usual-size directories).
Has anyone got experience with both and could say which is more efficient? As I say, glob seems to be slower, but you can use wildcards etc., whereas with walk you have to filter the results yourself. Here is an example of looking up core dumps.
import os
import re

core = re.compile(r"core\.\d*")
for root, dirs, files in os.walk("/path/to/dir/"):
    for file in files:
        if core.search(file):
            path = os.path.join(root, file)
            print("Deleting: " + path)
            os.remove(path)
Or
import os
from glob import iglob

for file in iglob("/path/to/dir/core.*"):
    print("Deleting: " + file)
    os.remove(file)
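For what it's worth, you can also get glob-style wildcards inside an os.walk loop with fnmatch.filter, so no regex is needed; a minimal sketch, reusing the same placeholder /path/to/dir/ as above:

import fnmatch
import os

for root, dirs, files in os.walk("/path/to/dir/"):
    # fnmatch.filter applies a shell-style wildcard to the list of names
    for name in fnmatch.filter(files, "core.*"):
        path = os.path.join(root, name)
        print("Deleting: " + path)
        os.remove(path)

This keeps walk's recursive traversal while using the same pattern syntax as the glob version.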
I ran a benchmark on a small cache of web pages spread over 1000 directories. The task was to count the total number of files in the directories. The output is:
As you can see, os.listdir is the quickest of the three, and glob.glob is still quicker than os.walk for this task. The source:
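The source itself wasn't included above; a rough reconstruction of that kind of benchmark (the cache root path and the one-level directory layout are assumptions) could look like this:

import glob
import os
import timeit

PATH = "/path/to/cache"  # assumed root containing the 1000 dirs

def count_listdir():
    # assumes PATH holds only directories, each containing plain files
    return sum(len(os.listdir(os.path.join(PATH, d)))
               for d in os.listdir(PATH))

def count_glob():
    # wildcard match for every entry one level below PATH
    return len(glob.glob(os.path.join(PATH, "*", "*")))

def count_walk():
    # counts files only, at every depth
    return sum(len(files) for _, _, files in os.walk(PATH))

for fn in (count_listdir, count_glob, count_walk):
    print(fn.__name__, timeit.timeit(fn, number=10))

A plausible reading of the ranking: os.listdir is a single directory read per directory, glob adds pattern matching on top of listing, and os.walk additionally has to split each listing into subdirectories and files before yielding it.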