Python: linecache file size limit?


I have a pretty big .txt file. Each entry is on a new line. I am trying to access the file and iterate through each line to grab the entry. However, when I use linecache.getline('file_path', 1), I get an empty string, which, according to the Python docs, is how linecache signals an error. Is there a file size limit? I am trying to read a 1.2GB file. I am also fairly sure linecache is reading the whole file into memory before returning a line: RAM usage goes up by about the size of the file and then returns to normal. Am I doing anything wrong with linecache? Any suggestions other than linecache?


2 Answers

Answer by Chase

If you simply want to read a file line by line without loading it into memory, the file object returned by Python's built-in open() is itself a lazy iterator over lines, which is exactly what you need.

with open("filename", "r") as file:
    for line in file:
        # Do stuff

As long as you do your processing inside that for loop, you don't have to worry about memory: only one line is held at a time.

More info in the official docs.
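To reproduce the original linecache.getline('file_path', 1) call without loading the whole 1.2GB file, the same lazy iteration can be combined with itertools.islice to skip straight to one line. This is a minimal sketch; the get_line helper name is hypothetical, not part of any library:

    from itertools import islice

    def get_line(path, lineno):
        # Return line `lineno` (1-based, mirroring linecache.getline)
        # without reading the whole file into memory. Lines before
        # `lineno` are consumed lazily and discarded; only one line
        # is ever held at a time.
        with open(path, "r") as f:
            return next(islice(f, lineno - 1, lineno), "")

Like linecache.getline, this returns an empty string when the line number is past the end of the file.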

Answer by Shivam Seth

All you need to do is use the file object as an iterator.

for line in open("log.txt"):
    do_something_with(line)

Even better is using a context manager:

with open("log.txt") as fileobject:
    for line in fileobject:
        do_something_with(line)

This will automatically close the file as well.
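Putting the pieces together, here is a minimal sketch of the whole pattern. The do_something_with handler is the placeholder name from the answer above; stripping the newline and counting entries are assumed example behavior, not anything from the original question:

    def do_something_with(line):
        # Hypothetical handler: strip the trailing newline
        # before working with the entry.
        return line.rstrip("\n")

    def count_entries(path):
        # One pass over the file, one line in memory at a time;
        # the with-block closes the file automatically.
        count = 0
        with open(path) as fileobject:
            for line in fileobject:
                do_something_with(line)
                count += 1
        return count

This streams a multi-gigabyte file with roughly constant memory use, since only the current line is ever buffered.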