Need to read a text file larger than 5 GB in chunks to reduce processing time in C++

#include <iostream>
#include <fstream>   // needed for std::ifstream
#include <string>    // needed for std::string and std::getline
using namespace std;

int main() {
    string lineValue;
    ifstream myFile("file.txt");
    if (myFile.is_open()) {
        // Read and print the file one line at a time.
        while (getline(myFile, lineValue)) {
            cout << lineValue << '\n';
        }
        myFile.close();
    }
    else cout << "Unable to open file";
    return 0;
}
  1. I want to read the file in chunks rather than line by line.
  2. Each chunk that has been read should be handed off for parsing.
  3. In the meantime, the next chunk should already be read.

How should I do this?

1 Answer


A solution along these lines (reading a file line by line and passing the lines to worker threads) can be found under Message passing between threads using a command file. How much you gain, however, depends on how expensive the work done by the worker threads is: sometimes the standard library's file I/O buffering already compensates for the I/O latency, and a multi-threaded pipeline adds little. A minimal sketch of such a producer/consumer pipeline is shown below.
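
To illustrate what such a pipeline can look like, here is a minimal producer/consumer sketch, not taken from the linked answer: one thread reads the file in fixed-size binary chunks with std::ifstream::read and pushes them into a queue guarded by a mutex and condition variable, while the main thread pops chunks and "parses" them (here it merely counts newlines as a stand-in for real parsing). The file name "file.txt", the 1 MiB chunk size, and the single worker are arbitrary choices for the sketch.

// Sketch only: one reader thread, one parsing thread, unbounded chunk queue.
#include <condition_variable>
#include <cstddef>
#include <fstream>
#include <iostream>
#include <mutex>
#include <queue>
#include <string>
#include <thread>

int main() {
    constexpr std::size_t kChunkSize = 1 << 20;   // 1 MiB per chunk (tune as needed)
    std::queue<std::string> chunks;               // chunks waiting to be parsed
    std::mutex m;
    std::condition_variable cv;
    bool done = false;                            // set when the reader is finished

    // Reader (producer): pulls raw bytes from the file in kChunkSize blocks.
    std::thread reader([&] {
        std::ifstream file("file.txt", std::ios::binary);
        std::string buf(kChunkSize, '\0');
        while (file) {
            file.read(&buf[0], static_cast<std::streamsize>(kChunkSize));
            std::streamsize got = file.gcount();
            if (got <= 0) break;
            buf.resize(static_cast<std::size_t>(got));   // shrink the final partial chunk
            {
                std::lock_guard<std::mutex> lock(m);
                chunks.push(buf);
            }
            cv.notify_one();
            buf.resize(kChunkSize);                      // restore capacity for the next read
        }
        {
            std::lock_guard<std::mutex> lock(m);
            done = true;
        }
        cv.notify_one();
    });

    // Worker (consumer): parses chunks while the reader keeps reading.
    std::size_t lineCount = 0;
    for (;;) {
        std::string chunk;
        {
            std::unique_lock<std::mutex> lock(m);
            cv.wait(lock, [&] { return !chunks.empty() || done; });
            if (chunks.empty() && done) break;
            chunk = std::move(chunks.front());
            chunks.pop();
        }
        // Placeholder "parsing": count the newlines in this chunk.
        for (char c : chunk) {
            if (c == '\n') ++lineCount;
        }
    }

    reader.join();
    std::cout << "lines: " << lineCount << '\n';
    return 0;
}

Note that the queue in this sketch is unbounded, so a slow parser lets chunks pile up in memory; a real pipeline would cap the queue (for example, by making the reader wait while it is full). Also, a chunk boundary can split a line in two, so a parser that works on whole lines has to carry the trailing partial line over to the next chunk.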