NodeJS Split large pdf file into multiple files


I have an app where the user uploads a very large PDF file (around 1 GB). I need to split that file into multiple PDF documents (each two pages long) and map them to the associated user. I am using HummusJS to do the job, but the POST request takes way too long. Does anyone know a good alternative?

Here is my code:

const hummus = require('hummus');
const streams = require('memory-streams');
const PDFRStreamForBuffer = require('../pdf-stream-for-buffer.js');

const getPages = (buffer, offset = 0, numberOfPages = 2) => {
    //Creating a stream, so hummus pushes the result to it
    let outStream = new streams.WritableStream();
    //Using PDFStreamForResponse to be able to pass a writable stream
    let pdfWriter = hummus.createWriter(new hummus.PDFStreamForResponse(outStream));

    //Using our custom PDFRStreamForBuffer adapter so we are able to read from buffer
    let copyingContext = pdfWriter.createPDFCopyingContext(new PDFRStreamForBuffer(buffer));
    //Copy two pages starting at `offset`.
    copyingContext.appendPDFPageFromPDF(offset);
    copyingContext.appendPDFPageFromPDF(offset + 1);

    //We need to call this as per docs/lib examples
    pdfWriter.end();

    //Here is a nuance:
    //HummusJS does its work SYNCHRONOUSLY, so by this line everything
    //has been written to our stream and we can safely call .end() on it.
    outStream.end();

    //As we used 'memory-streams' and our stream has ended,
    //we can just grab the stream's content and return it.
    return outStream.toBuffer();
};
