For Context (these are also my assumptions about how memory, JSON, and JavaScript work):
- I make tons of files within `posts/*.mdx`.
- When I 'build' my static site, I use these `posts/*.mdx` files to generate JSON and expose a typesafe array called `allPosts` (I'm using Contentlayer for this, by the way).
- Since this `allPosts` is an array being loaded into memory, I assume it's going to get theoretically slower the more content is added. The upper limit is probably high, but it's still a limit.
- The build step shouldn't have a problem (it generates static pages at build time, and it's okay if that's slow).
- But I'm thinking that displaying a list or doing pagination might be difficult on an array that has (as an example) 50,000 objects:

```js
// I have a feeling that sticking to only this way to
// render/generate your list would be problematic.
allPosts.map((post) => ...)
```
- It would probably be even more problematic if I first need to `.sort()` or `.find()` only the articles I want to render. Since there's no database involved, I really have no ability to "not load" all the data into memory. I also have no way to query only the posts I want, or to get them in a specific order. I assume that whatever is defined inside JSON and JavaScript files is always loaded into memory when they're executed. (There's a sketch of what I mean just after this list.)
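To make that concrete, here's a minimal sketch of the pattern I'm considering: sort once at module load, then slice per page. This assumes each post has a `date` field in its schema; `POSTS_PER_PAGE` and `getPage` are just hypothetical names I made up for this example, not anything from Contentlayer:

```ts
// Contentlayer generates this typesafe array at build time.
import { allPosts } from 'contentlayer/generated';

const POSTS_PER_PAGE = 20; // hypothetical constant, not a Contentlayer API

// The sort runs once, when this module is first evaluated -- not on
// every render. The full array still lives in memory, though.
// (Assumes each post has a `date` field defined in the schema.)
const sortedPosts = [...allPosts].sort(
  (a, b) => new Date(b.date).getTime() - new Date(a.date).getTime()
);

// Hypothetical helper: slicing is O(pageSize), so rendering one page
// stays cheap even if the underlying array holds 50,000 entries.
function getPage(page: number) {
  const start = (page - 1) * POSTS_PER_PAGE;
  return sortedPosts.slice(start, start + POSTS_PER_PAGE);
}
```

My understanding is that the cost here is memory plus a one-time sort, not per-render work, but I'd like confirmation on that.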
My Questions:
I'm wondering if anyone has ever encountered this upper limit? Is there any possibility of loading only the first 20 items of an existing array into memory (kind of like doing pagination, but with a static array)?
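The closest thing I've come up with is doing the slicing inside the build step, so each generated page only embeds its own 20 posts. A sketch, assuming the Next.js pages router (the route, `POSTS_PER_PAGE`, and the props shape are my own choices, not from Contentlayer's docs; `Post`/`title` assume a document type named `Post` with a `title` field):

```tsx
// pages/blog/[page].tsx -- hypothetical paginated route.
import type { GetStaticPaths, GetStaticProps } from 'next';
import { allPosts, type Post } from 'contentlayer/generated';

const POSTS_PER_PAGE = 20;

// One static page per chunk of 20 posts: /blog/1, /blog/2, ...
export const getStaticPaths: GetStaticPaths = async () => ({
  paths: Array.from(
    { length: Math.ceil(allPosts.length / POSTS_PER_PAGE) },
    (_, i) => ({ params: { page: String(i + 1) } })
  ),
  fallback: false,
});

// Only this page's 20 posts get serialized into the page's JSON;
// the full allPosts array is only touched at build time.
export const getStaticProps: GetStaticProps<{ posts: Post[] }> = async ({
  params,
}) => {
  const page = Number(params?.page);
  const start = (page - 1) * POSTS_PER_PAGE;
  return { props: { posts: allPosts.slice(start, start + POSTS_PER_PAGE) } };
};

export default function BlogPage({ posts }: { posts: Post[] }) {
  return (
    <ul>
      {posts.map((post) => (
        <li key={post._id}>{post.title}</li>
      ))}
    </ul>
  );
}
```

My reading is that, with this, the client for any one page only ever receives 20 posts, while only the build process holds all 50,000 in memory. Is that right?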
I'm probably worrying about a problem I'll never encounter, but just out of curiosity, can anyone give insight on this? Do I stream it? Do I virtualize it (not too familiar with this yet)?
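On the virtualization idea, my rough understanding is that a library like react-window only mounts the rows currently scrolled into view, so the DOM stays small even when the array doesn't. A sketch of what I think that looks like (react-window is my own pick here, not something this stack requires, and the `height`/`itemSize` values are arbitrary):

```tsx
import { FixedSizeList } from 'react-window';
import { allPosts } from 'contentlayer/generated';

// react-window renders only the rows visible in the 600px viewport
// and recycles them on scroll, so 50,000 items never hit the DOM at
// once. The allPosts array itself is still fully in memory, though.
function PostList() {
  return (
    <FixedSizeList
      height={600}
      width="100%"
      itemCount={allPosts.length}
      itemSize={48}
    >
      {({ index, style }) => (
        <div style={style}>{allPosts[index].title}</div>
      )}
    </FixedSizeList>
  );
}
```

If I understand it correctly, that only addresses DOM/render cost, not memory, which is why I'm asking whether the in-memory array itself ever becomes the bottleneck.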