I'm developing a website backed by a static site generator that builds about 100K static HTML pages. My current workflow is to build the project on my local machine and then use an FTP tool to upload the output folder (about 40 GB) to a remote production server. This upload is a long and painful process that can take about 24 hours.
Is there a recommended way to set up a faster, more automated build & deployment process?
With an extremely large number of pages, build time is the main bottleneck of static-site generators. The solution is to defer generating some pages from build time to request time.
For example, you could statically generate only the 10,000 most-requested pages at build time to keep your build times low. Then, at request time, Next.js can generate the remaining static pages on demand using Incremental Static Regeneration (ISR).
Let's say a request comes in for one of the other 90,000 pages you haven't statically generated. Instead of being served a pre-built file, that first request hits the server, which fetches the data and generates the static page. The page is then cached, so the next visitor sees the static version (which is much faster than talking directly to the server).
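As a minimal sketch, the build-time/request-time split above maps onto `getStaticPaths` with `fallback: 'blocking'` in a dynamic route such as `pages/posts/[slug].js`. Note that `fetchTopSlugs` here is a hypothetical placeholder for however you rank pages by traffic, not a real Next.js API:

```javascript
// pages/posts/[slug].js — sketch of the build-time half of ISR.
// fetchTopSlugs is a stand-in for your own analytics/data source.
async function fetchTopSlugs(limit) {
  // Placeholder data; in reality this would query your backend.
  return ['hello-world', 'about'].slice(0, limit);
}

export async function getStaticPaths() {
  // Pre-render only the most-requested pages at build time.
  const slugs = await fetchTopSlugs(10000);
  return {
    paths: slugs.map((slug) => ({ params: { slug } })),
    // Any page not listed here is generated on its first request,
    // then cached and served statically afterwards.
    fallback: 'blocking',
  };
}
```

With `fallback: 'blocking'`, the first visitor to an un-generated page waits for server-side rendering once; everyone after that gets the cached static page.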
You can also invalidate the cache using the `revalidate` flag. For example, you could make your page fetch new information at most once a minute using `revalidate: 60`.
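The `revalidate` flag lives in the object returned by `getStaticProps`. A minimal sketch, where `fetchPost` is a hypothetical placeholder for your own data loading:

```javascript
// pages/posts/[slug].js — sketch of the data-fetching half of ISR.
// fetchPost is a stand-in for your own data source, not a Next.js API.
async function fetchPost(slug) {
  return { slug, title: `Post: ${slug}` }; // placeholder data
}

export async function getStaticProps({ params }) {
  const post = await fetchPost(params.slug);
  return {
    props: { post },
    // Re-generate this page in the background at most once every
    // 60 seconds; stale copies keep being served in the meantime.
    revalidate: 60,
  };
}
```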
Deploying a Next.js app to Vercel using this setup should reduce your build & deploy times to less than 10 minutes, while still producing a performant static site. Simply `git push` to your repository, and the GitHub integration will build and deploy your application for you. No more FTP!