I have one main Google Sheet that contains an approximately 10-column x 30-row worksheet. I have roughly 400 "client" Google Sheets (separate files) that pull in that 10 x 30 worksheet to use for data validation. Working in a school environment, the main sheet is readable/editable by a bunch of guidance counselors and academic deans, while the client sheets are accessible to individual students and their advisors (hence the multitude of sheets, to manage access).
The 10 x 30 worksheet is fairly static. It changes maybe once every couple of months, although yesterday it went through a series of rapid changes that precipitated my problem and thus my research. When it changes, those changes need to be reflected in all ~400 client sheets.
I had been using IMPORTRANGE(), but it stopped working yesterday. It failed not with the documented error about rate-limiting due to a large number of requests, but with one that suggests a total failure:
Error linking spreadsheets. The source spreadsheet has reached the maximum capacity for sharing and importing. Learn more
I read this as meaning "you've connected too many worksheets to this sheet, only the first n sheets will get data" (since that's what I'm seeing). But n is not well-defined, since this is a setup under development, and trashing (and then permanently deleting from the trash) old, disused client sheets changed nothing.
I experimented with dumping the 10 x 30 worksheet out to a CSV and using IMPORTDATA() to connect the client sheets to that, hoping this would trigger requests when the client sheets were opened rather than when the main sheet changed. But that also maxed out at around 20 connected client sheets. To get Google Sheets to import the CSV from a file hosted on a shared drive, I invoked IMPORTDATA() like this:
=IMPORTDATA("https://drive.google.com/uc?export=download&id=<FILE_ID>")
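The dump step itself can be scripted from the main sheet. Here's a minimal sketch, assuming a range name of "Validation!A1:J30" and a placeholder file ID (both are my own illustrations, not from the actual setup); the CSV serializer is deliberately simple:

```javascript
// Convert a 2-D array of cell values to CSV text. Quoting rules here are
// deliberately simple: quote any field containing a comma, quote, or newline.
function toCsv(rows) {
  return rows.map(function (row) {
    return row.map(function (cell) {
      var s = String(cell);
      return /[",\n]/.test(s) ? '"' + s.replace(/"/g, '""') + '"' : s;
    }).join(",");
  }).join("\n");
}

// Apps Script side (sketch): read the 10 x 30 range and overwrite the shared
// CSV file that IMPORTDATA() points at. The range and file ID are assumptions.
function exportValidationRange() {
  var values = SpreadsheetApp.getActiveSpreadsheet()
      .getRange("Validation!A1:J30").getValues();
  DriveApp.getFileById("<FILE_ID>").setContent(toCsv(values));
}
```

Running exportValidationRange() whenever the 10 x 30 block changes (or from an onEdit/time-driven trigger) keeps the CSV in sync without touching the client sheets.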
Having read the documentation from Google on the limitations of the IMPORTRANGE() function and on optimizing data references, and having searched the web, I'm still not clear on how these (not 100% clearly documented) limitations affect my situation. I'm thinking of scripting the data pull to the client sheets next.
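The scripted push contemplated above could look roughly like the sketch below. Everything here (the "Validation" sheet name, the target range, the client ID list, and the batch size) is an assumption for illustration; batching matters because a single run writing to ~400 files can exceed the Apps Script execution-time limit (roughly 6 minutes on consumer accounts):

```javascript
// Split the client ID list into batches so each run stays under the
// Apps Script execution-time limit. Batch size is a guess to tune.
function chunk(ids, size) {
  var out = [];
  for (var i = 0; i < ids.length; i += size) out.push(ids.slice(i, i + size));
  return out;
}

// Push the 10 x 30 validation block from the main sheet into a batch of
// client sheets. CLIENT_IDS and the range are placeholders.
var CLIENT_IDS = ["<CLIENT_ID_1>", "<CLIENT_ID_2>" /* ... ~400 IDs ... */];

function pushToClients(batch) {
  var values = SpreadsheetApp.getActiveSpreadsheet()
      .getRange("Validation!A1:J30").getValues();
  batch.forEach(function (id) {
    SpreadsheetApp.openById(id)
        .getRange("Validation!A1:J30")
        .setValues(values);
  });
}
```

A time-driven trigger could walk the output of chunk(CLIENT_IDS, 50) one batch per run, so no single execution does all 400 writes.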
Any links to documentation (or empirical evidence) would be appreciated.
Also impacted by this issue. I guess the fact that I use a special endpoint to programmatically grant access to my new files is the reason why there seems to be no going back, even after deleting a bunch of files for which this 'backdoor access' was granted: How to allow access for importrange function via apps script?
To work around this, I developed an architecture and scripts that duplicate my source files and reference the new file IDs where needed. By using a counter to anticipate the moment the limit is reached, I am able to avoid the periodic crash.
Not super happy with this solution, but it does what I need.
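A rough model of that counter logic, in case it helps. The limit value and all names here are mine, not measured or documented; in a real script duplicateFn would call DriveApp's File.makeCopy():

```javascript
// Track how many client sheets reference the current source copy, and
// rotate to a fresh duplicate before the (undocumented) limit is hit.
// IMPORT_LIMIT is an assumed safety threshold, not a documented number.
var IMPORT_LIMIT = 20;

function nextSourceId(currentId, importCount, duplicateFn) {
  if (importCount < IMPORT_LIMIT) {
    // Still under the threshold: keep pointing at the current copy.
    return { id: currentId, count: importCount + 1 };
  }
  // Threshold reached: duplicate the source and start the counter over.
  return { id: duplicateFn(currentId), count: 1 };
}
```

Each new client sheet then gets wired to whatever ID nextSourceId() returns, so no single source file ever accumulates enough importers to trip the error.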