We have a GA4 tag that is linked to a Google Ads tag.
We load the GA4 tag with the standard snippet, and then GTM automatically loads the Google Ads tag. This is fine, except GTM requests the Google Ads tag twice:
- https://www.googletagmanager.com/gtag/js?id=AW-...
- https://www.googletagmanager.com/gtag/js?id=AW-...&l=dataLayer&cx=c
I call my embed snippet only once, and I only load the GA4 tag explicitly, never the Ads tag.
I have opened both scripts and they're nearly identical: each is ~358 KB, and they differ by only two lines of code.
The duplicate script adds roughly 6% of unnecessary page weight, so I'd love to eliminate it to improve page speed.
Is there something I should do differently in my tag settings or embed code to stop GTM from firing the duplicate request?
It seems others have hit the same problem (see Google Tag Manager being loaded multiple times), but I haven't managed to find a solution.
This is my embed code:
// Loads the required Google Analytics 4 property script.
const { measurementID } = config;
const gaURL = `https://www.googletagmanager.com/gtag/js?id=${measurementID}`;

// loadScript injects the <script> tag and returns false if it was already
// on the page, so we bail out instead of configuring gtag twice.
const alreadyLoaded = !loadScript(gaURL, `gtag-global`);
if (alreadyLoaded) {
  return;
}

// Configure gtag: set up the dataLayer queue and the global gtag() shim.
window.dataLayer = window.dataLayer || [];
window.gtag = function gtag() {
  window.dataLayer.push(arguments);
};
window.gtag('js', new Date());
window.gtag('config', config.measurementID, {
  allow_google_signals: config.enableAdsTracking,
  groups: config.trackerName,
});
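(For reference, `loadScript` is a small helper of ours. I haven't pasted the real one, but a minimal sketch consistent with how it's called above, i.e. returning false when the script is already on the page, would be something like:)

```js
// Illustrative sketch only: injects an async <script> tag once, returning
// false if a script with the given id already exists on the page.
function loadScript(src, id) {
  if (document.getElementById(id)) {
    return false; // already loaded; caller treats this as "alreadyLoaded"
  }
  const script = document.createElement('script');
  script.src = src;
  script.id = id;
  script.async = true;
  document.head.appendChild(script);
  return true;
}
```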
You see, this is something Google came up with. It's not a bug, it's a feature.
The big idea is that Google can change the behavior of the library based on your account settings. So when you go into your GA4 account and set up, say, cross-domain tracking, GA4 effectively hardcodes those changes into the gtag library: it compiles a build of the library with your settings baked in.
Now when the gtag library needs to be "configured", it doesn't just pull a config from a remote endpoint; it downloads the whole compiled library with your cross-domain linking settings already in it.
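You can see this on your own page if you're curious. Run something like the following in the console (just a diagnostic sketch, nothing GA-specific assumed); you'll get one entry per gtag/js script GTM pulled down, and you can compare their query strings:

```js
// Diagnostic: list every gtag/js script on the page along with its query
// params, to see which tag IDs were requested and with which extras.
document.querySelectorAll('script[src*="googletagmanager.com/gtag/js"]')
  .forEach((script) => {
    const url = new URL(script.src);
    console.log(url.searchParams.get('id'), url.search);
  });
```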
It seems Google cared more about usability than performance here, though I really fail to see why the library has to be built on the remote endpoint rather than genuinely reconfigured on the front-end. Maybe this was an MVP and nobody bothered to fix it later? I don't know, but unfortunately we're stuck with it.
However, since it loads asynchronously, it doesn't really cause any measurable issues. It's just a bad practice that adds a drop to the ocean of the world's tech debt. But think about how it looks on Google's side: they now have to host billions of libraries. I doubt they manage to build them dynamically on every request; that would be just too expensive. And storing billions of files on their CDN isn't too expensive? Well, apparently not.