I'm working on a site that uses three.js for 3D content. I've made it work with relative paths, which loads much faster than pulling it from unpkg or another third-party CDN. However, when I try to measure the page speed with Google PageSpeed Insights, it fails with the NO_FCP error code. I have read that this is the response for slow websites that fail to load in under 15 seconds, but as anybody can check (at least in Europe), the page loads quite fast: https://adambernath.com/lotto/lotto.html
The only thing that differs here from the other pages I usually build is three.js; I have no other idea what could be causing this issue.
Low-probability answer
This may be caused by a current bug in Lighthouse where it tries to detect which libraries a page uses.
If you look in your server logs, you should see a request for https://adambernath.com/asset-manifest.json that currently returns a 500 error from your server.
If you change that to return a 404 instead, it may fix the issue.
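For instance, if the site happens to run on nginx (the actual server stack isn't stated in the question), a minimal sketch of that fix is a one-line location block that turns the failing probe into a clean 404:

```nginx
# Hypothetical nginx snippet: answer Lighthouse's probe for the
# (nonexistent) asset-manifest.json with a 404 instead of a 500.
location = /asset-manifest.json {
    return 404;
}
```

On Apache or another server the equivalent would be whatever mechanism maps that path to a plain "not found" response rather than an error handler that crashes.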
More likely answer
Chromium and Lighthouse are known to have problems with pages rendered entirely in WebGL.
Give it some time and it will probably get resolved. For now, I would fall back to recording a trace in the DevTools Performance tab, with Network throttling set to "Fast 3G" and CPU set to "4x slowdown".
This gives you the raw data that Lighthouse uses and lets you spot bottlenecks yourself. The obvious downside is that it is much harder to interpret, so you may need to do some googling!
Even then you may not get FCP and LCP (I had mixed results when I tested this for you), but those are somewhat arbitrary metrics; the Performance tab's screenshot timeline and CPU usage (long tasks) will tell you everything you need to know.
Performance-wise, my first reaction would be to serve your "fgoly.gltf" file with gzip or br (Brotli) compression: it currently does not appear to be compressed at all, and it represents about 70% of the page weight.
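As a minimal sketch (assuming a Unix shell with the gzip CLI available; sample.gltf below is a stand-in for your real fgoly.gltf), you can pre-compress the asset at build time and have the server send the .gz copy with a Content-Encoding: gzip header:

```shell
# Create a stand-in glTF file (glTF is JSON, so it compresses very well);
# in practice you would run gzip against your real fgoly.gltf instead.
yes '{"asset":{"version":"2.0"},"scenes":[{"nodes":[0]}]}' | head -n 500 > sample.gltf

# -9: best compression, -k: keep the original file next to the .gz copy
gzip -9 -k sample.gltf

# The compressed copy should be a small fraction of the original size.
ls -l sample.gltf sample.gltf.gz
```

On the server side, nginx's `gzip_static on;` (or Apache's mod_deflate for on-the-fly compression) will then serve the precompressed file automatically; Brotli usually shrinks glTF JSON further still, but requires a brotli module on the server.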