Over the last couple of days, I've noticed some odd behaviour in the Facebook Sharing Debugger (also used for Threads).
For instance, this URL refuses to show a preview in the debugger: https://www.hot-dinners.com/Features/Hot-Dinners-recommends/new-london-restaurants-opening-in-march-2024
The error that comes back is "URL returned a bad HTTP response code" (i.e. a 403 error).
It's happening with other pages too - some of which appeared to work in the last crawl. I talked to the host and they say that nothing has changed at their end. Can anyone work out if there's an issue with this page that I'm not seeing - or point me in a direction to get more info? Thanks!
Direct link to debugger: https://developers.facebook.com/tools/debug/?q=https%3A%2F%2Fwww.hot-dinners.com%2FFeatures%2FHot-Dinners-recommends%2Fnew-london-restaurants-opening-in-march-2024
The first step is to check your server logs - why are they returning a 403 for Facebook? Is the scraper's IP blocked? Is the bot blocked in your .htaccess file? Is it a ModSecurity rule that your host recently installed? Do you have a security plugin that might be causing it? If so, try disabling that plugin for a few minutes and re-scraping to see if that fixes it.
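A quick way to check for a user-agent based block is to request the page yourself once as Facebook's crawler and once as a normal browser. Here's a rough Python sketch - the URL is the one from your post, and the crawler string is the classic facebookexternalhit user agent Facebook publishes; treat the exact strings as adjustable.

```python
# Sketch: compare how the site responds to Facebook's crawler user agent
# vs. a regular browser user agent. Only standard library, no extras needed.
import urllib.request
import urllib.error

URL = ("https://www.hot-dinners.com/Features/Hot-Dinners-recommends/"
       "new-london-restaurants-opening-in-march-2024")

USER_AGENTS = {
    "facebook crawler": "facebookexternalhit/1.1 (+http://www.facebook.com/externalhit_uatext.php)",
    "regular browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
}

for label, ua in USER_AGENTS.items():
    req = urllib.request.Request(URL, headers={"User-Agent": ua})
    try:
        with urllib.request.urlopen(req, timeout=15) as resp:
            print(f"{label}: HTTP {resp.status}")
    except urllib.error.HTTPError as e:
        # A 403 here for the crawler UA only points at a user-agent block:
        # an .htaccess rule, a security plugin, or a ModSecurity rule.
        print(f"{label}: HTTP {e.code}")
```

If the crawler UA gets a 403 while the browser UA gets a 200, something on the server is filtering on user agent. Bear in mind this runs from your own IP, so it won't catch a block that targets Facebook's crawler IP ranges - that's where the logs come in.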
If you can't find the traffic anywhere in your logs, then it's likely being blocked at a higher level, possibly by your hosting platform's data centre firewall. It might also be a temporary hiccup somewhere that will resolve itself.
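To make that log check concrete, here's a rough sketch that scans a combined-format access log for facebookexternalhit hits and tallies the status codes. The log path is just a placeholder - your host will keep raw access logs somewhere else, so adjust it.

```python
# Sketch: count Facebook crawler requests in a raw access log by status code.
# Assumes the common combined log format; the path below is hypothetical.
import re
from collections import Counter

LOG_PATH = "/var/log/apache2/access.log"  # placeholder - use your host's raw access log
UA_PATTERN = re.compile(r"facebookexternalhit", re.IGNORECASE)
# Combined log format: ... "METHOD /path HTTP/x.x" STATUS SIZE "REFERER" "USER-AGENT"
STATUS_PATTERN = re.compile(r'"[A-Z]+ [^"]*" (\d{3}) ')

statuses = Counter()
with open(LOG_PATH, errors="replace") as log:
    for line in log:
        if UA_PATTERN.search(line):
            match = STATUS_PATTERN.search(line)
            if match:
                statuses[match.group(1)] += 1

if statuses:
    # e.g. {'403': 12} means the crawler's requests do reach the web server
    # and are being rejected there (rule, plugin, ModSecurity).
    print("Facebook crawler hits by status code:", dict(statuses))
else:
    # No crawler hits at all while the debugger was being run suggests the
    # traffic is dropped before it reaches the web server (firewall level).
    print("No Facebook crawler requests found in", LOG_PATH)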