How to fix "'noindex' detected in 'X-Robots-Tag' HTTP header" error for WordPress on an NGINX server?


I created a sitemap using the Rank Math plugin. When I added that sitemap URL to Search Console, it showed a "Couldn't fetch" error, so I used the URL Inspection tool to check my sitemap. It showed the URL is not on Google. After a live test, it showed "'noindex' detected in 'X-Robots-Tag' HTTP header".

See the attached image for the issue.


I want to fix this issue. If anybody knows how to remove the X-Robots-Tag noindex header from the sitemap's response, please help me fix it.


There is 1 best solution below


The "Couldn't fetch" issue in Google Search Console can happen for several reasons. I strongly suggest you visit the URL below to see if it resolves the issue for you: https://rankmath.com/kb/couldnt-fetch-error-google-search-console/

The "'noindex' detected in 'X-Robots-Tag' HTTP header" message you're seeing in the URL Inspection tool is logically correct, and you shouldn't be bothered by it in this case: the sitemap URL(s) are set to noindex intentionally. Sitemap URLs are not meant to be indexed by Google or shown on SERPs; rather, they serve as a mechanism to tell Google which of your site's URLs to crawl. Thus, you should ignore that error in the URL Inspection tool.

Please be aware that you should not submit your sitemap URLs to the URL Inspection tool; instead, submit them in the Sitemaps section of Google Search Console.
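One more thing worth checking, since the question mentions an NGINX server: if the noindex header were appearing on your regular pages rather than only on the sitemap, it could be coming from a server-level directive instead of the plugin. Below is a minimal, hypothetical sketch of what that looks like in an NGINX config (the `server_name` and the directives are illustrative, not taken from your setup; Rank Math itself sets the header for sitemap URLs via PHP, so this only applies when the header also shows up on normal pages):

```nginx
server {
    listen 80;
    server_name example.com;  # placeholder domain

    # A blanket directive like this would mark the WHOLE site noindex --
    # if something like it exists in your config, remove it or scope it:
    # add_header X-Robots-Tag "noindex, nofollow" always;

    # Scoped to sitemap files only (roughly what Rank Math achieves in PHP),
    # this is harmless and expected:
    location ~* sitemap.*\.xml$ {
        add_header X-Robots-Tag "noindex" always;
        try_files $uri $uri/ /index.php?$args;
    }
}
```

After editing the config, validate it with `nginx -t` and reload NGINX. You can check which URLs actually carry the header with, e.g., `curl -sI https://example.com/sitemap_index.xml | grep -i x-robots-tag` (replace `example.com` with your own domain): the header should appear on the sitemap but not on your posts and pages.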