I want to set a noindex X-Robots-Tag for a particular bad search engine that indexes even redirect pages instead of the final destination.
At the top of my root .htaccess file, I have added the rules below:
<IfModule mod_headers.c>
Header add X-Robots-Tag "BadBot: noindex"
</IfModule>
Here is what happens when requesting http://example.com/page:
SERVER RESPONSE: HTTP/1.1 301 Moved Permanently
Date: Mon, 17 Jul 2017 11:17:10 GMT
Content-Type: text/html; charset=iso-8859-1
Connection: keep-alive
Location: https://example.com/page
SERVER RESPONSE: HTTP/1.1 301 Moved Permanently
Date: Mon, 17 Jul 2017 11:17:11 GMT
Content-Type: text/html; charset=UTF-8
Connection: keep-alive
Location: https://www.example.com/page/
X-Robots-Tag: BadBot: noindex
SERVER RESPONSE: HTTP/1.1 200 OK
Date: Mon, 17 Jul 2017 11:17:13 GMT
Content-Type: text/html; charset=UTF-8
Connection: keep-alive
X-Robots-Tag: BadBot: noindex
Requested: http://example.com/page
Final: https://www.example.com/page/
The X-Robots-Tag header is missing on the first response, the redirect that forces HTTPS. Is there any way to tackle this issue?
Thanks
I think you made a small syntax error. Instead of
Header add X-Robots-Tag
it should be
Header set X-Robots-Tag
Reference: https://yoast.com/x-robots-tag-play/
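A minimal sketch of how the corrected block could look. Note that I have also added the `always` condition, which is not part of the original suggestion: mod_headers documents `always` as applying the header to all responses, including non-2xx ones such as the 301 redirects in your trace, whereas the default condition only covers successful responses.

```apache
<IfModule mod_headers.c>
    # "set" replaces any existing X-Robots-Tag value, while "add" can
    # produce duplicate headers if the tag is also set elsewhere.
    # "always" (my addition, see lead-in) applies the header to
    # redirect and error responses as well as 200s.
    Header always set X-Robots-Tag "BadBot: noindex"
</IfModule>
```

One caveat: if the HTTP-to-HTTPS redirect is issued by a separate vhost or config that never reads this .htaccess, the header will still be absent from that first 301 regardless of the directive used.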