Google URL Parameter tool - what to exclude?

247 Views

Situation: Site built on OpenCart, which uses faceted navigation. Problem: Google Webmaster Tools' "URL Parameters" tool reports a huge number of URLs with parameters like "sort", "order", "limit", "search", and "page".

I would like to exclude them, but I'm worried about 2 things:

1.) Maybe there's a better way to handle this issue? Exclusion directives in robots.txt? Something else? I.e. fixing the problem on the site, before Google detects it in the first place.

2.) I don't want to accidentally exclude actual content.
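For reference, exclusion directives in robots.txt for these parameters could be sketched like this (a hypothetical example; Google honors the non-standard `*` wildcard, but note that robots.txt only blocks crawling — URLs that are already indexed may remain in the index):

```
User-agent: *
# Block crawling of faceted/sorted variants (parameter names from the report above)
Disallow: /*?sort=
Disallow: /*&sort=
Disallow: /*?order=
Disallow: /*&order=
Disallow: /*?limit=
Disallow: /*&limit=
```

Both the `?param=` and `&param=` forms are listed because the parameter may appear first or later in the query string.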

So... anyone familiar with SEO and/or OpenCart, please give me a 2nd opinion on which of these parameters I should exclude, or change the settings for?

Thanks!

[screenshot of the URL Parameters report]


There is 1 answer below.


I'm not aware of a robots.txt option. But you might solve this using HTTP headers and/or elements in the HTML head.

  1. You could set <META NAME="ROBOTS" CONTENT="NOINDEX, FOLLOW"> in the HTML head of the duplicate pages (cf. http://www.robotstxt.org/meta.html).
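As a sketch, for a sorted listing URL that duplicates the base category page, the tag sits in the document head (the URL and page content here are hypothetical):

```html
<!-- Hypothetical duplicate page, e.g. /shoes?sort=price&order=asc -->
<!DOCTYPE html>
<html>
  <head>
    <!-- Keep this variant out of the index, but let robots follow its links -->
    <meta name="robots" content="noindex, follow">
    <title>Shoes, sorted by price</title>
  </head>
  <body>
    ...
  </body>
</html>
```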

  2. Another approach is to provide canonical URLs, either as a <link rel="canonical"> element in the HTML head or as an HTTP Link header (cf. https://yoast.com/rel-canonical/, https://support.google.com/webmasters/answer/139066 and https://support.google.com/webmasters/answer/1663744).
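A minimal sketch of the HTML variant, assuming a parameterized OpenCart category URL (domain and paths are hypothetical):

```html
<!-- In the <head> of https://example.com/shoes?sort=price&order=asc -->
<link rel="canonical" href="https://example.com/shoes">
```

The same hint can be sent as an HTTP response header instead, which also works for non-HTML resources: `Link: <https://example.com/shoes>; rel="canonical"`. Either way, Google consolidates the parameterized variants onto the canonical URL rather than treating them as separate pages.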