How to get certain pages to not be indexed by search engines?


I did:

<meta name="robots" content="none"/>

Is that the best way to go about it, or is there a better way?

2 Answers

BEST ANSWER

You could use a robots.txt file to tell search engines which pages they should not crawl and index.

http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=156449

Or add a noindex robots meta tag to each page you want kept out of the index, as sketched below.
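
A minimal sketch of that approach, placed in the <head> of each page you want excluded; note that content="none", as used in the question, is generally treated as shorthand for noindex, nofollow:

<!-- keep this page out of the index but still let crawlers follow its links -->
<meta name="robots" content="noindex"/>

<!-- roughly equivalent to content="none": do not index, do not follow links -->
<meta name="robots" content="noindex, nofollow"/>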

ANSWER

You can create a file called robots.txt in the root of your site. The format is this:

User-agent: (user agent string to match)
Disallow: (URL here)
Disallow: (other URL here)
...

Example:

User-agent: *
Disallow: /

This will make (the nice) robots not index anything on your site. Of course, some robots will completely ignore robots.txt, but for the ones that honor it, it's your best bet.

If you'd like more information on the robots.txt file, please see http://www.robotstxt.org/
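
Since the question is about excluding only certain pages rather than the whole site, here is a sketch of a robots.txt that blocks specific paths; the paths shown are hypothetical placeholders:

User-agent: *
# hypothetical paths; replace with the pages you want excluded
Disallow: /private/
Disallow: /drafts/secret-page.html

Anything not matched by a Disallow line is still crawlable as usual.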