I am a little confused about how the search bots are going to crawl my AJAX site.
http://www.example.com contains 3 links:
- #!/abc
- #!/xyz
- #!/123
I have the <meta name="fragment" content="!">
included in the head of my page, and I am using prerender.io to serve up a pre-rendered page, so the robots should visit those links using the following URLs:
- http://www.example.com?_escaped_fragment_=/abc
- http://www.example.com?_escaped_fragment_=/xyz
- http://www.example.com?_escaped_fragment_=/123
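For context, here is a simplified sketch of how the server side can detect the _escaped_fragment_ parameter and return a pre-rendered snapshot instead of the normal AJAX shell. This is illustrative only, not the actual prerender.io middleware, and the snapshot lookup is a made-up placeholder:

```typescript
import * as http from "http";

// Hypothetical pre-rendered snapshots, keyed by the fragment path.
const snapshots: Record<string, string> = {
  "/abc": "<html><body>rendered content for #!/abc</body></html>",
  "/xyz": "<html><body>rendered content for #!/xyz</body></html>",
  "/123": "<html><body>rendered content for #!/123</body></html>",
};

http.createServer((req, res) => {
  const url = new URL(req.url ?? "/", `http://${req.headers.host}`);
  const fragment = url.searchParams.get("_escaped_fragment_");

  if (fragment !== null && snapshots[fragment]) {
    // Crawler request: serve the static snapshot for that fragment.
    res.writeHead(200, { "Content-Type": "text/html" });
    res.end(snapshots[fragment]);
  } else {
    // Normal visitor: serve the regular AJAX shell (index.html in a real app).
    res.writeHead(200, { "Content-Type": "text/html" });
    res.end('<html><head><meta name="fragment" content="!"></head><body>app shell</body></html>');
  }
}).listen(3000);
```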
How will the robots behave when they crawl these pages? When they find new links on these pages, what will the URLs look like?
- #!/abc1 => http://www.example.com?_escaped_fragment_=/abc#!/abc1
- #!/abc2 => http://www.example.com/#!/abc/abc2
Here is Google's full AJAX crawling specification: https://developers.google.com/webmasters/ajax-crawling/docs/specification
The robots will see the #! and transform them into _escaped_fragment_ URLs, just like you mentioned. The <meta name="fragment" content="!"> tag is only necessary for pages that don't have the #!, for example your home page or any HTML5 pushState URLs. If the URL has a #!, Google will automatically ask for the _escaped_fragment_ version without checking for <meta name="fragment" content="!">.
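To make that concrete, here is a rough sketch (a hypothetical helper based on the spec above, not Google's actual code) of which URL a crawler requests for a given page:

```typescript
// Hypothetical helper showing which URL a crawler requests for a given page.
// NOTE: the spec also percent-encodes special characters in the fragment
// value; that step is omitted here to keep the output readable.
function crawlUrlFor(pageUrl: string, hasFragmentMeta: boolean): string {
  const url = new URL(pageUrl);
  const separator = url.search ? "&" : "?";

  if (url.hash.startsWith("#!")) {
    // A #! URL is always rewritten; no meta tag is needed.
    const fragment = url.hash.slice(2); // e.g. "/abc"
    url.hash = "";
    return url.toString() + separator + "_escaped_fragment_=" + fragment;
  }
  if (hasFragmentMeta) {
    // Pages without a #! opt in via <meta name="fragment" content="!">
    // and are fetched with an empty _escaped_fragment_ value.
    return url.toString() + separator + "_escaped_fragment_=";
  }
  return pageUrl; // nothing to rewrite
}

console.log(crawlUrlFor("http://www.example.com/#!/abc", false));
// -> http://www.example.com/?_escaped_fragment_=/abc
console.log(crawlUrlFor("http://www.example.com/", true));
// -> http://www.example.com/?_escaped_fragment_=
```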
If Google crawls http://example.com?_escaped_fragment_=/abc and finds a link on that page to /#!/xyz, it will make a separate request for http://example.com?_escaped_fragment_=/xyz. So you should always have your URLs link to the #! URL, never to an _escaped_fragment_ URL; Google will transform them on its own.
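Applying the same sketch, a #!/xyz link discovered on the /abc snapshot becomes its own crawl request:

```typescript
// The #!/xyz link found on ?_escaped_fragment_=/abc triggers a separate fetch
// (reusing the crawlUrlFor helper sketched earlier).
console.log(crawlUrlFor("http://www.example.com/#!/xyz", false));
// -> http://www.example.com/?_escaped_fragment_=/xyz
```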