How to make Apostrophe/Node.js content pages crawlable?


Apostrophe CMS is obviously JavaScript-based, so I'm wondering to what extent Apostrophe pages are "properly" indexable (i.e., "Ready for JavaScript Crawling & Indexing").

The reason I ask is this article, in which Moz tested a number of JavaScript-rendered content pages and found them lacking from an SEO perspective.

So can Apostrophe / Node.js pages be properly indexed by search engines?

Best answer:

I'm the lead developer of Apostrophe at P'unk Avenue.

The article you're referring to is talking about crawling URLs that exist only due to JavaScript in the browser, such as React apps with a JavaScript-based router in them.

The issue is that Google can't crawl such pages naively, just by fetching them from the server and parsing the links in the returned HTML. It has to either execute the JavaScript itself the way a browser would, or rely on workarounds in which the server does extra rendering work just for crawlers.
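A minimal sketch of the kind of page that article is describing (names and markup here are illustrative, not from any real app): the server returns only an empty shell, and all of the text and links exist only after browser-side JavaScript runs.

```javascript
// Sketch of a client-rendered "shell" page. The initial HTML a crawler
// receives contains no content and no links -- everything is filled in
// later by the bundled JavaScript router.
function renderShell() {
  return [
    '<!DOCTYPE html>',
    '<html>',
    '<head><title>App</title></head>',
    '<body>',
    '<div id="root"></div>',              // empty: no text, no links yet
    '<script src="/bundle.js"></script>', // router and content live here
    '</body>',
    '</html>'
  ].join('\n');
}
```

A crawler that doesn't execute JavaScript sees only that empty `<div>`, which is exactly the problem Moz measured.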

Apostrophe is powered by Node.js, which is JavaScript running on the server. The pages it sends out are basically indistinguishable from pages generated by any other CMS or even a static folder full of webpages — the server sends perfectly valid HTML that doesn't require any browser-side JavaScript at all in order to read all of the text and find all of the links. The fact that the server happens to have a JavaScript engine inside it is irrelevant to Google because it is receiving normal HTML from that server. That server could be running PHP or Ruby or Python or CrazyMadeUpLanguage... doesn't matter to Google as long as it's sending HTML to the world.