Vue.js applications vs. SEO. Is server-side rendering really necessary?


#1

I’m a bit confused.

I’m building an application where all the content comes from a REST API. The application must be crawlable. I’ve just read the Server-Side Rendering guide, and it says very clearly that content loaded asynchronously will not be visible to crawlers:

Google and Bing can index synchronous JavaScript applications just fine. Synchronous being the key word there. If your app starts with a loading spinner, then fetches content via Ajax, the crawler will not wait for you to finish.

This means if you have content fetched asynchronously on pages where SEO is important, SSR might be necessary.
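In other words, the problematic pattern looks something like this (a minimal sketch in Vue 2 style; the component and API endpoint are made up):

```js
import Vue from 'vue'

Vue.component('article-page', {
  data () {
    return { article: null }
  },
  // The initial render is just the loading spinner; the real content
  // only appears after this fetch resolves. A crawler that doesn't
  // wait for the Ajax call sees only the spinner.
  async mounted () {
    const res = await fetch('https://example.com/api/articles/1')
    this.article = await res.json()
  },
  template: `
    <div>
      <div v-if="!article">Loading…</div>
      <article v-else>
        <h1>{{ article.title }}</h1>
        <p>{{ article.body }}</p>
      </article>
    </div>
  `
})
```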

This used to be true in the past, but is it still?

Here’s a post from the official Google blog. It says:

(…) Times have changed. Today, as long as you’re not blocking Googlebot from crawling your JavaScript or CSS files, we are generally able to render and understand your web pages like modern browsers.

and

(…) In general, websites shouldn’t pre-render pages only for Google – we expect that you might pre-render pages for performance benefits for users and that you would follow progressive enhancement guidelines.

So is it really true that pages which load their content via AJAX are not crawlable? Judging by the Google blog post, I would expect that it’s no longer an issue.


#2

good question +1


#3

I did some tests in Search Console using the Fetch as Google feature:

  • Pages rendered with JS are visible to Google when they’re rendered synchronously (as the documentation says)
  • I wasn’t able to test whether routes are visible to Google when the router is in hash mode. Google ignores the hash and renders the home page instead. That doesn’t mean such routes aren’t indexed; it just means I couldn’t perform the tests.
  • Routes in history mode are visible to Google (see the router sketch after this list)
  • When page content is fetched asynchronously (for example, from a REST API), Google Search Console returns an Error status. That alone doesn’t mean the page isn’t indexed, but it doesn’t mean it’s indexed either. I’m afraid it’s not, though :confused:
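For anyone who wants to reproduce these tests, the difference between the two modes is just the router configuration. A minimal sketch (Vue Router 3 API; the routes here are made up for illustration):

```js
import Vue from 'vue'
import VueRouter from 'vue-router'

Vue.use(VueRouter)

// Hash mode (the default) produces URLs like /#/about;
// history mode produces clean URLs like /about.
const router = new VueRouter({
  mode: 'history',
  routes: [
    { path: '/', component: { template: '<h1>Home</h1>' } },
    { path: '/about', component: { template: '<h1>About</h1>' } }
  ]
})

new Vue({ router }).$mount('#app')
```

Keep in mind that history mode needs a server-side catch-all that serves index.html for unknown paths, otherwise deep links 404 on refresh.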

#4

Would like to hear about your progress on this. I’m going to be tackling this issue in a few weeks.


#5

I’ll do more tests and share the details here. So far I’ve relied on the Search Console preview, but I don’t know how reliable that tool is. Maybe the pages that caused errors in the preview are indexed anyway, and it’s only a preview-mode issue? Who knows? I’ll run real tests and check whether the pages actually get indexed by Google. If they don’t, we’ll be forced to use good old pre-rendering to make our site crawlable :confused:
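If it comes to that, one option would be prerender-spa-plugin in the webpack build. A sketch of its documented usage (the route list here is made up; you’d list every route that needs to be crawlable):

```js
// webpack.config.js (sketch): pre-renders the listed routes to static
// HTML at build time using prerender-spa-plugin.
const path = require('path')
const PrerenderSPAPlugin = require('prerender-spa-plugin')

module.exports = {
  // ...the rest of your existing webpack config
  plugins: [
    new PrerenderSPAPlugin({
      staticDir: path.join(__dirname, 'dist'),
      routes: ['/', '/about', '/contact']
    })
  ]
}
```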


#6

Hi wube, any further progress you can share with us?