How to make your JavaScript rendered website SEO friendly

Shivam Mittal · 29 Sep

This article is about two of the most dynamic domains in the web industry: JavaScript and SEO. While both have contributed significantly to the growth and advancement of the web, they don’t seem to go well together. The influx of JavaScript-based technologies like Angular, React, and jQuery has predominantly served to provide a richer user experience on the web, whereas Search Engine Optimization is known to lean more towards search engines than users. This has created a technology gap in the web industry, where it is evident that search technologies have not been able to keep pace with the growth in front-end technologies.


The problem with SPAs

 

The problem with single-page applications lies in the unconventional method of content rendering they adopt:

 

In traditional or multi-page applications (MPAs), each page's content and layout are rendered by the server (server-side rendering) and then sent to the client (browser) or search engine. No client-side JavaScript is needed to add content to the layout. Modern JavaScript websites, on the contrary, follow a different architecture: the server delivers the layout along with the content only on the first request. For each subsequent page request, the server only delivers raw data, as JSON or another data format, and the client is responsible for rendering the page by incorporating that raw data into the layout.

 

Thus, from search engines’ perspective, if the content can’t be properly rendered, only the initial HTML layout of the pages will get indexed!
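
A minimal sketch of what this client-side rendering looks like (the endpoint and markup are hypothetical): the server ships an almost empty HTML shell, and JavaScript fills it in after fetching JSON:

```javascript
// The server delivers only a shell such as <div id="app"></div>;
// everything a crawler would want to index is produced here, in the browser.
async function renderArticle(id) {
  const response = await fetch(`/api/articles/${id}`); // hypothetical JSON endpoint
  const article = await response.json();

  document.querySelector('#app').innerHTML = `
    <h1>${article.title}</h1>
    <p>${article.body}</p>
  `;
}

renderArticle(42);
```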


Google indexing and JS rendering

 

Search engines like Google are indeed aware of and concerned about this scenario, and have been upgrading their search policies, crawlers, indexes, and algorithms to accommodate complex JavaScript websites.

 

Google has been at the forefront of this progress, and has now declared its crawlers’ ability to crawl and index JavaScript content well. Still, the underlying reality that counts is that other search engines (which collectively form a substantial search base) are not on par with these advancements.

 

One such progressive move was introduced by Google in 2009: the AJAX crawling scheme. It required webmasters to create separate search-friendly versions of their AJAX-enabled pages and add an extra _escaped_fragment_= token to their URLs. Later, Google completely deprecated this crawling scheme (as clarified in this tweet), citing its crawlers’ ability to crawl and understand dynamic pages with their normal URLs.
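
For reference, under that (now retired) scheme a hashbang URL was mapped to an “ugly”, crawlable URL roughly like this (the URLs are illustrative):

```
Pretty URL served to users:     https://www.example.com/#!/products/42
URL requested by the crawler:   https://www.example.com/?_escaped_fragment_=/products/42
```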

 

Still, despite Google’s statements, proper rendering of dynamic content on the web remains an assumption which can’t be guaranteed. Google confirmed this through their official Webmasters Blog, saying:


Sometimes the JavaScript may be too complex or arcane for us to execute, in which case we can’t render the page fully and accurately.

 

Until April 2019, Google’s Web Rendering Service was still based on Chrome 41 released in Q1 2015! Compare it to the advancements in JavaScript and also with the latest version of Chrome at that time - Chrome 74! Fortunately, the next month Google announced the “Evergreen Googlebot”, i.e. Googlebot being regularly upgraded to the latest version of Chromium (85 at the time of this writing).


Adhering to SEO-friendly website structure

 

Even with JavaScript rendering, the searchability of a website’s content can be maintained at appreciable levels by following some effective tactics:


Degradation

 

If possible, make provisions for your JavaScript website to degrade gracefully to a simpler version in case the client (browser, or in the other case, search engine) is incapable of handling JavaScript, or if JavaScript has been disabled. Google and Facebook do the same for the sake of less capable devices and slower internet connections.
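
One way to approach this (a rough sketch; the element ID and the enhanceProductList() function are hypothetical) is to ship meaningful HTML first and let JavaScript merely enhance it when the client can actually run it:

```javascript
// The page is served with readable, crawlable HTML content.
// JavaScript only *enhances* it (live filtering, infinite scroll, etc.) when supported.
if ('fetch' in window && 'querySelector' in document) {
  enhanceProductList(document.querySelector('#product-list')); // hypothetical enhancement
}
// Clients that cannot run this script still get the plain HTML version of the list.
```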


Transpilation

 

Make sure your JavaScript code has cross-browser and universal compatibility. Babel and other transpilers can be used to transpile newer syntax (ECMAScript 2015/ES6 or later) into ES5 for older browsers and incompatible search engines.
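
A typical setup (a sketch assuming Babel with @babel/preset-env; the browser targets are illustrative) could look like this in babel.config.js:

```javascript
// babel.config.js
module.exports = {
  presets: [
    [
      '@babel/preset-env',
      {
        // Compile modern syntax down to what these (illustrative) targets understand, effectively ES5.
        targets: '> 0.25%, not dead, ie 11',
      },
    ],
  ],
};
```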


Access to resources

 

Make sure your website’s JavaScript and CSS files, images, and other dependent assets are not blocked from search engines by your robots.txt file.
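
For example, a robots.txt along these lines (the paths are illustrative) keeps rendering resources accessible to crawlers:

```
User-agent: *
# Keep the scripts, styles and images that pages need for rendering crawlable:
Allow: /assets/js/
Allow: /assets/css/
Allow: /assets/img/
# Avoid blanket rules such as "Disallow: /assets/" that would hide these resources.
```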


Scripts’ performance

 

Your scripts must load quickly. Just like the crawl budget, the time search engines allot to executing scripts is limited; you can’t expect them to wait indefinitely. Search engines may refuse to execute your JavaScript implementations if they are invalid or if they lack adequate performance.
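
As an illustration (module and element names are hypothetical), heavy, non-critical code can be split out and loaded on demand with a dynamic import, keeping the scripts crawlers must execute small and fast:

```javascript
// Only the lightweight rendering code ships up front;
// the heavy chat widget bundle is fetched on demand.
async function openChatWidget() {
  const { initChat } = await import('./chat-widget.js'); // hypothetical module
  initChat(document.querySelector('#chat'));
}

document.querySelector('#open-chat').addEventListener('click', openChatWidget);
```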


URL structure

 

Keeping the URL structure clean (and more static-like) helps with both search engine crawling and user experience. Though Google is now capable of crawling and indexing dynamic parameters, use the least number of dynamic parameters required for identifying pages. Use Parameter Handling in Google Search Console to tell Google which parameters to ignore; this also helps with canonicalization issues. It should be done with care, as marking important parameters (e.g. pagination parameters) as unimportant may result in the deindexation of important pages! Another related aspect is that any significant change in content should be accompanied by a change in the URL. Remember, search engines can only identify different pages (having different content) through different URLs.
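
For instance (a minimal sketch; the route, renderProduct() and loadProduct() are hypothetical), a single-page application can use the History API so that every distinct piece of content gets its own clean URL:

```javascript
// When the SPA swaps in new content, change the URL as well,
// preferring a clean, static-looking path over a pile of query parameters.
function showProduct(product) {
  document.querySelector('#content').innerHTML = renderProduct(product); // hypothetical renderer
  history.pushState({ id: product.id }, '', `/products/${product.slug}`);
}

// Keep URL and content in sync on back/forward navigation.
window.addEventListener('popstate', (event) => {
  if (event.state && event.state.id) loadProduct(event.state.id); // hypothetical loader
});
```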


Navigation

 

For internal navigation, use <a href=””> instead of onclick=”window.location=”. Search crawlers will only crawl through pages linked with the former method, and the same goes for the passing of link juice.
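
A single-page application can keep real, crawlable href attributes and still avoid full page reloads by intercepting clicks in its router (a rough sketch; render() is a hypothetical client-side renderer):

```javascript
// Links stay as plain <a href="/products/42"> in the markup, so crawlers can follow them;
// JavaScript merely intercepts the click for users who have it enabled.
document.addEventListener('click', (event) => {
  const link = event.target.closest('a[href^="/"]');
  if (!link) return;                                  // not an internal link
  event.preventDefault();                             // skip the full page reload
  history.pushState(null, '', link.getAttribute('href'));
  render(window.location.pathname);                   // hypothetical client-side renderer
});
```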


User interaction-based content

 

Content tied to user interaction, i.e. content that appears only on mouse hovers, clicks, page scrolls, form interactions, etc., and that is not naturally present within the DOM, will never be crawled! Search engines are not humans, but robots - and robots are too dumb to perform these actions.
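
A rough illustration of the difference (IDs, class names and the endpoint are hypothetical): content that already sits in the server-rendered DOM and is merely revealed on click can be indexed, while content fetched only after a click cannot:

```javascript
// Indexable: the answer is already in the DOM; the click only toggles its visibility.
document.querySelector('#faq-toggle').addEventListener('click', () => {
  document.querySelector('#faq-answer').classList.toggle('is-hidden');
});

// Not indexable: the answer only exists after a user clicks and a fetch completes.
document.querySelector('#faq-toggle-lazy').addEventListener('click', async () => {
  const response = await fetch('/api/faq/42');        // hypothetical endpoint
  document.querySelector('#faq-answer-lazy').innerHTML = await response.text();
});
```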


Canonical tags

 

Implement canonical tags in plain HTML or via HTTP headers (e.g. a Link header). Those injected dynamically by JavaScript are prone to being ignored altogether by search engines!
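
For example, a canonical hint can be emitted by the server, in the initial HTML or as an HTTP Link header, instead of being injected client-side (a sketch using Express; the domain and renderProductPage() are placeholders):

```javascript
const express = require('express');
const app = express();

app.get('/products/:slug', (req, res) => {
  // Send the canonical URL as an HTTP Link header, so it does not depend on JavaScript execution.
  res.set('Link', `<https://www.example.com/products/${req.params.slug}>; rel="canonical"`);
  res.send(renderProductPage(req.params.slug)); // hypothetical server-side renderer
});

app.listen(3000);
```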


Alternative content rendering approaches

 

If client-side rendering in your single-page application is still creating havoc with search engines, you may consider (after relevant analysis) implementing either of these two alternative rendering approaches:


Dynamic Rendering

 

Dynamic rendering, as the name implies, requires setting up a process that switches between client-side rendering and pre-rendering of content for different clients. Browsers and search engines that are incapable of, or reluctant to, execute JavaScript are served content that has been pre-rendered with the help of a pre-renderer, while clients clear of any hurdles are served the regular version of the website. As mentioned earlier, a similar approach is adopted by giants like Facebook and Google.
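
A bare-bones sketch of the idea, assuming an Express server, Node 18+ (for the global fetch) and an external pre-rendering service (the service URL and bot list are illustrative):

```javascript
const express = require('express');
const app = express();

// A non-exhaustive, illustrative list of crawler user agents.
const BOT_PATTERN = /googlebot|bingbot|yandex|duckduckbot|baiduspider|facebookexternalhit|twitterbot/i;

app.use(async (req, res, next) => {
  if (!BOT_PATTERN.test(req.headers['user-agent'] || '')) return next(); // humans get the normal SPA

  // Bots receive static HTML fetched from a pre-rendering service (hypothetical URL).
  const prerendered = await fetch(
    `https://prerender.example.com/render?url=https://www.example.com${req.originalUrl}`
  );
  res.send(await prerendered.text());
});

app.use(express.static('dist')); // the regular, client-side rendered application

app.listen(3000);
```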

 

On Jun 30, 2020, Bing officially recommended Dynamic rendering especially for big websites.

 

Google also supports this content rendering method; however, chances are that with advancements in search technologies, this approach may also become obsolete, as the AJAX crawling scheme did.


Isomorphic Rendering

 

Isomorphic applications or universal applications likewise aim to serve rendered content, but instead of relying on an external pre-renderer, the JavaScript is executed on the server itself, i.e. isomorphic code can run on both the client and the server. This is done by running JavaScript on the server with Node.js; React’s virtual DOM, for example, can be rendered on the server as well!
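
A minimal sketch of this with React and Express (the App component is a placeholder; a real setup would also hydrate the markup on the client):

```javascript
const express = require('express');
const React = require('react');
const { renderToString } = require('react-dom/server');

// Placeholder root component; a real application would import its actual one.
function App() {
  return React.createElement('h1', null, 'Hello from the server');
}

const app = express();

app.get('*', (req, res) => {
  // The same React code that runs in the browser is rendered to HTML here,
  // so crawlers receive fully rendered markup on the first request.
  const html = renderToString(React.createElement(App));
  res.send(`<!doctype html><html><body><div id="root">${html}</div></body></html>`);
});

app.listen(3000);
```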


The final word

 

Conforming to SEO with JavaScript-rendered web applications can be daunting. The decision is often perplexing and requires conscientious analysis: whether to choose a multi-page application for the sake of a better rapport with search engines, or to go for a single-page application using JavaScript rendering for an enhanced user experience. At Sarvay Xenia, we have Angular and React developers who have built modern single-page applications that have performed fairly well on search engines too. They have been able to achieve this through well-established coordination with our SEO experts. Do contact us if you are having SEO issues on your website or want to build a modern JavaScript web application that scores on Google just as well!