JavaScript and SEO. Friend or foe?

So, what is JavaScript and what’s it got to do with your site’s SEO?

Author

Davide Tien

Date

06.10.2022


First things first. JavaScript is a programming language which, alongside HTML and CSS, is used to build web pages.

JavaScript renders in a user’s browser, which has made it difficult for SEOs, like me, and for Google to crawl and understand. Search engines do not immediately render JavaScript when they make requests to the server, which is problematic because it affects how the information on your site is understood by search engines. Whilst improvements in technology mean that Google is now able to crawl and understand JavaScript, that doesn’t mean us SEOs can sit back and leave the legwork to Google.

JavaScript can be a confusing topic for a lot of SEOs, and as more and more websites are being built in JavaScript frameworks, this topic has significantly grown in importance. In this article, we’re going to talk through some JavaScript basics, how you can identify potential JavaScript issues, and best practice for ensuring JavaScript isn’t affecting your SEO.

JavaScript 101

The boxes below illustrate how Google processes URLs:

[Diagram: Google’s processing pipeline – crawl queue, crawler, processing, render queue, rendering, indexing and ranking]

Let’s break this down…

Crawl Queue – This is essentially a list of URLs that need to be crawled. The list gets updated as Googlebot discovers new URLs.

Crawler – Googlebot crawls the URLs within the queue and makes the HTML requests.

Processing – Googlebot crawls the contents of the page. New URLs found as part of this process are placed back in the crawl queue. At this point, Google also accesses all the signals placed within the HTML, looking at things like ‘noindex’ tags and canonicals to assess the indexing of the page. It is at this point that Google will look at the need to render the page based on how heavily the site relies on JavaScript. Those that need to be rendered get placed in a render queue.

Render Queue – Similar to the crawl queue, this is essentially Google’s queue of pages that need to be rendered.

Rendering – Google renders pages from the render queue. As it renders these into HTML, it passes them back to the HTML processing, where it will then again analyse the rendered HTML and place new URLs found into the crawl queue.

Indexing – Google analyses the page’s content to determine its relevance and places URLs into its index.

Ranking – Google matches search queries to pages in the index and ranks them in search results against all ranking metrics.
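The two-wave flow above can be sketched as a toy simulation. The page data, queue mechanics and naive link extraction below are illustrative assumptions for the sketch, not Google’s actual implementation:

```javascript
// Toy model of the crawl queue / render queue flow described above.
// A JavaScript-reliant page only reveals some links after rendering.
const pages = {
  '/':      { html: '<a href="/shop">', needsRender: true, rendered: '<a href="/blog">' },
  '/shop':  { html: '', needsRender: false, rendered: '' },
  '/blog':  { html: '', needsRender: false, rendered: '' },
};

function extractLinks(html) {
  // Naive href extraction, sufficient for the sketch
  return [...html.matchAll(/href="([^"]+)"/g)].map(m => m[1]);
}

function crawl(startUrl) {
  const crawlQueue = [startUrl];
  const renderQueue = [];
  const indexed = [];
  while (crawlQueue.length || renderQueue.length) {
    if (crawlQueue.length) {
      // HTML is processed with priority; rendering waits
      const url = crawlQueue.shift();
      if (indexed.includes(url)) continue;
      const page = pages[url];
      extractLinks(page.html).forEach(u => crawlQueue.push(u));
      if (page.needsRender) renderQueue.push(url);
      indexed.push(url);
    } else {
      // Only now are links hidden behind JavaScript discovered
      const url = renderQueue.shift();
      extractLinks(pages[url].rendered).forEach(u => {
        if (!indexed.includes(u)) crawlQueue.push(u);
      });
    }
  }
  return indexed;
}
```

Running `crawl('/')` discovers `/blog` last, because that link only exists in the rendered HTML – the delay the next section describes.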

It’s in the render queue and rendering stages where things can get complicated when JavaScript is involved. While Google can render JavaScript, this additional process can become particularly problematic for the following reasons:

  1. It’s an additional step Google needs to undertake before indexing and ranking your page.
  2. Google will crawl HTML with priority before rendering JavaScript.
  3. The rate at which Google renders is dependent on render budget.
  4. It takes longer for Google not only to index JavaScript-reliant pages, but also to pass link equity to your internal pages, making them inherently more difficult to rank.

How can I identify potential JavaScript issues for my SEO?

There are a few things you can do. You can check your pages in Search Console with the URL Inspection tool, or use the Mobile-Friendly Test. With either of these, you can compare what is visible in the browser against what is being rendered in the tool. Essentially, you can see what Google is seeing and crawling. Any difference signifies that there is an issue.

There are also browser extensions you can use, such as View Rendered Source, to visually compare the raw HTML delivered by the server against what has been loaded within the browser.
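That raw-versus-rendered comparison can also be scripted. A minimal sketch of the idea, assuming you have already copied both HTML versions into strings (the regex extraction is a simplification – a real audit would use a proper HTML parser):

```javascript
// Compare key SEO elements between the raw "view source" HTML and the
// rendered DOM. Regex-based extraction is an illustrative shortcut only.
function extractSeoElements(html) {
  const get = (re) => (html.match(re) || [null, null])[1];
  return {
    title: get(/<title>([^<]*)<\/title>/i),
    canonical: get(/<link[^>]+rel="canonical"[^>]+href="([^"]+)"/i),
    robots: get(/<meta[^>]+name="robots"[^>]+content="([^"]+)"/i),
  };
}

// Returns the names of elements that differ between the two versions
function diffSeoElements(rawHtml, renderedHtml) {
  const raw = extractSeoElements(rawHtml);
  const rendered = extractSeoElements(renderedHtml);
  return Object.keys(raw).filter(key => raw[key] !== rendered[key]);
}
```

For example, a raw source of `<title>Loading...</title>` against a rendered `<title>Blue Shoes | Shop</title>` would flag `title` – exactly the kind of difference that signals a JavaScript issue.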

What’s best practice to ensure JavaScript isn’t affecting my SEO?

HTML – It’s important to remember that Googlebot crawls HTML with priority. Delivering all the key SEO elements upon Google’s first wave of crawling is key to ensuring that indexing and ranking of the page is done as efficiently as possible. And if Google needs to render your site, make sure all key HTML elements are pre-loaded.

Meta Data – Ensure that elements such as meta titles and essential content are loaded and delivered upon the first wave of crawling.

Navigational Elements – Implement navigational elements so that they are also pre-loaded. Navigational contextual links not only help Google understand relevance, but also help trickle link equity and PageRank through the site. Waiting for Google to render will slow down the growth and ranking of key category pages which Google does not see within its first wave of crawling.

Directives – Check that JavaScript is not changing directives upon render. You want to avoid scenarios where you are delivering mixed signals to Google, for example a ‘noindex’ tag in the pre-rendered version and an ‘index’ tag in the rendered version.

One of the most important elements to keep unchanged in the rendered version is the canonical tag. John Mueller has previously stated that Google only fetches the pre-rendered version, while Martin Splitt has confirmed that mixed signals such as these will leave Google to guess which version should be honoured.

[Diagram: canonical tags differing between the pre-rendered and rendered versions of a page]
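A quick check for this mixed-signal scenario can be automated. This is a sketch under the same simplifying assumption as before – regex extraction stands in for a real HTML parse:

```javascript
// Detect conflicting indexing signals between pre-rendered and rendered HTML,
// covering the robots directive and canonical examples discussed above.
function robotsDirective(html) {
  const m = html.match(/<meta[^>]+name="robots"[^>]+content="([^"]+)"/i);
  return m ? m[1].toLowerCase() : 'index'; // no robots meta means indexable
}

function canonicalUrl(html) {
  const m = html.match(/rel="canonical"[^>]*href="([^"]+)"/i);
  return m ? m[1] : null;
}

function mixedSignals(rawHtml, renderedHtml) {
  const conflicts = [];
  if (robotsDirective(rawHtml).includes('noindex') !==
      robotsDirective(renderedHtml).includes('noindex')) {
    conflicts.push('robots');
  }
  if (canonicalUrl(rawHtml) !== canonicalUrl(renderedHtml)) {
    conflicts.push('canonical');
  }
  return conflicts; // empty array means the signals are consistent
}
```

Any non-empty result is a page where Google is left guessing which version to honour.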

What’s the ideal solution?

Server-side rendering is one of the most commonly adopted solutions in order to tackle JavaScript issues affecting SEO. This implementation renders webpages within the server before passing them to the browser and crawlers. Important SEO elements will load server side, resulting in Google crawling and processing all the key HTML elements within the first crawl. Google will essentially index your pages quicker without having to place your pages in rendering queues. It’s worth noting that such implementations will require a heavy input from your development team, something to bear in mind when planning out resources.

An alternative solution would be dynamic rendering. This practice provides users and bots different versions of the webpage. Crawlers are provided a rendered version of the webpage whilst users will load their version in the browser.
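The routing decision behind dynamic rendering can be sketched as a simple user-agent check. The bot list and function names below are illustrative; production setups typically hand the crawler request to a renderer service such as Rendertron or Prerender.io:

```javascript
// Dynamic rendering sketch: crawlers get a pre-rendered snapshot, while
// regular users load the client-side JavaScript app in their browser.
const BOT_PATTERNS = [/googlebot/i, /bingbot/i, /duckduckbot/i];

function isCrawler(userAgent) {
  return BOT_PATTERNS.some(re => re.test(userAgent || ''));
}

function chooseVersion(userAgent) {
  return isCrawler(userAgent) ? 'prerendered-snapshot' : 'client-side-app';
}
```

So a request identifying as Googlebot would be routed to the pre-rendered snapshot, while an ordinary browser user agent gets the normal client-side version.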

Google is always evolving, and there’s no doubt crawling and rendering technologies will develop further. However, as it currently stands, Google needs additional help in order to be able to fully understand what’s happening on your brand’s site. Otherwise, you could be risking not appearing for the searches that matter to your business.

