You may not know it, but much of the World Wide Web today is powered by JavaScript. Yet despite JavaScript being one of the most popular programming languages, search engine crawlers like Googlebot can still have a hard time rendering and indexing it.
This means that JavaScript has the potential to negatively impact your organic search visibility. In this guide, we’ll look at what JavaScript SEO is and why it’s important, then delve into auditing, finding, and fixing issues so that JavaScript-heavy websites can get all of their content crawled, seen, and indexed by crawlers such as Googlebot.
I’ve been working in technical SEO for approximately five years, and I will hold my hands up and say that for the first two to three years of my SEO career I was not confident in understanding, finding, and fixing JavaScript SEO issues, and that’s okay. However, over these last couple of years, and especially since I started my own technical SEO consultancy service, I have lived and breathed JavaScript SEO.
So my aim for this article is to inspire confidence in other SEOs and marketers who, like myself a few years ago, are not that confident in understanding JavaScript SEO.
Important – This guide aims to give a general overview of JavaScript issues for SEO and does not provide advanced-level advice or support. It is aimed solely at website owners, marketers, and SEO professionals wanting to better understand how JavaScript can impact a website’s organic visibility.
What is JavaScript SEO?
Simply put, JavaScript SEO is the part of technical SEO that makes JavaScript-heavy websites easier for search engines such as Google and Bing to crawl, render, and index.
Why is JavaScript so important for SEO?
When you have a JavaScript-heavy website, a few things can become problematic for SEO. These are:
- Client-side rendering of JavaScript, where only a bare HTML shell is loaded from the server and all of the JavaScript is executed within the browser (see the sketch after this list).
- The time it takes for client-side rendering to load all of the content on the page, which increases the overall page load time.
- Web crawlers like Googlebot having to execute client-side JavaScript to see the page content. Google will add this to a queue to render later.
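To make the first point concrete, below is a minimal sketch of a client-side rendered page. The endpoint and markup are hypothetical, but the pattern is what crawlers run into: until the script executes, the raw HTML contains none of the visible content.

```js
// Illustrative client-side rendering. The server sends little more
// than <div id="app"></div>; this script then builds the visible
// content in the browser. A crawler parsing only the raw HTML sees
// an empty shell until the JavaScript has been executed.
async function renderApp() {
  // Hypothetical API endpoint - stands in for whatever your app calls.
  const res = await fetch('/api/products');
  const products = await res.json();

  document.getElementById('app').innerHTML = products
    .map((p) => `<h2>${p.name}</h2><p>${p.description}</p>`)
    .join('');
}

renderApp();
```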
Understanding the basics of JavaScript SEO
We’ve already explained some of the basics of JavaScript SEO. You’ll know that client-side rendering can be a problem for SEO, but before we go into more detail on that, it’s best to explain how website crawlers like Googlebot handle the processing of JavaScript.
How Google Processes JavaScript
Google processes JavaScript in three phases. First and foremost, the URL has to be crawled, then the page has to be rendered, and finally it will be added to the index. Below is a visualization of the steps Google takes to process JavaScript.
Image source: Google Search Central
For phase one (crawling), once Google discovers a URL, that URL will be added to a crawl queue. The URL is then crawled and the HTML is parsed.
For phase two (rendering), the HTML is rendered. However, if there is also JavaScript to process as part of the page, it is added to a render queue before being rendered into HTML, where it will be fully processed alongside any previously rendered HTML.
The final phase is the indexing of that page into Google’s index. However, not every page is guaranteed to be indexed. Your page/content still needs to meet certain indexing criteria, like content quality for example.
One final note here is that Google’s documentation does state that “The page may stay on this queue for a few seconds, but it can take longer than that.”
How long depends on the available rendering resources, and crawl budget can play a part in this too. So there’s no guarantee that the JavaScript will sit in the render queue for only a few seconds.
Client-side vs server-side rendering
We spoke earlier about the issues that come with client-side rendering, but there are also other ways to render JavaScript. These are server-side rendering and dynamic rendering.
The key difference between client-side rendering and server-side rendering is where the JavaScript is parsed and rendered. With client-side rendering, this is done in the user’s browser on page load: a near-blank page loads first, and the content is then loaded in. With server-side rendering, this is done on the server, and everything is passed to the user’s browser as HTML, making the initial page load much faster.
If you’re developing a new website, you should carefully consider first whether client-side or server-side rendering will be best to meet your business needs and goals.
Popular front-end development frameworks like Angular, Ember.js, and Backbone all use client-side rendering, so client-side rendering is pretty common across the web.
However, front-end development frameworks like React, Angular, and Vue.js also offer built-in server-side rendering, so you have the option to start with server-side rendering from the offset in many cases.
Advantages of client-side rendering
One of the major advantages of client-side rendering is that once the page has loaded in your browser, the JavaScript no longer needs to be fetched and processed from scratch. So the second, third, fourth, and subsequent page loads are much quicker in the browser.
It’s also often a much cheaper option for developers and website owners, as there’s much less development time needed to implement this method.
Disadvantages of client-side rendering
The main disadvantage is the initial page load, which will likely take longer because the browser has to process the JavaScript first.
You also run the risk of users temporarily seeing an empty page, or empty elements on the page, while they wait for the content to load in.
Advantages of server-side rendering
The initial page load is much quicker, and all of the pages arrive as fully rendered HTML from the server, meaning it’s much friendlier for SEO.
Disadvantages of server-side rendering
The time and investment needed to enable server-side rendering can put some business owners off.
There are more HTTP requests, as each time you go to a new page the entire page is rendered from scratch, whereas with client-side rendering much of the work is cached in the browser after the initial page load.
There’s also the risk of server-side rendering struggling during extremely high server load pressures, so you’d want to make sure you have a robust server setup.
Dynamic rendering
There is then “dynamic rendering”, which is seen as a short-term solution to client-side rendering SEO issues. This essentially serves server-side rendered pages to bots and client-side rendered pages to users.
This gives the best of both worlds; however, Google’s own documentation on dynamic rendering strongly suggests it should only be a workaround and not a long-term solution.
Common JavaScript SEO issues
These are the most common JavaScript SEO issues that I have encountered. The list is not exhaustive, and other, less common JavaScript issues can occur.
JavaScript load time
When you first load a page and the JavaScript is rendered client-side, this will take longer than if the content had been rendered on the server.
Depending on the size of the JavaScript resources, this can take longer than Googlebot is prepared to wait. Although Google does not specify how long Googlebot will wait for the final HTTP request to come back, studies have shown that it waits about 5.5 seconds before taking a screenshot of your page content. If your JavaScript-loaded content has not fully rendered by then, you run the risk of Google not seeing all of the content on the page.
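As a purely illustrative sketch (the delay and content are hypothetical), this is the failure mode in code: content injected after the window described above may simply never be seen.

```js
// Illustrative only: content injected after a long delay. If Googlebot
// snapshots the page after roughly 5.5 seconds (per the studies cited
// above), anything rendered later than that risks never being indexed.
setTimeout(() => {
  const el = document.createElement('p');
  el.textContent = 'Important product copy that arrived too late.';
  document.body.appendChild(el);
}, 8000); // 8 seconds - well past the window described above
```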
Rendering issues
As we stated above, if you’re relying on JavaScript to load in content and Googlebot has timed out and moved on to the next page, you’re potentially losing out on content being seen by Google. This is extremely prevalent for:
- Using JavaScript to handle page content – If you’re loading in all or parts of the content on the page via client-side JavaScript rendering, you run the risk of some or all of that content not being seen by Google.
- Dynamically changing Metadata – If you’re injecting the Title Tag using client-side JavaScript rendering, then there’s a very good chance Google won’t be seeing important meta tags. Or if you’re dynamically changing it, Google will only see the original Title Tag.
- Dynamically changing Headings – As with Title Tags, the same can be said for Headings: if you dynamically change these using client-side rendering, you run the risk of Google only seeing the original headings.
I have seen all of these examples on e-commerce sites that inject dynamic content like title tags, headings, prices, and page content via client-side JavaScript rendering, leaving Google to crawl the base Title Tag, Headings, and content before they’re dynamically changed by the JavaScript. The sketch below shows the pattern.
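Here’s that pattern reduced to a hypothetical sketch (the endpoint, selectors, and placeholder values are mine, not from any specific site): the server ships generic metadata, then client-side JavaScript swaps in the real values after fetching data.

```js
// A crawler that doesn't execute (or times out on) this script will
// index only the generic placeholders the server originally sent.
async function updateMetadata() {
  // Hypothetical endpoint for this product page's data.
  const product = await (await fetch('/api/product/123')).json();

  document.title = `${product.name} | Example Store`;      // was "Loading..."
  document.querySelector('h1').textContent = product.name; // was a placeholder
  document
    .querySelector('meta[name="description"]')
    .setAttribute('content', product.summary);
}

updateMetadata();
```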
Crawling Issues
Crawling issues are very common for JavaScript-heavy websites. They usually come down to three things:
- Blocked resources – Blocking Google from crawling JavaScript by disallowing the URLs via robots.txt. Make sure you carefully check your robots.txt rules to ensure critical JavaScript files are not being blocked from website crawlers (see the example after this list).
- Using JavaScript to manage global navigation – Using client-side rendering to load in your global navigation (header nav) is a surefire way to make sure Google does not see important global links in your header. You’ll be devaluing your pages by doing so.
- Using JavaScript to manage filters – Likewise, injecting all of the filters for an e-commerce website via JavaScript, when those filters contain important internal links, means they won’t be seen and registered by some web crawlers.
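On the blocked-resources point, here’s a hypothetical robots.txt illustration (the paths are invented; check your own file for the real ones):

```txt
# Anti-pattern: this blocks the directory holding the JavaScript
# bundles, so Googlebot cannot render the page content at all.
User-agent: *
Disallow: /assets/js/

# The fix is to remove that rule, or explicitly allow the scripts:
# Allow: /assets/js/
```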
How to audit a website and find JavaScript SEO issues
There are two main ways to audit a website for JavaScript issues: on a URL-by-URL basis, where there are many free-to-use tools, or using a website crawler to audit the entire website, which relies on paid-for tools.
Free-to-use tools
Free-to-use tools are best for quick diagnosis on a URL-by-URL level, as well as for quickly and easily validating suspected JavaScript issues. These are my favourite free-to-use tools, and they’re also a great resource for small businesses and freelance SEO consultants.
- Use the inspect page option in Chrome to check the DOM vs the page source – Possibly the simplest method is to compare the page source vs the DOM (right-click, Inspect, then Elements). If you view the source, you’ll see exactly what the server sent.
Image – page source
Whereas if you inspect the element and look at the HTML (the DOM) within Chrome DevTools, you’ll see everything as it is now (after a full client-side render).
Image – Inspect element (DOM)
If they are identical, the page is server-side rendered; the less identical they are, the more client-side rendering is going on. This is a simple way to identify possible client-side rendering issues, and the console snippet below gives a quick way to make the comparison.
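If you’d rather not eyeball the two by hand, here’s a rough DevTools console snippet (an approximation: it re-fetches the URL, which may not be byte-identical to the original response):

```js
// Run in the Chrome DevTools console on the page you're checking.
// Compares the raw HTML the server sends against the current,
// JavaScript-rendered DOM. A large gap hints at heavy client-side
// rendering.
const rawHtml = await (await fetch(location.href)).text();
const renderedHtml = document.documentElement.outerHTML;
console.log('Raw HTML length:    ', rawHtml.length);
console.log('Rendered DOM length:', renderedHtml.length);
// copy() is a DevTools-only console utility: it puts the rendered
// DOM on your clipboard so you can diff it against view-source.
copy(renderedHtml);
```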
- Use Google Search Console’s URL inspection tool – If you have access to the website’s Google Search Console, use the URL inspection tool to see how Google has rendered the page/content. Simply run a live test of the URL and then click “view tested page”. From here you can see the rendered HTML and a screenshot of how Google sees the page. If there is content missing within the HTML and screenshot, you know that there is a JavaScript issue.
Image – Google Search Console URL inspection tool
- Use Google’s Rich Results testing tool – If you do not have access to the website’s Google Search Console, the Rich Results testing tool will give you the same result. Simply enter your URL, and once it has crawled the live page, click “view tested page” and you’ll get the same HTML and screenshot as you’d get in Google Search Console.
Image – Rich Results testing tool
- Free rendering tools from technicalseo.com – Technicalseo.com has two free rendering tools available, which can provide great insights into potential JavaScript SEO issues. These free tools are:
Fetch & Render – This mimics bots like Googlebot and fetches and renders the page, much like the Google Search Console URL inspection tool. However, it gives you further insights and information that the Google tools do not, including:
- Whether the URL was crawlable & indexable
- HTTP response time
- Rendered page response time
- Blocked resources
- JavaScript console messages
Pre-render testing – You can use this tool to see what content is being rendered by different user agents (e.g. Googlebot, Bingbot, etc.)
- JetOctopus’ free JavaScript SEO page analyzer tool – JetOctopus is a well-known crawler and log file analyzer tool for SEOs and marketers. It has a great free JavaScript page analyzer tool that allows you to compare HTML vs JS-rendered content.
Image – JetOctopus free JS analyzer tool
Paid tools
A paid tool is needed to get data across an entire website, whereas the free options above work on a URL-by-URL basis. The free options are great if you want to sense-check a particular URL or a small set of URLs, but if you need to do a sitewide JavaScript SEO audit, I’d recommend using a website crawling tool. Here are three tools that I’d recommend.
1. Screaming Frog
If you have a Screaming Frog license, you can run a full crawl of your site to find potential JavaScript issues. It’s also worth noting that Screaming Frog has a free option, but it’s limited to crawling a maximum of 500 URLs.
Screaming Frog is one of my favourite crawling tools, as it can do pretty much anything you want it to with regard to finding SEO issues. By default, Screaming Frog will crawl a website and only show you the HTML content in ‘text only’ mode. ‘Text only’ mode can hint that a website’s content relies entirely on client-side JavaScript rendering, as only the homepage will be crawled, missing all of the other pages.
The best way to utilise Screaming Frog is by enabling JavaScript rendering mode (‘Config > Spider > Rendering’), and crawling your website. Screaming Frog will then crawl both the original and rendered HTML pages.
Once the crawl has finished, go to the JavaScript tab, where you’ll be able to:
- Compare HTML-rendered content with JS-rendered content, including the number and percentage of word count change between the two.
- Find pages that have links rendered by JavaScript.
- Find pages where JavaScript is used to render Title Tags and Headings.
For a more comprehensive run-through of how to use Screaming Frog to crawl and identify JavaScript issues, I recommend reading their detailed tutorial.
2. Sitebulb
Sitebulb, another website crawling tool, allows you to see what HTML is pre-rendered.
To set up a crawl for JavaScript, you need to make sure your project has the ‘Chrome crawler’ enabled as the crawler type. Secondly, you can set the render timeout in the crawler settings, which defaults to 1 second. I’d recommend setting this to 5 or 6 seconds.
It’s not just the ability to view which HTML is pre-rendered: Sitebulb can also show you crawl map data for 1-second vs 5-second timeouts (as an example), the time it takes to fully load the page, and even a report on page data dynamically rendered from GTM, which is a common short-term fix for JavaScript rendering issues.
3. JetOctopus
Like the others, JetOctopus is a website crawler, and you can use it to identify JavaScript rendering issues. The main pro of using JetOctopus is its ability to compare HTML vs JavaScript-rendered content, as well as surface many other JS-specific issues, such as:
- JavaScript load time
- JS console errors
- Rendering issues
How to fix common JavaScript SEO issues
So we’ve covered everything from what JavaScript SEO is to how to audit for and find JavaScript problems, but how do we fix these issues?
Server Side Rendering
The choice of many SEOs is server-side rendering. In fact, Google’s documentation recommends server-side rendering as the best solution.
It’s recommended that you pair server-side rendering with HTML caching to significantly reduce the server render time and the TTFB to the browser. A good example of this would be to use a CDN like Cloudflare.
When done right, SSR can have major benefits for a JavaScript-heavy website’s organic visibility.
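As a rough sketch of the idea, here’s a minimal Node/Express server (an assumed stack; frameworks like Next.js or Nuxt do this for you) that renders full HTML on the server and sets a cache header a CDN such as Cloudflare can honour. The route and data helper are hypothetical:

```js
const express = require('express');
const app = express();

// Hypothetical data-access helper standing in for your real source.
const getProducts = async () => [
  { name: 'Widget', description: 'A widget.' },
];

app.get('/products', async (req, res) => {
  const products = await getProducts();
  const body = products
    .map((p) => `<h2>${p.name}</h2><p>${p.description}</p>`)
    .join('');

  // Let a CDN cache the rendered HTML at the edge for 5 minutes,
  // cutting render time and TTFB on repeat requests.
  res.set('Cache-Control', 'public, s-maxage=300');
  res.send(`<!doctype html>
<html><head><title>Products</title></head>
<body><main>${body}</main></body></html>`);
});

app.listen(3000);
```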
Dynamic Rendering
If server-side rendering is unlikely to get buy-in from your business or developers in the short term, dynamic rendering might be a good temporary solution.
Often, if you already have client-side rendering implemented, changing over to server-side rendering can mean a large investment in development time and resources.
Dynamic rendering is a workaround for situations where JavaScript-rendered content is not available to search engines. It essentially supplies a static HTML version of the page to search engine crawlers, and client-side rendering to users.
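Stripped to its core, the pattern looks something like the Express middleware below. This is a sketch under assumptions: the bot list is illustrative, and getPrerenderedHtml() is a hypothetical helper returning a cached snapshot; in practice you’d lean on maintained middleware (such as prerender.io’s packages) rather than rolling your own.

```js
const BOT_PATTERN = /googlebot|bingbot|yandex|duckduckbot|baiduspider/i;

function dynamicRendering(req, res, next) {
  if (BOT_PATTERN.test(req.get('user-agent') || '')) {
    // getPrerenderedHtml() is hypothetical: it would return a cached,
    // fully rendered HTML snapshot for this URL.
    return res.send(getPrerenderedHtml(req.originalUrl));
  }
  next(); // regular users fall through to the client-side rendered app
}

// Usage in an Express app: app.use(dynamicRendering);
```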
Prerendering
Another solution, which Google recommends, is to use prerendering, which is the practice of running a client-side application at build time to capture its initial state as static HTML.
The most straightforward approach would be to use an out-of-the-box service like prerender.io.
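If you’d rather see the mechanics than use a service, here’s a hypothetical build-time prerendering script using Puppeteer (my choice for illustration; the route list and output paths are invented). It loads each route in headless Chrome and writes the fully rendered HTML out as static files:

```js
const fs = require('fs');
const puppeteer = require('puppeteer');

const ROUTES = ['/', '/products', '/about']; // hypothetical route list

(async () => {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  fs.mkdirSync('./dist', { recursive: true });

  for (const route of ROUTES) {
    // Wait until the network is idle so client-side content has loaded.
    await page.goto(`http://localhost:3000${route}`, { waitUntil: 'networkidle0' });
    const html = await page.content(); // the DOM after client-side rendering
    const file = route === '/' ? '/index' : route;
    fs.writeFileSync(`./dist${file}.html`, html);
  }

  await browser.close();
})();
```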
What JavaScript rendering should you use?
So, in conclusion, should you use server-side or client-side rendering for your website? Well, I’ll use the classic SEO phrase “It depends”.
Client-side rendering is great for websites or applications where there’s a lot of interaction between pages, as after the initial load it leads to a faster browsing experience.
If your business is heavily reliant on organic search as a channel, or you want to grow that channel, then server-side rendering with HTML caching will probably be the most sensible option for you.
However, if you already have client-side rendering in place and a full move to server-side rendering isn’t feasible, then prerendering might be a sensible alternative.