JavaScript SEO

What is JavaScript (JS) SEO?
JavaScript SEO is a branch of technical SEO that focuses on making websites built with JavaScript more search-engine friendly. It generally deals with the following:
- Optimizing JavaScript-injected content for search engine crawling, rendering, and indexing.
- Preventing, diagnosing, and troubleshooting ranking problems for websites and Single Page Applications (SPAs) built with JavaScript frameworks like React, Angular, and Vue.
- Using linking best practices to ensure that web pages are discoverable by search engines.
- Improving page load times for pages that parse and execute JS code, providing a smooth user experience (UX).
Is JavaScript beneficial to SEO?
JavaScript is vital to the modern web, as it makes website development more scalable and maintainable. However, certain JavaScript implementations can have a negative impact on search engine visibility.
What impact does JavaScript have on SEO?
JavaScript can have an impact on the following on-page elements and SEO ranking factors:
- Rendered content
- Links
- Lazy-loaded images
- Page load times
- Metadata
What are JavaScript-powered websites?
When we talk about JavaScript sites, we’re not just talking about adding a layer of JS interactivity to HTML documents (for example, adding JS animations to a static web page). A site whose core or principal content is injected into the DOM via JavaScript is referred to as a JavaScript-powered website.
How do you know if a website is built with JavaScript?
Technology-profiling tools like BuiltWith or Wappalyzer make it easy to see whether a website is built with a JavaScript framework. You can also check for JS code using the browser’s “Inspect Element” or “View Source” functions. Popular JavaScript frameworks include Angular, React, and Vue.
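To make this concrete, here is a rough check (a sketch, assuming a modern browser console that supports top-level await): compare the raw HTML the server returns with the DOM after JavaScript has run. A large difference suggests that core content is injected by JS.

```javascript
// Run in the browser console on the page you want to inspect.
// Fetch the raw HTML as the server sends it (what "View Source" shows).
const response = await fetch(window.location.href);
const rawHtml = await response.text();

// Capture the DOM after JavaScript has executed (what "Inspect Element" shows).
const renderedHtml = document.documentElement.outerHTML;

console.log('Raw HTML length:', rawHtml.length);
console.log('Rendered DOM length:', renderedHtml.length);
```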
JavaScript SEO for core content
JavaScript frameworks like Angular, React, and Vue are used to create modern web apps. These frameworks let developers quickly build and scale interactive web applications.
When viewed in the browser, a site built with AngularJS, a popular framework created by Google, appears to be a standard web page: text, photos, and links are all visible as they should be. Its HTML document, on the other hand, is almost entirely devoid of content; the body of the page contains just an app-root element and a few script tags. This is because the single-page application’s core content is dynamically injected into the DOM using JavaScript. To put it another way, this app relies on JS to load important on-page content!
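To illustrate the pattern, here is a minimal, hypothetical sketch: the HTML body ships nearly empty, and the visible content only exists after JavaScript runs. The #app-root container and the injected content below are assumptions for the example.

```javascript
// The served HTML body contains little more than <div id="app-root"></div>
// and script tags; everything users see is injected at runtime.
document.addEventListener('DOMContentLoaded', () => {
  const root = document.querySelector('#app-root');
  root.innerHTML = `
    <h1>Welcome to Example Store</h1>
    <p>This heading and paragraph exist only after JavaScript executes.</p>
  `;
});
```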
Note: Any core content that is rendered for users but not for search engine bots can be seriously problematic! If search engines cannot fully crawl all of your content, your website could rank below your competitors.
JavaScript SEO for internal links
JavaScript can affect the crawlability of links in addition to dynamically injecting content into the DOM. Google finds new pages by crawling links found on other pages.
Google recommends linking pages with HTML anchor tags that include href attributes, and supplying descriptive anchor text for hyperlinks.
However, Google advises developers not to build links with other HTML elements, such as div or span, or with JS event handlers. Google’s official guidelines refer to these as “pseudo” links, and they are often not crawled.
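A short sketch contrasting the two patterns (the /products URL is hypothetical): the crawlable version is a real anchor with an href attribute, while the “pseudo” link relies on a click handler and exposes no URL for the crawler to follow.

```javascript
// Crawlable: Googlebot can discover /products by parsing this element.
const crawlable = document.createElement('a');
crawlable.href = '/products';
crawlable.textContent = 'Browse our products';

// Not reliably crawlable: no href attribute; navigation happens only in JS.
const pseudo = document.createElement('span');
pseudo.textContent = 'Browse our products';
pseudo.addEventListener('click', () => window.location.assign('/products'));

document.body.append(crawlable, pseudo);
```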
Despite these guidelines, third-party analysis suggests that Googlebot may be able to crawl some JavaScript links. Nonetheless, experienced practitioners generally consider keeping links as static HTML elements a best practice.
Note: If search engines can’t crawl and follow links to your important pages, those pages may miss out on valuable internal links. Internal links help search engines crawl your website efficiently and highlight its most significant pages. In the worst-case situation, if your internal links are implemented incorrectly, Google may not be able to find your new pages at all (outside of the XML sitemap).
JavaScript SEO for lazy-loading images
The crawlability of lazy-loaded images can also be affected by JavaScript. Googlebot supports lazy loading, but it does not scroll the way a human visitor would. Instead, Googlebot simply expands its virtual viewport when crawling page content. As a result, a scroll event listener is never triggered, and the content is never rendered.
The IntersectionObserver API fires a callback when any watched element becomes visible, as in the SEO-friendly sketch below. It is more flexible and robust than an on-scroll event listener, and it is compatible with the latest Googlebot: because Googlebot expands its viewport to see your content rather than scrolling, observed elements still intersect the viewport and the callback still fires.
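Here is a minimal lazy-loading sketch using IntersectionObserver. It assumes images carry their real source in a data-src attribute; the attribute name and markup are conventions for this example, not requirements.

```javascript
// Swap in each image's real source once it enters the (virtual) viewport.
const observer = new IntersectionObserver((entries, obs) => {
  for (const entry of entries) {
    if (!entry.isIntersecting) continue;
    const img = entry.target;
    img.src = img.dataset.src; // load the real image
    obs.unobserve(img);        // stop watching once it has loaded
  }
});

document.querySelectorAll('img[data-src]').forEach((img) => observer.observe(img));
```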
You can also use the browser’s native lazy-loading. Google Chrome supports this, but it should be noted that it is currently an experimental feature. In the worst-case situation, Googlebot will simply disregard the attribute, and all images will load immediately.
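For completeness, a sketch of the native approach (the image path and container are hypothetical): setting the loading property from a script is equivalent to writing loading="lazy" on the img tag in the markup.

```javascript
const img = document.createElement('img');
img.src = '/images/product-123.jpg';
img.loading = 'lazy'; // the browser defers offscreen images on its own
img.alt = 'Product photo';
document.querySelector('#product-grid').append(img);
```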
Note: Just as with other important content, it is critical to ensure that Google can see all of the content on a page, including images. On an e-commerce site with many rows of product listings, lazy-loading images can improve the experience for both human visitors and bots.
JavaScript SEO for page speed
JavaScript can influence page load times, and page speed is an official ranking factor in Google’s mobile-first indexing. This means that a slow page may have a negative impact on search rankings. What can developers do to avoid this?
- Minifying JavaScript
- Deferring non-critical JS until after the main content is rendered in the DOM (see the sketch after this list)
- Inlining critical JS
- Serving JS in smaller payloads
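As promised above, a sketch of the deferral technique: a heavy, non-critical widget is loaded with a dynamic import only after the main content has rendered. The module path and the initChatWidget function are hypothetical.

```javascript
// Wait for the page (including the main content) to finish loading,
// then fetch and initialize the non-critical widget.
window.addEventListener('load', async () => {
  const { initChatWidget } = await import('./chat-widget.js');
  initChatWidget();
});
```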
Note: A slow website creates a bad user experience for search engines, too. Google may defer rendering JavaScript to save resources, so it’s critical to ensure that any content served to clients is coded efficiently and delivered quickly to protect rankings.
JavaScript SEO for metadata
It’s worth noting that SPAs that use a router package like react-router or vue-router must take extra precautions to manage metadata, such as updating meta tags when switching between router views. An npm package like vue-meta or react-meta-tags is frequently used for this.
When users and bots follow links to URLs on a React website, they are not served separate static HTML files. Instead, the React components (such as headers, footers, and body content) mounted on the root /index.html page simply re-render to display different content. This is why such sites are called single-page applications.
Note: When building SPAs, it’s critical to use a package like React Helmet to ensure that visitors are served unique metadata for each page. Otherwise, search engines may crawl the same metadata for every page, or none at all!
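A sketch of per-route metadata with React Helmet, the package mentioned above (the ProductPage component, its props, and the site name are hypothetical):

```jsx
import React from 'react';
import { Helmet } from 'react-helmet';

// Each route renders its own <title> and meta description, so crawlers
// see unique metadata per page instead of the root index.html defaults.
function ProductPage({ product }) {
  return (
    <>
      <Helmet>
        <title>{`${product.name} | Example Store`}</title>
        <meta name="description" content={product.summary} />
      </Helmet>
      <h1>{product.name}</h1>
      <p>{product.summary}</p>
    </>
  );
}

export default ProductPage;
```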
JavaScript SEO for e-commerce
E-commerce websites are real-world examples of JavaScript-injected dynamic content. For example, online businesses frequently use JavaScript to load products onto category pages.
E-commerce websites can use JavaScript to dynamically update the products on their category pages. This makes sense given that their inventory is constantly changing due to sales. But is Google able to “see” your content if your JS files aren’t executed?
For e-commerce companies that rely on online conversions, having products that Google cannot index could be disastrous.
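A sketch of the pattern described above (the /api/products endpoint, the #product-grid container, and the product fields are all hypothetical): products are fetched and injected by JavaScript, so if the script never executes, the grid stays empty, and so does what a non-rendering crawler sees.

```javascript
async function renderCategory(category) {
  const res = await fetch(`/api/products?category=${encodeURIComponent(category)}`);
  const products = await res.json();

  // Without JS execution, this container would remain empty.
  const grid = document.querySelector('#product-grid');
  grid.innerHTML = products
    .map((p) => `<li><a href="/products/${p.slug}">${p.name}</a></li>`)
    .join('');
}

renderCategory('shoes');
```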
Key takeaways
The information above is intended to provide general best practices and insights into JavaScript SEO. JavaScript SEO, however, is a complex and nuanced field of study. For more JavaScript SEO fundamentals, browse Google’s official guidelines and troubleshooting guide.