- Using linking techniques to ensure that web pages are discoverable by search engines.
- Providing a smooth user experience (UX) by improving page loading speed on pages that parse and execute JS code.
- Rendered content
- Lazy-loaded images
- Page load times
- Metadata
Note: Any core content that is rendered to users but not to search engine bots is a serious problem. If search engines cannot fully crawl all of your content, your website could rank below your competition.
Google suggests using HTML anchor tags with href attributes to link pages, as well as supplying descriptive anchor text for the hyperlinks.
However, Google advises developers not to create links with other HTML elements, such as div or span, or with JS event handlers. Google's official guidelines refer to these as “pseudo” links, and they are often not crawled.
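For illustration, here is a minimal sketch contrasting a crawlable link with the kind of pseudo link described above (the URL and the goTo handler are placeholders):

```html
<!-- Crawlable: an HTML anchor with an href and descriptive anchor text -->
<a href="/products/running-shoes">Browse our running shoes</a>

<!-- Often not crawled: a div acting as a link via a JS event handler -->
<div onclick="goTo('/products/running-shoes')">Browse our running shoes</div>

<!-- Often not crawled: an anchor without a resolvable href -->
<a onclick="goTo('/products/running-shoes')">Browse our running shoes</a>
```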
Note: If search engines can’t crawl and follow links to your important pages, they may be missing out on valuable internal links. Internal links help search engines crawl your website properly and surface its most significant pages. In the worst case, if your internal links are implemented incorrectly, Google may not be able to discover your new pages at all (outside of the XML sitemap).
The IntersectionObserver API fires a callback when any watched element becomes visible, as shown in the sketch below. It’s more adaptable and robust than an on-scroll event listener, and it’s compatible with the latest Googlebot, which resizes its viewport to see your content and thereby triggers the callback.
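A minimal sketch of this pattern, assuming each image keeps its real source in a data-src attribute (the attribute name is a convention, not a requirement):

```js
// Observe every image that has a data-src placeholder
const images = document.querySelectorAll('img[data-src]');

const observer = new IntersectionObserver((entries, obs) => {
  entries.forEach((entry) => {
    if (entry.isIntersecting) {
      const img = entry.target;
      img.src = img.dataset.src; // swap in the real source once visible
      obs.unobserve(img);        // stop watching the loaded image
    }
  });
});

images.forEach((img) => observer.observe(img));
```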
You can also use the browser’s native lazy-loading. Google Chrome supports this, but it should be noted that, at the time of writing, it is still an experimental feature. In the worst-case situation, Googlebot will simply disregard the attribute and all images will load normally.
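The native approach is just an attribute on the img tag and needs no JavaScript at all (the file name below is a placeholder):

```html
<img src="product-photo.jpg" loading="lazy" alt="Product photo">
```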
Note: Just as with any other important content, it is critical to ensure that Google can see all of the content on a page, including images. On an e-commerce site with many rows of product listings, lazy-loading photos can improve the experience for both humans and bots.
- Deferring non-critical JS until after the main content is rendered in the DOM
- Inlining critical JS
- Serving JS in smaller payloads (see the sketch after this list)
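A minimal sketch of the first two techniques in plain HTML (the script file name is a placeholder); serving smaller payloads is usually handled by code splitting in your bundler:

```html
<head>
  <!-- Inline critical JS so it runs without an extra network request -->
  <script>
    document.documentElement.classList.add('js-enabled');
  </script>

  <!-- Defer non-critical JS until the document has been parsed -->
  <script src="analytics.js" defer></script>
</head>
```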
It’s worth noting that SPAs that use a router package like react-router or vue-router must take extra care to manage things like updating meta tags when switching between router views. An npm package like vue-meta or react-meta-tags is frequently used for this.
When users and bots follow links to URLs on a React website, they are not served separate static HTML files. Instead, the React components (such as headers, footers, and body text) mounted on the root index.html page are simply re-rendered to display different content. This is why they’re referred to as single-page applications.
Note: When building SPAs, it’s critical to use a package like React Helmet to ensure that each page exposes unique metadata to visitors. Otherwise, search engines may crawl the same metadata for every page, or none at all!
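A minimal sketch of per-route metadata with React Helmet (the ProductPage component, its props, and the store name are hypothetical):

```jsx
import React from 'react';
import { Helmet } from 'react-helmet';

// Each routed view renders its own title and description
function ProductPage({ product }) {
  return (
    <>
      <Helmet>
        <title>{`${product.name} | Example Store`}</title>
        <meta name="description" content={product.summary} />
      </Helmet>
      <h1>{product.name}</h1>
    </>
  );
}

export default ProductPage;
```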
Not having their products indexed by Google can be disastrous for e-commerce companies that rely on online conversions.