
Single Page Application SEO: Challenges and Best Practices

Chapter 2

Single-page applications (SPAs) are a growing trend in website and web app delivery, with significant pros and cons. On the positive side, well-designed SPAs can provide a fluid user experience while improving backend efficiency and streamlining development. However, they can present real problems with search engine optimization (SEO), as well as other factors that affect performance, technical complexity, analytics, and accessibility.

SPAs use JavaScript to render only the necessary parts of a website on the client side instead of reloading the entire webpage on each view change. Unfortunately, some search engine crawlers cannot execute JavaScript or index single-page applications, so that content never appears in search results. Crawlers for engines such as Google and Bing can render JavaScript, but pages that require JavaScript to render are indexed with a delay. Slow or incomplete indexing leads to poor visibility and low rankings, and companies must often compensate by dramatically increasing paid ad spending.

Fortunately, there are technical solutions that can solve these search problems, providing a best-of-both-worlds solution that delivers a great experience while preserving organic search and reducing paid ad spend.

This article discusses best practices and strategies for helping eCommerce store architects and developers improve the SEO of single-page applications.

Executive summary of single-page application SEO best practices

This table summarizes the article below, highlighting key single-page application SEO best practices.

Best practice | Description
Perform dynamic prerendering | Macrometa PhotonIQ Prerender converts JavaScript-heavy eCommerce sites into SEO-friendly static HTML sites.
Implement server-side rendering | This approach provides the crawler with fully rendered, indexable HTML content from the server.
Make views crawlable | Update the URL on each view change and treat views as standalone pages with unique URLs.
Update meta tags | Modify the view’s meta tags on each view change to reflect the current content, either client-side or with server-side rendering.
Provide context about your content | Use a sitemap and structured data to give web crawlers context about your website’s content.
Clean scripts and handle errors gracefully | Ensure that your JavaScript files are error-free and that server response codes match the view’s content. When an error occurs, redirect the user to a different view.
Optimize for mobile devices | Optimize loading speed for low-resource devices that take longer to parse JavaScript-intensive websites.
Utilize tools to help evaluate SEO | Test code changes to verify the efficacy of each SEO strategy.
Remember to get the basics right | Write quality content grounded in expertise or personal experience; serve your site over HTTPS to improve trustworthiness; earn links from credible external websites to build domain authority; include internal links to other relevant pages on the same domain to convey your site’s structure; and use a responsive design so your website appears in search results for mobile users.

Single-page applications

Single-page applications (SPAs) are web applications or website implementations that load only a single web document and then update the body content via JavaScript. When JavaScript loads new content, it’s called a “view.” Views change in response to user actions and do not require a complete reload of the site. Unfortunately, optimizing SEO for SPAs can be challenging because the search engine may not crawl and index the different “views.” In the following sections, we review techniques and best practices for single-page application SEO.

Perform dynamic prerendering

Prerendering solves the problem of a crawler being unable to appropriately interact with JavaScript code and index content hidden away in a client-side rendered view.

With a prerendering strategy, the content of a view is rendered before the user clicks or the search crawler indexes it. Search crawlers see a crawler-friendly, static HTML version that they can fully index. However, traditional prerendering techniques can suffer performance degradation, seriously harming search engine ranking. Cutting-edge techniques solve that problem by performing the prerendering with edge computing and caching the results at the edge. Macrometa’s PhotonIQ Prerender enables deep web crawling of your content and simulates user actions for improved indexing and SEO ranking.

For example, a product listing page on an eCommerce store often uses pagination to split a list of items into multiple pages, which avoids presenting the user with an overwhelming number of products. The image below shows that a user must click Next to view further product listings. Unfortunately, search crawlers may not index products beyond the first page because they cannot follow the Next button. PhotonIQ Prerender solves this through Synthetic Interactions, which simulate a user clicking Next so that the complete list of products is presented to the search crawler.

Example of splitting product listings into multiple pages (Source)
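
For illustration, the general idea behind dynamic prerendering can be sketched with a headless browser such as Puppeteer: bot requests receive prerendered static HTML, while regular users receive the normal SPA. This is only a simplified sketch of the technique, not how PhotonIQ Prerender is implemented; the user-agent list, origin URL, and routing below are assumptions, and an edge-based service adds caching and synthetic interactions on top of this basic pattern.

// Simplified dynamic rendering sketch (not Macrometa's implementation).
// Assumes an Express server and the puppeteer package are available.
const express = require('express');
const puppeteer = require('puppeteer');

const BOT_UA = /googlebot|bingbot|yandexbot|duckduckbot|baiduspider/i; // assumed list of crawler user agents

async function prerender(url) {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto(url, { waitUntil: 'networkidle0' }); // wait until the JavaScript-rendered content settles
  const html = await page.content(); // fully rendered, crawler-friendly HTML
  await browser.close();
  return html;
}

const app = express();
app.get('*', async (req, res, next) => {
  if (!BOT_UA.test(req.headers['user-agent'] || '')) return next(); // real users get the normal SPA
  const html = await prerender(`https://www.example.com${req.originalUrl}`); // placeholder origin
  res.send(html); // crawlers get static HTML they can index immediately
});
app.listen(3000);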

An eCommerce store that starts with a small user base might not notice the drawbacks of serving all of the website’s content from one centralized database. However, performance issues can arise as traffic grows and users around the world visit the site. This is another scenario where edge computing and edge caching shine. Macrometa’s GDN and PhotonIQ provide low-latency APIs and perform computation geographically closer to each user, offering a faster and more reliable connection to your website than a centralized database or content management system.

Implement server-side rendering

Server-side rendering (SSR) or dynamic rendering can positively impact your website’s SEO. In the server-side rendering strategy, the page's first load is generated on the server and sent back to the client as an indexable HTML page. Generating the HTML from the server means that the web crawler can index the rendered view like any standard page without the complications of executing JavaScript. Subsequent interactions with the page will be handled on the client side as usual in SPAs. Another benefit of server-side rendering is that the end user will get a viewable page more quickly.

Some frameworks have built-in SSR. For React, you can use the Next.js framework, whose App Router renders with server components by default.
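
If you are not using a framework with built-in SSR, a minimal hand-rolled sketch looks like the following. It assumes an Express server and an existing App root component exported from ./App; a real setup would also bundle and hydrate the client code.

// Minimal SSR sketch (assumes an Express server and an App component exported from ./App).
const express = require('express');
const React = require('react');
const { renderToString } = require('react-dom/server');
const App = require('./App');

const app = express();
app.get('/', (req, res) => {
  const markup = renderToString(React.createElement(App)); // fully rendered HTML for crawlers and the first paint
  res.send(`<!DOCTYPE html>
<html lang="en">
  <head><title>My SPA</title></head>
  <body>
    <div id="root">${markup}</div>
    <script src="/bundle.js"></script>
  </body>
</html>`); // the client bundle then hydrates the markup and takes over as a normal SPA
});
app.listen(3000);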

Make views crawlable

As mentioned earlier, SPAs render the dynamic parts of the website based on a concept called views. It might be tempting to append #hash fragments to the URL and then use JavaScript to load content based on hash changes. For example, a SPA might have a homepage at https://site.com/#home and a product page at https://site.com/#product. When a user clicks on the product page, the fragment identifier changes from home to product, and the content can be updated. Unfortunately, crawlers see hashed URLs as the same page.

<nav>
  <ul>
    <li><a href="#/products">Our products</a></li>
    <li><a href="#/services">Our services</a></li>
  </ul>
</nav>

Example of bad practice using hash fragments in href tags (source)

To help crawlers see the different views of your SPA as separate pages, use the History API, which modifies the URL without a full page reload and lets you give each view its own accessible URL.

<nav>
  <ul>
    <li><a href="/products">Our products</a></li>
    <li><a href="/services">Our services</a></li>
  </ul>
</nav>

Example of good practice using accessible URLs in href tags (source)

With JavaScript, user-click events on the different links can be intercepted, and the History API can be updated with the new view.

<script>
function goToPage(event) {
  event.preventDefault(); // Stop the browser from navigating to the destination URL.
  const hrefUrl = event.target.getAttribute('href');
  const pageToLoad = hrefUrl.slice(1); // Remove the leading slash
  // load() is assumed to be an app-specific function that returns the HTML for the requested view.
  document.getElementById('placeholder').innerHTML = load(pageToLoad);
  window.history.pushState({}, document.title, hrefUrl); // Update the URL and the browser history
}
// Enable client-side routing for all links on the page
document.querySelectorAll('a').forEach(link => link.addEventListener('click', goToPage));
</script>

Example of JavaScript being used to intercept view changes and update the History API (source)
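
The snippet above only handles link clicks. To keep the browser’s back and forward buttons working, you also need to listen for the popstate event and re-render the matching view (this sketch assumes the same hypothetical load() helper as above):

<script>
// Re-render the correct view when the user navigates with the back or forward button.
window.addEventListener('popstate', () => {
  const pageToLoad = window.location.pathname.slice(1); // e.g. "products" or "services"
  document.getElementById('placeholder').innerHTML = load(pageToLoad);
});
</script>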

Update meta tags

It is important to include the appropriate meta tags in each view so the crawler knows how to index it correctly. Meta tags are included in the head element of a page and look like this:

<meta name="..." content="..., ...">

Every time a view in a SPA changes, the meta tags should also change. However, the meta tags live in the page’s head element, which in a SPA is part of the static container or skeleton of the page, so they are not updated automatically when a new view is rendered.

You can dynamically update the container's meta tags and other elements with JavaScript. For example, this JavaScript code changes the description meta tag and page title:

document.querySelector('meta[name="description"]').content = "Description for this view";

document.title = "New View Title";

The trick is to run this code on view changes. There are two ways to handle this: server-side (with SSR) or client-side.

In React, for example, you can use a third-party library like React Helmet and update the meta tags client-side as follows:

import React from "react";
import { Helmet } from "react-helmet";

class Application extends React.Component {
  render() {
    return (
      <div className="application">
        <Helmet>
          <meta charSet="utf-8" />
          <title>My Title</title>
          <link rel="canonical" href="http://mysite.com/example" />
        </Helmet>
        ...
      </div>
    );
  }
}

To use React Helmet, add the Helmet component to the view and include the meta tags you wish to change as child elements. React Helmet will automatically replace the meta tags in your website's head tag.

To update the meta tags from the server, the create-react-app team suggests adding placeholders to your code, such as the OG_TITLE and OG_DESCRIPTION properties in the following code:

<!DOCTYPE html>
<html lang="en">
  <head>
    <meta property="og:title" content="__OG_TITLE__" />
    <meta property="og:description" content="__OG_DESCRIPTION__" />
  </head>
</html>

On the server, replace the placeholders with the correct values for the requested page before sending the HTML to the client.
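
As a minimal sketch of that replacement, an Express route could read the built index.html and substitute the placeholders before responding (the route, file path, and values below are assumptions for illustration):

// Sketch: fill the Open Graph placeholders on the server before sending the page.
const express = require('express');
const fs = require('fs');
const path = require('path');

const app = express();
app.get('/product/:id', (req, res) => {
  const template = fs.readFileSync(path.join(__dirname, 'build', 'index.html'), 'utf8');
  const page = template
    .replace('__OG_TITLE__', `Product ${req.params.id}`)          // hypothetical title, e.g. looked up in your catalog
    .replace('__OG_DESCRIPTION__', 'Short product description');  // hypothetical description
  res.send(page);
});
app.listen(3000);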

If you prefer injecting data from the server, you can set a client-side global variable from the server like this:

<!doctype html>
<html lang="en">
  <head>
    <script>
      window.SERVER_DATA = __SERVER_DATA__;
    </script>

In the code above, the server replaces __SERVER_DATA__ with JSON-encoded data that the client code can then read from window.SERVER_DATA.

Reminder: Always sanitize data sent from the server to protect yourself from XSS attacks.
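
One common mitigation, shown here only as a sketch and not a complete defense, is to escape the < character when serializing the data so that injected content cannot close the script tag early (data and template are placeholders for your page data and HTML template):

// Escape "<" so the embedded JSON cannot terminate the <script> block early.
const serverData = JSON.stringify(data).replace(/</g, '\\u003c');
const html = template.replace('__SERVER_DATA__', serverData);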

Google provides a guide for common meta tags here: https://developers.google.com/search/docs/crawling-indexing/special-tags

Some of the most common ones are shown below.

<title>Page Title</title>
<meta charset="utf-8"><!-- HTML5 version of http-equiv="Content-Type"... -->
<meta name="description" content="Description of website">
<meta name="keywords" content="keywords">
<link rel="author" href="<<Link to author's website>>" />
<link rel="canonical" href="<<Page URL>>" />

Social media platforms use the Open Graph protocol when links are shared. Open Graph was created by Facebook to control how content is presented and shared on social media. When you share a link and it expands into a thumbnail, a title, and a short description of the post, that is the Open Graph protocol in action.

Facebook uses these tags:

<meta property="og:url" content="<<Page URL>>">
<meta property="og:image" content="<<Open Graph Image URL>>">
<meta property="og:description" content="<<Description>>">
<meta property="og:title" content="<<Title>>">
<meta property="og:site_name" content="<<Site Title>>">
<meta property="og:see_also" content="<<Home page URL>>">

Twitter/X uses:

<meta name="twitter:card" content="summary">
<meta name="twitter:url" content="<<Page URL>>">
<meta name="twitter:title" content="<<Title>>">
<meta name="twitter:description" content="<<Description>>">
<meta name="twitter:image" content="<<Image URL>>">


Provide context about your content

Sitemaps

A sitemap is an XML document with all of your website's URLs that helps web crawlers understand your site’s structure and content. It also allows bots to detect new content that needs to be indexed. The larger your website, the more necessary a sitemap is to guide the crawlers throughout it accurately.

There are three ways to create a sitemap file with Next.js: manually, programmatically, and dynamically.

To make a sitemap manually, create a sitemap.xml in the public directory of your website and add the following text:

<?xml version="1.0" encoding="UTF-8"?>
<!-- public/sitemap.xml -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/foo</loc>
    <lastmod>2021-06-01</lastmod>
  </url>
</urlset>

Add a <url> element for each of your site’s views.

To create a sitemap programmatically, add a sitemap file (for example, app/sitemap.ts in the Next.js App Router) that exports a default function returning an array of type MetadataRoute.Sitemap, like so:

import type { MetadataRoute } from 'next';

export default function sitemap(): MetadataRoute.Sitemap {
  return [
    {
      url: 'https://acme.com',
      lastModified: new Date(),
      changeFrequency: 'yearly',
      priority: 1,
    },
    {
      url: 'https://acme.com/about',
      lastModified: new Date(),
      changeFrequency: 'monthly',
      priority: 0.8,
    },
    {
      url: 'https://acme.com/blog',
      lastModified: new Date(),
      changeFrequency: 'weekly',
      priority: 0.5,
    },
  ];
}

Each item in the array is an object that describes each route.

You can also build a sitemap dynamically based on a server or database response. For example, say you have a database table of blog posts, each with its own URL. You might generate a sitemap like this:

import type { MetadataRoute } from 'next';
import { BASE_URL } from '@/app/lib/constants';
// Note: getPosts (used below) is assumed to be an app-specific data-access helper defined elsewhere.

export async function generateSitemaps() {
  // Fetch the total number of posts and calculate the number of sitemaps needed
  return [{ id: 0 }, { id: 1 }, { id: 2 }, { id: 3 }];
}

export default async function sitemap({
  id,
}: {
  id: number;
}): Promise<MetadataRoute.Sitemap> {
  // Google's limit is 50,000 URLs per sitemap
  const start = id * 50000;
  const end = start + 50000;
  const posts = await getPosts(
    `SELECT id, date FROM posts WHERE id BETWEEN ${start} AND ${end}`
  );
  return posts.map((post) => ({
    url: `${BASE_URL}/posts/${post.id}`,
    lastModified: post.date,
  }));
}

Example of generating a sitemap dynamically (source)
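
Whichever method you choose, crawlers still need to discover the sitemap. A common approach is to reference it from robots.txt, which Next.js can also generate from an app/robots.ts file; the URL below is a placeholder:

import type { MetadataRoute } from 'next';

// app/robots.ts: generates robots.txt and points crawlers to the sitemap.
export default function robots(): MetadataRoute.Robots {
  return {
    rules: { userAgent: '*', allow: '/' },
    sitemap: 'https://www.example.com/sitemap.xml',
  };
}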

Structured data

You can help Google and other crawlers understand and display your website's content using structured data, which is data in a standardized format that provides important information about your page. For a single-page application in React, for example, Google provides a package to facilitate the inclusion of structured data in your views. First, install the package and the schema:

npm install schema-dts
npm install react-schemaorg

Then insert a snippet like this:

import { Person } from "schema-dts";
import { JsonLd } from "react-schemaorg";

export function GraceHopper() {
  return (
    <JsonLd<Person>
      item={{
        "@context": "https://schema.org",
        "@type": "Person",
        name: "Grace Hopper",
        alternateName: "Grace Brewster Murray Hopper",
        alumniOf: {
          "@type": "CollegeOrUniversity",
          name: ["Yale University", "Vassar College"],
        },
        knowsAbout: ["Compilers", "Computer Science"],
      }}
    />
  );
}

If you are using Next.js, you can even use next/head to include the structured data in the head element of the page:

import { Person } from "schema-dts";
import { jsonLdScriptProps } from "react-schemaorg";
import Head from "next/head";

export default function MyPage() {
  return (
    <Head>
      <script
        {...jsonLdScriptProps<Person>({
          "@context": "https://schema.org",
          "@type": "Person",
          name: "Grace Hopper",
          alternateName: "Grace Brewster Murray Hopper",
          alumniOf: {
            "@type": "CollegeOrUniversity",
            name: ["Yale University", "Vassar College"],
          },
          knowsAbout: ["Compilers", "Computer Science"],
        })}
      />
    </Head>
  );
}

For more information, visit the official GitHub package page.


Clean scripts and handle errors gracefully

The JavaScript parsers that some web crawlers use do not tolerate JavaScript errors. A single error can prevent the entire page from being rendered for the bot, which is detrimental to your SEO. Ensuring there are no syntax or runtime errors, and wrapping code that might throw errors beyond your control in a try...catch block, are great ways to protect your website from not being indexed.

try {
  nonExistentFunction();
} catch (error) {
  console.error(error);
  // Expected output: ReferenceError: nonExistentFunction is not defined
  // (Note: the exact output may be browser-dependent)
}

Handling errors gracefully is also essential so that any server response matches the content displayed on the view. In such cases, JavaScript can redirect the user to a different view. It’s critical that the user never feel lost or be left waiting on a blank view with no indication that an error has occurred.

The two most popular ways to redirect in plain JavaScript are:

// Simulate a mouse click:
window.location.href = "http://www.mywebsite.com";

// Simulate an HTTP redirect:
window.location.replace("http://www.mywebsite.com");

Remember that the back button will not work as intended when using replace() because it removes the current URL from the session history, so the user cannot navigate back to it.
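
As a small illustration of graceful error handling, a view loader can fall back to a dedicated error view when the content request fails; the /api/views endpoint and /error view below are hypothetical:

<script>
// Sketch: never leave the user on a blank view when loading content fails.
async function loadView(view) {
  try {
    const response = await fetch(`/api/views/${view}`); // hypothetical endpoint returning the view's HTML
    if (!response.ok) throw new Error(`Server responded with ${response.status}`);
    document.getElementById('placeholder').innerHTML = await response.text();
  } catch (error) {
    console.error(error);
    window.location.replace('/error'); // hypothetical error view explaining what went wrong
  }
}
</script>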

Optimize for mobile devices

Even though most users browse the web on mobile devices, it’s easy to get carried away with a website’s content and treat mobile users as an afterthought. Mobile devices have limited resources compared to desktops or laptops, so JavaScript-intensive websites can be slow to load and sluggish in responding to user interactions. Given the plethora of mobile screen resolutions and hardware specifications, a dynamic approach to mobile optimization is required.

Macrometa’s PhotonIQ Performance Proxy (P3) leverages AI to optimize your website’s HTML, CSS, and JavaScript across multiple devices. It works without modifying application code and is compatible with various frameworks and technologies. Performance Proxy enables faster loading and execution and improves your website’s core web vitals. It also offloads JavaScript to the Edge to enhance speed.

Utilize tools that help evaluate SEO

As with any website optimization, testing the results and measuring improvements is essential. Developers can use some available tools and tests to verify whether their efforts have improved the website's SEO. For further information, please see our guide on eCommerce SEO tools.

URL inspection in Google Search Console

The URL inspection feature in Google Search Console is an excellent tool for reviewing how a website is indexed and any errors encountered by the crawler when attempting to index it. It also provides tips and suggestions for improvement from one of the most significant search engines.

Google’s Lighthouse

Within Google Chrome’s developer tools, Lighthouse can run many audits, including performance, SEO, and best-practices tests. Lighthouse is convenient to use during the development of your eCommerce store because it runs directly in Chrome DevTools. A Lighthouse 10 report includes the metric scores shown in the following table.

Metric | Description | Weight
First Contentful Paint | FCP measures how long it takes the browser to render any part of the page's content on the screen. | 10%
Speed Index | Speed Index measures how quickly content is visually displayed above the fold. | 10%
Largest Contentful Paint | LCP reports the render time of the viewport's largest image, text block, or video element. | 25%
Total Blocking Time | TBT measures the time after First Contentful Paint (FCP) during which the page is blocked from reacting to user input. | 30%
Cumulative Layout Shift | CLS measures how much a page unexpectedly shifts during its entire lifecycle. | 25%

Summary of Lighthouse 10 audit metrics (source)

Lighthouse uses the results from each metric and weighting to provide an overall performance score. Even though Lighthouse provides an overall performance score, Google recommends considering your site performance as a distribution of scores. This is because no single metric is sufficient to capture all of a page's performance characteristics. Similarly, no single tool is sufficient for analyzing SEO performance, and we provide a guide on eight categories of essential eCommerce SEO tools.

Lighthouse available performance reports (source)
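
Besides running in Chrome DevTools, Lighthouse can also be run from the command line or in a CI pipeline, which makes it easy to track scores over time (the URL below is a placeholder):

npx lighthouse https://www.example.com --only-categories=performance,seo --output=html --output-path=./report.html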

Remember to get the basics right

Even if you use all the strategies discussed in this article, it’s essential to remember the basics of healthy SEO.

Content quality

Providing content people want to read and share is one of the most basic principles of gaining organic traffic and a higher SEO rank. Authoritative content written by an author with genuine expertise or personal experience, well-written and well-researched, tends to do better than articles that simply pack in as many keywords as possible. If your eCommerce store also has a blog where you discuss uses of the products you sell, high-quality content will benefit both the blog posts and the entire website’s ranking.

HTTPS

Serving a website securely over the HTTPS protocol signals trustworthiness to users and search engines and helps it appear for more users, increasing its authority. To load securely, the website needs a valid SSL certificate, and the server must be configured correctly so that any insecure connections are routed to the HTTPS address. You can confirm the configuration when your URL starts with https:// and the lock icon appears in the browser, indicating a secure connection.

Example of a secure HTTPS connection for google.com
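
As a minimal sketch of that routing on an Express app running behind a proxy or load balancer (an assumption for this example), insecure requests can be redirected to the HTTPS address:

// Sketch: redirect HTTP requests to HTTPS (assumes a proxy that sets X-Forwarded-Proto).
const express = require('express');
const app = express();
app.enable('trust proxy'); // trust the proxy's X-Forwarded-Proto header
app.use((req, res, next) => {
  if (req.secure) return next(); // already HTTPS
  res.redirect(301, `https://${req.headers.host}${req.originalUrl}`);
});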

Building backlinks to improve domain authority

Backlinks are links on other websites that point to pages on your site. They tell crawlers how relevant and authoritative your website is for a specific topic or keyword. Writing high-quality content encourages other websites to link to yours, and links from credible external websites signal to crawlers that your site has domain authority. See the section on backlink tracking tools in our guide on eight categories of essential eCommerce SEO tools for more information on monitoring and analyzing the links that point to your eCommerce website.

Using responsive design

In web development, it is best practice to start a project from a mobile standpoint and work up to desktop users, prioritizing small screens over large ones rather than the other way around.

A responsive website design ensures that your website appears in mobile device search results. Some UI frameworks, like Bootstrap and TailwindCSS, are responsive out of the box, meaning their elements will not overflow the screen, cause unnecessary scrollbars, render text too small to read, or leave links unclickable. Another consideration is handling touch events so that touch-screen interactions behave like their mouse-pointer counterparts.
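
At a minimum, a responsive page declares a viewport and adapts its layout with media queries. The snippet below is a simple illustration using a hypothetical product-grid class:

<meta name="viewport" content="width=device-width, initial-scale=1">
<style>
  /* Mobile first: a single column by default */
  .product-grid { display: grid; grid-template-columns: 1fr; }
  /* Wider screens: switch to three columns */
  @media (min-width: 768px) {
    .product-grid { grid-template-columns: repeat(3, 1fr); }
  }
</style>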

Last thoughts on single-page application SEO

A single-page application approach allows developers to present content faster than a traditional website. It bridges the gap between modern-looking but JavaScript-heavy websites with slower load times and fast, conventional static HTML websites, letting developers adopt modern techniques without sacrificing speed. However, this speed advantage comes with the drawback that rendered views may not be picked up by indexing crawlers, leaving the website out of search results.

In this article, we reviewed some ways to take advantage of the benefits of SPAs while maintaining a high SEO rank. We examined the basics of good SEO-ranked pages and the importance of content quality, security, and mobile-friendly websites. Critically, we reviewed how to use the power of prerendering at the edge to help search engine crawlers discover client-side rendered views while leveraging the performance advantages of the edge. We also discussed treating views as regular pages, including meta tags, and replacing the browser URL with each rendered view.

With prerendering, JS offloading, AI-driven site optimization, and more, Macrometa’s PhotonIQ is the complete solution to help your single-page application meet your SEO goals.
