Some of the most widely used websites in the world, such as Instagram, Facebook, and Netflix, are brought to life with the help of React. While these sites are SEO-friendly, there are still hurdles to clear when combining React with search engine optimization. React was developed to create interactive, declarative, modular, and cross-platform user interfaces, and today it is one of the most popular JavaScript libraries for writing frontend applications. Originally developed for single-page applications (SPAs), it is now a major tool in a developer's playbook for writing full-fledged websites and mobile applications.

Since Google receives roughly 90% of all searches, it is wise to understand what Googlebot does before optimizing a project for it. Googlebot maintains a crawl queue containing all the URLs it plans to scour through on a website or application. When idle, it picks the next URL in the queue, makes a request, and fetches the HTML. After parsing the HTML, Googlebot determines whether it needs to fetch and execute JavaScript to render the content; if so, it adds the URL to a render queue. The rendered output is then processed for links and tags, which are fed back into the crawl queue, and finally the URL and its data are indexed with Google. Throughout this process, JavaScript execution is deferred because it is computationally costly, while Googlebot mainly cares about indexing content and links among other requested data. This creates an inherent problem for a JavaScript framework: the dynamic content that should be indexed may never be rendered at all.

React is definitely a big help in building a user-friendly UI that can also be valued by search engines, and it shouldn't be avoided when creating the user interface for your website or application.

The secret to writing an SEO-friendly project in React is to ensure that Google won't have to execute JavaScript to render the content. We achieve this goal by using server-side rendering (SSR): the rendered HTML is sent to the user before the JavaScript runs. Simply put, the page loads faster because the user won't have to wait for React to execute, and Google no longer has to wait to see the content of the page.

Static and dynamic apps make it easier for Googlebot to access pages because they use server-side rendering. Tools such as GatsbyJS and NextJS offer pre-rendering that exports static HTML at build time, and most simple single-page applications can easily be adapted. Luckily, pure single-page apps aren't essential for most online businesses: most marketplaces are dynamic websites, while the landing pages for those websites are static.

Even without a full framework, a team of developers can work with Babel and Webpack, import their data, and use a package known as React Helmet alongside a pre-renderer to inject SEO data into the head of static HTML landing pages.
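A rough sketch of what that injection amounts to (the function names are hypothetical; React Helmet and a pre-renderer do the equivalent work for you at build time): SEO fields are serialized into meta tags and spliced into the `<head>` of the exported static HTML.

```javascript
// Hypothetical sketch of injecting SEO data into a static page's <head>.
// React Helmet plus a pre-renderer perform the equivalent at build time.
function buildMetaTags({ title, description, image, keywords = [] }) {
  const tags = [
    `<title>${title}</title>`,
    `<meta name="description" content="${description}">`,
    `<meta property="og:title" content="${title}">`,
    `<meta property="og:image" content="${image}">`,
  ]
  if (keywords.length > 0) {
    tags.push(`<meta name="keywords" content="${keywords.join(', ')}">`)
  }
  return tags.join('\n')
}

function injectIntoHead(html, metaHtml) {
  // Splice the tags in just before the closing </head>
  return html.replace('</head>', `${metaHtml}\n</head>`)
}
```

Because the tags end up in the static HTML itself, a crawler that never runs JavaScript still reads them.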

import React from 'react'
import { Seo } from './components'
import data from './assets/data/seo.json'
import image from './assets/img/image.jpg'

// Example usage on a page; the title, description, and keywords
// fields shown here are illustrative names from the imported JSON
<Seo
  title={`${data.title} | Webpage Title`}
  description={data.description}
  image={image}
  keywords={data.keywords}
/>

To make the above example work on any of our website's or application's pages, we create the component file within a folder called components and re-export it from an index.js file inside the same folder.


// components/index.js
import Layout from './layout'
import Seo from './seo'

export { Layout, Seo }

Now we can create the seo.js file, take in our props, and inject the data into the <head> of each rendered HTML page with React Helmet. Our goal is to let each rendered page inject unique data via props while also loading global data and providing the global optimization structure. As advertised by the creators of React Helmet, "Helmet takes plain HTML tags and outputs plain HTML tags. It's dead simple, and React beginner friendly."


// components/seo.js
import React from 'react'
import Helmet from 'react-helmet'
import SeoData from './assets/data/seo.json'

function Seo({ title, image, description, lang, meta = [], keywords = [] }) {
  const metaTitle = title || SeoData.siteMetadata.title
  const metaDescription = description || SeoData.siteMetadata.description
  return (
    <Helmet
      htmlAttributes={{ lang }}
      title={metaTitle}
      meta={[
        {
          name: `google-site-verification`,
          // Hypothetical JSON field holding your verification token
          content: SeoData.siteMetadata.verification,
        },
        { name: `description`, content: metaDescription },
        { property: `og:title`, content: metaTitle },
        { property: `og:image`, content: `${SeoData.siteMetadata.siteUrl}` + image },
        // Add more fields with the pattern above
      ]
        .concat(keywords.length > 0 ? { name: `keywords`, content: keywords.join(`, `) } : [])
        .concat(meta)}
    />
  )
}

export default Seo
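For reference, a minimal seo.json covering the fields the component reads might look like the following. All values are illustrative, and verification is a hypothetical field for your google-site-verification token.

```json
{
  "siteMetadata": {
    "title": "Webpage Title",
    "description": "A short default description of the site.",
    "siteUrl": "https://www.example.com",
    "verification": "your-google-site-verification-token"
  }
}
```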

We're effectively creating a wrapper around Helmet that dynamically retrieves data, which we can import and add directly to the <head> of our project's pre-rendered HTML pages. Each page will have the same structure and require the same properties throughout the site, while giving the developer and user the flexibility to dynamically output the data that search engines should capture.