Online search has become synonymous with mobile phones and computing devices. Whenever we want answers, reviews, or a better understanding of a product or service, the first thing we do is search online. Search engines rank results based on factors such as readability, SEO practices, and domain authority.
This creates a value proposition for optimizing your website, helping search engines sort through terabytes of data and provide relevant information to users. SEO makes it easier for search engines to read and understand your application.
In today’s competitive world, an effective SEO strategy can boost your website’s ranking, leading to more organic traffic and business.
Here are some SEO practices you can implement to improve your online presence while using Next.js.
In a single-page application, the entire website is served from a single route. React mounts a single index.html file and handles user interactions and page navigation from there. Search engines see only the content and metadata of that one page, which makes it difficult to index the rest of the site.
Next.js overcomes this by pre-rendering a separate HTML document for each page. React then renders the content of each page when it is visited, instead of relying on a single entry point. Next.js offers several techniques that help with SEO, which we explain below.
- Meta tags
- Meta tags give search engines instructions on how to crawl your website. In Next.js, they can be added inside the built-in `Head` component, imported from "next/head".
- For example, a robots meta tag with the value `noindex,nofollow` will prevent your website from appearing in search results, while the value `all` imposes no indexing or crawling restrictions.
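As a sketch, these two directives are expressed with the standard robots meta tag, placed inside the `Head` component:

```html
<!-- Prevents the page from being indexed and its links from being followed -->
<meta name="robots" content="noindex,nofollow" />

<!-- No indexing or crawling restrictions (the default behavior) -->
<meta name="robots" content="all" />
```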
- Metadata
- Metadata describes the purpose of our website and can be generated dynamically so that it always reflects the latest content.
- There are two main elements:
- title — Describes what the page is about. We recommend including important keywords in the title while keeping it human-readable.
- description — Supplemental information about the website.
```tsx
import Head from "next/head";

export default function About({ product }: { product: string }) {
  return (
    <>
      <Head>
        <title>{`My ${product} store`}</title>
        <meta
          name="description"
          content={`A store with a variety of ${product}`}
        />
      </Head>
      <main>
        <h1>What is SEO?</h1>
      </main>
    </>
  );
}

export async function getStaticProps({ params }: { params: any }) {
  return {
    props: {
      product: params.product,
    },
  };
}

export async function getStaticPaths() {
  return {
    paths: [{ params: { product: "shoes" } }],
    fallback: true,
  };
}
```
- Open Graph Protocol
- The Open Graph protocol allows any web page to become a rich object in the social graph. To make a website into a graph object, basic metadata must be added to the web page.
- The basic properties included are:
- og:title — The title of the object as it should appear in the graph.
- og:type — The type of the object.
- og:image — An image URL that represents the object in the graph.
- og:url — The canonical URL of the object, which will be used as its persistent ID in the graph.
- In addition to these, you can also add other optional properties; for more information, see https://ogp.me/.
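As a sketch, the basic Open Graph properties above could be added inside the same `Head` component; the store name, URL, and image path below are hypothetical:

```tsx
<Head>
  <meta property="og:title" content="My shoes store" />
  <meta property="og:type" content="website" />
  <meta property="og:image" content="https://mystore.com/images/shoes.jpeg" />
  <meta property="og:url" content="https://mystore.com/shoes" />
</Head>
```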
- Robots.txt File
- You may want to keep certain areas of your website, such as a CMS or API routes, from being crawled and indexed by search engines. In that case, you can list those paths in a robots.txt file placed in the public folder at the root of your project.
- For example, let's look at Facebook's robots.txt file:
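As a minimal sketch, a robots.txt that blocks a hypothetical admin area and API routes while allowing everything else might look like this (the paths and domain are assumptions for illustration):

```txt
User-agent: *
Disallow: /admin/
Disallow: /api/
Allow: /

Sitemap: https://mystore.com/sitemap.xml
```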
- XML Sitemap
- XML sitemaps are the easiest way to tell Google about the pages on your site, and they are extremely useful if you have a large or content-rich website.
- This is a file that contains information about the pages, videos, and other files on your site and how they relate to each other. Search engines read this file to crawl your site more efficiently.
- To enhance your SEO, we recommend creating an XML file that dynamically pulls in new content as it becomes available on your website.
- You have two options:
- Manual
- You can write a static sitemap.xml file in the public folder of your root directory.
```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.mystore.com/shoes</loc>
    <lastmod>2023-03-08</lastmod>
  </url>
</urlset>
```
- getServerSideProps
- getServerSideProps can generate an XML sitemap on request.
```tsx
function generateDynamicSiteMap(product: string) {
  return `<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://mystore.com</loc>
  </url>
  <url>
    <loc>${`https://mystore.com/${product}`}</loc>
  </url>
</urlset>`;
}
```
- Adding a sitemap.xml.ts file to your pages folder (so it is served at /sitemap.xml) lets you call your API on each request and include dynamic URLs in the sitemap.
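One common pattern is to write the XML straight to the response from getServerSideProps, so that visiting /sitemap.xml returns the sitemap. The sketch below models Next's `res` object with a minimal `ResLike` interface so the handler logic can run standalone; the domain and the hard-coded product slug are assumptions for illustration:

```typescript
// A stub standing in for a generateDynamicSiteMap function like the one above.
const generateDynamicSiteMap = (product: string): string =>
  `<urlset><url><loc>https://mystore.com/${product}</loc></url></urlset>`;

// Minimal model of the server response object Next.js passes to getServerSideProps.
interface ResLike {
  setHeader(name: string, value: string): void;
  write(body: string): void;
  end(): void;
}

// The body of a getServerSideProps handler for pages/sitemap.xml.ts:
// write the XML directly to the response instead of rendering a page.
async function serveSiteMap(res: ResLike): Promise<{ props: {} }> {
  // In a real app you would fetch the latest product slugs from your API here.
  const sitemap = generateDynamicSiteMap("shoes");
  res.setHeader("Content-Type", "text/xml");
  res.write(sitemap);
  res.end();
  return { props: {} };
}
```

In the actual page file you would export this logic as getServerSideProps and export a default component that returns null, since the response has already been written by the time Next tries to render.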
- HTML Structure
- Proper page structure is also very important for improving SEO. Here are some best practices to follow while writing your code:
- Heading tags — An `<h1>` tag should be used on each page to describe the content of that page. `<h2>` through `<h6>` tags can be used for the consecutive headings after it.
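A short sketch of such a hierarchy (the headings themselves are hypothetical):

```html
<h1>What is SEO?</h1>
<h2>Why SEO matters</h2>
<h2>SEO techniques in Next.js</h2>
<h3>Meta tags</h3>
```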
- Link tags — Next.js provides the Link component for client-side navigation, which can be imported from "next/link".

```tsx
import Link from "next/link";

const About = () => {
  return (
    <Link href={`/contact`}>
      Link to the contact page!
    </Link>
  );
};

export default About;
```
- Image tags — Next.js also provides an Image component that automatically resizes, optimizes, and serves images in modern formats. It can be imported from "next/image".

```tsx
import Image from "next/image";

const About = () => {
  return (
    <Image
      src="/images/flower.jpeg"
      alt="An image of a flower"
      width={500}
      height={500}
    />
  );
};

export default About;
```
While plain React was never the best fit for SEO, Next.js provides tools to improve performance and enhance the customer experience. Web stores need to strategically plan the many measures that increase overall traffic and SEO scores. There are multiple tools for measuring SEO, and this work should be done carefully to further improve the quality of your website's reach.