How to Add a Sitemap and Update Robots.txt in Next.js: A Step-by-Step Guide

Need to add a sitemap.xml and customize robots.txt in your Next.js 14 project? Follow these steps to configure both efficiently.

  1. Install next-sitemap: First, install the next-sitemap package to generate your sitemap automatically after each build.

    npm install next-sitemap
    
  2. Configure Sitemap: Create a next-sitemap.config.js file in the root directory and configure it with your site’s URL.

    /** @type {import('next-sitemap').IConfig} */
    module.exports = {
      siteUrl: process.env.SITE_URL || 'https://yourdomain.com',
      generateRobotsTxt: true, // Generate robots.txt
    }
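
Because siteUrl falls back to process.env.SITE_URL, you can point the generated URLs at a different domain per environment without editing the config. Keep in mind that the variable has to be set in the environment where the next-sitemap CLI runs (wired up as a postbuild script in the next step); a minimal sketch, where the staging domain is a placeholder:

    # Hypothetical example: export SITE_URL so next-sitemap picks it up at build time
    SITE_URL=https://staging.yourdomain.com npm run build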
    
  3. Add Post-Build Script: Update package.json so the sitemap and robots.txt are regenerated after every build; npm automatically runs a script named postbuild immediately after the build script finishes.

"scripts": {
  "postbuild": "next-sitemap"
}
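
One caveat: npm and yarn classic run a postbuild script automatically, but pnpm skips pre/post hooks by default. If your project uses pnpm, you may need to opt in via .npmrc (shown below) or chain the two commands explicitly:

    # .npmrc — pnpm only; npm and yarn run post-scripts by default
    enable-pre-post-scripts=true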
  4. Customize Robots.txt: Modify next-sitemap.config.js to include specific rules in robots.txt. Here’s an example configuration:
    /** @type {import('next-sitemap').IConfig} */
    module.exports = {
      siteUrl: process.env.SITE_URL || 'https://yourdomain.com',
      generateRobotsTxt: true,
      robotsTxtOptions: {
        policies: [
          {
            userAgent: '*',
            allow: '/',
            disallow: [
              '/login',
              '/signup',
              '/manage-listings',
              '/profile',
              '/onboarding/step1',
              '/onboarding/step2',
              '/verify-id',
            ],
          },
        ],
      },
    }
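
For reference, with the policies above the generated public/robots.txt should look roughly like this (exact comments and ordering can vary by next-sitemap version):

    User-agent: *
    Allow: /
    Disallow: /login
    Disallow: /signup
    Disallow: /manage-listings
    Disallow: /profile
    Disallow: /onboarding/step1
    Disallow: /onboarding/step2
    Disallow: /verify-id

    Sitemap: https://yourdomain.com/sitemap.xml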
  5. Run the Build: Generate sitemap.xml and the customized robots.txt by building your project.

    npm run build
  6. Verify the Files: After building, check public/sitemap.xml and public/robots.txt to confirm the updates.
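
A quick sanity check is to print the generated files locally; after deploying, you can also fetch them over HTTP (the domain below is the placeholder from the config):

    cat public/robots.txt
    cat public/sitemap.xml
    # After deploying:
    curl https://yourdomain.com/robots.txt
    curl https://yourdomain.com/sitemap.xml

Note that on larger sites next-sitemap may split the output into chunks (sitemap-0.xml and so on), with sitemap.xml acting as an index that references them.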

With these steps, your Next.js 14 site will have a properly configured sitemap.xml and a customized robots.txt that keeps private routes out of search engine indexes while leaving the rest of your site crawlable.