How to Add a Sitemap and Update Robots.txt in Next.js: A Step-by-Step Guide
Need to add a sitemap.xml and customize robots.txt in your Next.js 14 project? Follow these steps to configure both efficiently.
- **Install next-sitemap**: First, install the `next-sitemap` package to generate your sitemap automatically after each build.

  ```bash
  npm install next-sitemap
  ```
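  If your project uses a different package manager, the equivalent commands are (assuming yarn or pnpm is already set up):

  ```bash
  yarn add next-sitemap
  # or
  pnpm add next-sitemap
  ```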
- **Configure Sitemap**: Create a `next-sitemap.config.js` file in the root directory and configure it with your site's URL.

  ```js
  /** @type {import('next-sitemap').IConfig} */
  module.exports = {
    siteUrl: process.env.SITE_URL || 'https://yourdomain.com',
    generateRobotsTxt: true, // Generate robots.txt
  }
  ```
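  If you want finer control over how URLs are emitted, `next-sitemap` supports a handful of commonly used options. A minimal sketch follows; the values are illustrative rather than recommendations, and the `/drafts/*` path is a hypothetical example:

  ```js
  /** @type {import('next-sitemap').IConfig} */
  module.exports = {
    siteUrl: process.env.SITE_URL || 'https://yourdomain.com',
    generateRobotsTxt: true,
    changefreq: 'weekly',   // crawl-frequency hint written into each <url> entry
    priority: 0.7,          // default priority applied to each URL
    sitemapSize: 5000,      // split into multiple sitemap files beyond this count
    exclude: ['/drafts/*'], // glob patterns to omit from the sitemap (example path)
  }
  ```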
- **Add Post-Build Script**: Update `package.json` to automatically generate the sitemap and robots.txt after building.

  ```json
  "scripts": {
    "postbuild": "next-sitemap"
  }
  ```
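  This works because npm automatically runs a script named `post<name>` after `<name>`, so `postbuild` fires right after `next build` completes; some other package managers, such as pnpm, require this behavior to be enabled explicitly. In a typical Next.js project, the full scripts block looks something like this, where the other entries are the standard defaults:

  ```json
  "scripts": {
    "dev": "next dev",
    "build": "next build",
    "postbuild": "next-sitemap",
    "start": "next start"
  }
  ```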
- **Customize Robots.txt**: Modify `next-sitemap.config.js` to include specific rules in the generated robots.txt. Here's an example configuration that keeps crawlers out of private routes:

  ```js
  /** @type {import('next-sitemap').IConfig} */
  module.exports = {
    siteUrl: process.env.SITE_URL || 'https://yourdomain.com',
    generateRobotsTxt: true,
    robotsTxtOptions: {
      policies: [
        {
          userAgent: '*',
          allow: '/',
          disallow: [
            '/login',
            '/signup',
            '/manage-listings',
            '/profile',
            '/onboarding/step1',
            '/onboarding/step2',
            '/verify-id',
          ],
        },
      ],
    },
  }
  ```
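  With that configuration, the generated `public/robots.txt` should look roughly like this (the exact comment headers can vary by `next-sitemap` version):

  ```txt
  # *
  User-agent: *
  Allow: /
  Disallow: /login
  Disallow: /signup
  Disallow: /manage-listings
  Disallow: /profile
  Disallow: /onboarding/step1
  Disallow: /onboarding/step2
  Disallow: /verify-id

  # Host
  Host: https://yourdomain.com

  # Sitemaps
  Sitemap: https://yourdomain.com/sitemap.xml
  ```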
- **Run the Build**: Generate sitemap.xml and the customized robots.txt by building your project.

  ```bash
  npm run build
  ```
- **Verify the Files**: After building, check `public/sitemap.xml` and `public/robots.txt` to confirm the updates.
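  If you prefer to verify over HTTP rather than by opening the files, start the production server and fetch both routes; Next.js serves everything in `public/` from the site root (this assumes the default port 3000):

  ```bash
  npm run start
  # in another terminal:
  curl http://localhost:3000/robots.txt
  curl http://localhost:3000/sitemap.xml
  ```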
With these steps, your Next.js 14 site will have a properly configured sitemap.xml and a customized robots.txt, so search engines can crawl and index it correctly while skipping the routes you've marked off-limits.