Indexability
15-30 min · Advanced
Control search engine crawling and indexing using robots.txt and XML sitemaps in Next.js.
Prerequisites
- Next.js project setup
- Next.js Metadata API (intermediate, recommended): built-in file generation.
Step 1: robots.txt
1. Create src/app/robots.ts to dynamically generate your robots file.
Example
import { MetadataRoute } from 'next'

export default function robots(): MetadataRoute.Robots {
  return {
    rules: {
      userAgent: '*',
      allow: '/',
      disallow: '/private/',
    },
    sitemap: 'https://acme.com/sitemap.xml',
  }
}
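The single rules object above applies one policy to every crawler. If you need crawler-specific policies, rules also accepts an array of rule objects. A minimal sketch; the user agents and paths here are illustrative only:

import { MetadataRoute } from 'next'

export default function robots(): MetadataRoute.Robots {
  return {
    rules: [
      // Crawler-specific policies; adjust user agents and paths for your site.
      { userAgent: 'Googlebot', allow: '/', disallow: '/private/' },
      { userAgent: ['Bingbot', 'Applebot'], disallow: '/drafts/' },
    ],
    sitemap: 'https://acme.com/sitemap.xml',
  }
}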
Step 2: sitemap.xml
1. Create src/app/sitemap.ts to generate your sitemap.
2. Fetch data from your CMS or database to include dynamic routes (a dynamic sketch follows the example below).
Example
import { MetadataRoute } from 'next'

export default function sitemap(): MetadataRoute.Sitemap {
  return [
    {
      url: 'https://acme.com',
      lastModified: new Date(),
      changeFrequency: 'yearly',
      priority: 1,
    },
    // ... map over posts
  ]
}
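The // ... map over posts placeholder above is where the dynamic routes from step 2 go. A minimal sketch, assuming a hypothetical getPosts() helper exported from '@/lib/posts' that returns objects with slug and updatedAt fields, and posts served under /blog/; because the entries come from a data fetch, the function is exported as async:

import { MetadataRoute } from 'next'
// Hypothetical helper: returns { slug: string; updatedAt: Date } records from your CMS or database.
import { getPosts } from '@/lib/posts'

export default async function sitemap(): Promise<MetadataRoute.Sitemap> {
  const posts = await getPosts()

  // One entry per post, appended to the static routes.
  const postEntries = posts.map((post) => ({
    url: `https://acme.com/blog/${post.slug}`,
    lastModified: post.updatedAt,
    changeFrequency: 'monthly' as const,
    priority: 0.7,
  }))

  return [
    {
      url: 'https://acme.com',
      lastModified: new Date(),
      changeFrequency: 'yearly',
      priority: 1,
    },
    ...postEntries,
  ]
}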
Verification Checklist
- Visit /robots.txt and /sitemap.xml to verify they are generated correctly.
- Inspect the page source to ensure <meta name="robots"> tags are present if configured.
- Use the Google Search Console URL Inspection tool to test live URLs.
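For the first checklist item, a small sketch of a local check, assuming the app is running at http://localhost:3000 and a TypeScript runner such as tsx is available; the file name is only a suggestion:

// check-indexability.ts: prints the status and a short preview of each generated file.
// Assumes the Next.js app is running locally at http://localhost:3000.
const BASE_URL = 'http://localhost:3000'

async function check(path: string) {
  const res = await fetch(`${BASE_URL}${path}`)
  const body = await res.text()
  console.log(`${path}: ${res.status} ${res.headers.get('content-type') ?? ''}`)
  console.log(body.slice(0, 200))
}

async function main() {
  await check('/robots.txt')
  await check('/sitemap.xml')
}

main().catch(console.error)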