# robots.txt
Add or generate a `robots.txt` file that matches the Robots Exclusion Standard in the root of the `app` directory to tell search engine crawlers which URLs they can access on your site.
## Static robots.txt
```txt
User-Agent: *
Allow: /
Disallow: /private/
Sitemap: https://acme.com/sitemap.xml
```

## Generate a Robots file
Add a `robots.js` or `robots.ts` file that returns a Robots object.
> **Good to know**: `robots.js` is a special Route Handler that is cached by default unless it uses a Dynamic API or dynamic config option.
```ts
import type { MetadataRoute } from 'next'

export default function robots(): MetadataRoute.Robots {
  return {
    rules: {
      userAgent: '*',
      allow: '/',
      disallow: '/private/',
    },
    sitemap: 'https://acme.com/sitemap.xml',
  }
}
```

Output:
```txt
User-Agent: *
Allow: /
Disallow: /private/
Sitemap: https://acme.com/sitemap.xml
```
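Because `robots.js` is evaluated like any other Route Handler, the object it returns can be computed rather than hard-coded. Below is a minimal sketch of one common pattern, keeping non-production deployments out of search indexes. The `buildRobots` helper and the local `Robots` type are illustrative, not part of the Next.js API:

```typescript
// Illustrative sketch: block all crawlers outside production.
// This local Robots type mirrors the shape used above so the example is self-contained.
type Robots = {
  rules: { userAgent?: string; allow?: string; disallow?: string }
  sitemap?: string
}

// Hypothetical helper: derive the rules from a deployment environment name.
export function buildRobots(env: string | undefined): Robots {
  if (env !== 'production') {
    // Preview and development deployments should not be indexed at all.
    return { rules: { userAgent: '*', disallow: '/' } }
  }
  return {
    rules: { userAgent: '*', allow: '/', disallow: '/private/' },
    sitemap: 'https://acme.com/sitemap.xml',
  }
}
```

In `robots.ts` you might then return something like `buildRobots(process.env.NODE_ENV)` from the default export.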
## Customizing specific user agents
You can customize how individual search engine bots crawl your site by passing an array of user agents to the `rules` property. For example:
```ts
import type { MetadataRoute } from 'next'

export default function robots(): MetadataRoute.Robots {
  return {
    rules: [
      {
        userAgent: 'Googlebot',
        allow: ['/'],
        disallow: '/private/',
      },
      {
        userAgent: ['Applebot', 'Bingbot'],
        disallow: ['/'],
      },
    ],
    sitemap: 'https://acme.com/sitemap.xml',
  }
}
```

Output:
```txt
User-Agent: Googlebot
Allow: /
Disallow: /private/
User-Agent: Applebot
Disallow: /
User-Agent: Bingbot
Disallow: /
Sitemap: https://acme.com/sitemap.xml
```

## Robots object
```tsx
type Robots = {
  rules:
    | {
        userAgent?: string | string[]
        allow?: string | string[]
        disallow?: string | string[]
        crawlDelay?: number
      }
    | Array<{
        userAgent: string | string[]
        allow?: string | string[]
        disallow?: string | string[]
        crawlDelay?: number
      }>
  sitemap?: string | string[]
  host?: string
}
```
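The output files shown earlier follow directly from this shape: each rule becomes a group of directives per user agent, followed by any `Sitemap` and `Host` lines. As an illustration of that mapping, here is a sketch of a `serialize` helper. This is a hypothetical function written for this example, not Next.js internals, and the framework's actual output formatting may differ:

```typescript
// Illustrative sketch of how a Robots object maps onto robots.txt text.
// The types mirror the shape above; `serialize` is a hypothetical helper.
type Rule = {
  userAgent?: string | string[]
  allow?: string | string[]
  disallow?: string | string[]
  crawlDelay?: number
}
type Robots = { rules: Rule | Rule[]; sitemap?: string | string[]; host?: string }

// Normalize a string-or-array field into an array.
const asList = (v?: string | string[]): string[] =>
  v === undefined ? [] : Array.isArray(v) ? v : [v]

export function serialize(robots: Robots): string {
  const rules = Array.isArray(robots.rules) ? robots.rules : [robots.rules]
  const lines: string[] = []
  for (const rule of rules) {
    // Each user agent gets its own group, with the rule's directives repeated.
    for (const ua of asList(rule.userAgent ?? '*')) {
      lines.push(`User-Agent: ${ua}`)
      for (const path of asList(rule.allow)) lines.push(`Allow: ${path}`)
      for (const path of asList(rule.disallow)) lines.push(`Disallow: ${path}`)
      if (rule.crawlDelay !== undefined) lines.push(`Crawl-delay: ${rule.crawlDelay}`)
    }
  }
  for (const url of asList(robots.sitemap)) lines.push(`Sitemap: ${url}`)
  if (robots.host !== undefined) lines.push(`Host: ${robots.host}`)
  return lines.join('\n')
}
```

Feeding the multi-agent example above through this helper reproduces the directive ordering shown in its output: a `Googlebot` group, then one group each for `Applebot` and `Bingbot`, then the `Sitemap` line.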
## Version History
| Version | Changes |
|---|---|
| `v13.3.0` | `robots` introduced. |