# @nuxtjs/robots

A Nuxt.js module that injects a middleware to generate a robots.txt file
## Setup

- Add the `@nuxtjs/robots` dependency to your project:

```bash
yarn add @nuxtjs/robots # or npm install @nuxtjs/robots
```

- Add `@nuxtjs/robots` to the `modules` section of `nuxt.config.js`:
```js
export default {
  modules: [
    // Simple usage
    '@nuxtjs/robots',

    // With options
    ['@nuxtjs/robots', { /* module options */ }]
  ]
}
```
### Using top level options

```js
export default {
  modules: [
    '@nuxtjs/robots'
  ],
  robots: {
    /* module options */
  }
}
```
## Options

The `robots` module option can be an object, an array, or a function.

### Object

```js
export default {
  robots: {
    UserAgent: '*',
    Disallow: '/'
  }
}
```
### Array

```js
export default {
  robots: [
    {
      UserAgent: 'Googlebot',
      Disallow: () => '/users' // accepts a function
    },
    {
      UserAgent: 'Bingbot',
      Disallow: '/admin'
    }
  ]
}
```
### Function

```js
export default {
  robots: () => {
    return {
      UserAgent: '*',
      Disallow: '/'
    }
  }
}
```
Will generate a `/robots.txt`:

```
User-agent: Googlebot
Disallow: /users

User-agent: Bingbot
Disallow: /admin
```
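Because the function form runs when the configuration is loaded, it can compute options dynamically. A minimal sketch, assuming a conventional `NODE_ENV` check (the environment check and branching are illustrative, not part of the module):

```js
// Hypothetical example: only allow crawling in production builds.
// The NODE_ENV check is illustrative; any build-time logic works here.
export default {
  robots: () => {
    const isProduction = process.env.NODE_ENV === 'production'
    return {
      UserAgent: '*',
      Disallow: isProduction ? '' : '/' // '/' blocks all paths outside production
    }
  }
}
```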
The available keys and the directives they map to:

- `UserAgent` = `User-agent`
- `CrawlDelay` = `Crawl-delay`
- `Disallow` = `Disallow`
- `Allow` = `Allow`
- `Host` = `Host`
- `Sitemap` = `Sitemap`
- `CleanParam` = `Clean-param`
Note: Keys are parsed case-insensitively and special characters are handled, so you don't need to match the casing above exactly.
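To illustrate the mapping above, here is a minimal sketch (the names `DIRECTIVES` and `renderRules` are assumed for illustration; this is not the module's actual implementation) of how option objects could be turned into robots.txt directives:

```js
// Illustrative sketch of the key-to-directive mapping described above.
const DIRECTIVES = {
  UserAgent: 'User-agent',
  CrawlDelay: 'Crawl-delay',
  Disallow: 'Disallow',
  Allow: 'Allow',
  Host: 'Host',
  Sitemap: 'Sitemap',
  CleanParam: 'Clean-param'
}

function renderRules (rules) {
  // Accept a single object or an array of objects, like the module options.
  return [].concat(rules)
    .map(rule => Object.entries(rule)
      .map(([key, value]) => {
        // Function values are called to produce the directive value.
        const resolved = typeof value === 'function' ? value() : value
        return `${DIRECTIVES[key]}: ${resolved}`
      })
      .join('\n'))
    .join('\n\n')
}

console.log(renderRules([
  { UserAgent: 'Googlebot', Disallow: () => '/users' },
  { UserAgent: 'Bingbot', Disallow: '/admin' }
]))
// User-agent: Googlebot
// Disallow: /users
//
// User-agent: Bingbot
// Disallow: /admin
```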
## License

Copyright (c) Nuxt Community