
@nuxtjs/robots


A Nuxt.js module that injects a middleware to generate a robots.txt file

📖 Release Notes

Setup

  1. Add @nuxtjs/robots dependency to your project
yarn add @nuxtjs/robots # or npm install @nuxtjs/robots
  2. Add @nuxtjs/robots to the modules section of nuxt.config.js
export default {
  modules: [
    // Simple usage
    '@nuxtjs/robots',

    // With options
    ['@nuxtjs/robots', { /* module options */ }]
  ]
}

Using top level options

export default {
  modules: [
    '@nuxtjs/robots'
  ],
  robots: {
    /* module options */
  }
}

Options

The module option parameter can be:

Object

export default {
  robots: {
    UserAgent: '*',
    Disallow: '/'
  }
}

Array

export default {
  robots: [
    {
      UserAgent: 'Googlebot',
      Disallow: () => '/users' // accepts function
    },
    {
      UserAgent: 'Bingbot',
      Disallow: '/admin'
    }
  ]
}

Function

export default {
  robots: () => {
    return {
      UserAgent: '*',
      Disallow: '/'
    }
  }
}
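Because the function form is evaluated when the rules are built, it can branch on the environment. As an illustrative pattern (a sketch, not taken from the module's docs), you could block all crawlers on non-production deployments:

```javascript
export default {
  robots: () => {
    // Illustrative: block everything outside production builds.
    if (process.env.NODE_ENV !== 'production') {
      return { UserAgent: '*', Disallow: '/' }
    }
    return { UserAgent: '*', Disallow: '' }
  }
}
```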

This will generate a /robots.txt file:

User-agent: Googlebot
Disallow: /users
User-agent: Bingbot
Disallow: /admin

The keys and values available:

  • UserAgent = User-agent
  • CrawlDelay = Crawl-delay
  • Disallow = Disallow
  • Allow = Allow
  • Host = Host
  • Sitemap = Sitemap
  • CleanParam = Clean-param

Note: Keys are parsed case-insensitively and special characters are handled, so you don't need to worry about exact casing.

License

MIT License

Copyright (c) - Nuxt Community