@hashicorp/next-hashicorp

Next.js plugin for configuring Next for HashiCorp-style websites.

Next-HashiCorp

This tool layers a number of configuration choices, code quality checks, and code generators on top of next.js. Specifically, it provides:

  • Truly universal routing & rendering: switch between static and dynamic modes with no code changes
  • Baked in, zero-config typescript linting & prettier formatting via binary
  • Code generators for base website templates, new pages, and new components via binary
  • A client for easily fetching from DatoCMS
  • An optional Apollo setup to enable more complex data fetching scenarios
  • Strong set of default plugins, including:
    • mdx-processed markdown with front-matter and layouts
    • css files with pre-configured postcss-preset-env can be imported directly into components
    • graphql file loader
    • webpack bundle analyzer

Quick reference on how to create a new website template: npx next-hashicorp generate website

Basic Usage & Options

The plugin looks like this inside of your next.config.js file:

const withHashicorp = require('@hashicorp/next-hashicorp')

module.exports = withHashicorp(/* options */)(/* normal nextjs config */)

Let's go through the full options:

withHashicorp({
  // passed directly to next-mdx-enhanced
  // see options here: https://github.com/hashicorp/next-plugin-mdx-enhanced#readme
  mdx: {
    layoutsPath: 'somePath/otherPath',
    defaultLayout: true
  },
  css: {
    plugins: [somePlugin(), otherPlugin()], // array of postcss plugins
    presetEnvOptions: { stage: 3 } // https://github.com/csstools/postcss-preset-env#options
  },
  transpileModules: ['foo'], // third party package names that should be transpiled by babel
  usingApollo: true // if you're using apollo, this enables loader and transpilation optimizations
})

All of these options are optional; none are required for withHashicorp to function properly. In fact, we recommend not using any custom options unless you need to.

Routing Utilities

next-hashicorp exposes a couple of utilities that make it easier to manage dynamic routes and to keep a single routes config, so that statically and dynamically served sites export exactly the same routes. This makes it much easier to switch a nextjs site between being statically or dynamically served, since no code changes are required while using it.

Before diving into the specifics, it's important to be clear on how nextjs handles routes out of the box. When run as a dynamic site, the pages directory is treated as the web root - so if you go to localhost:3000/foo, next will look for a javascript page (one with an extension that is included in pageExtensions, defined in your config/plugins) named foo, and if present, serve that page. This is a great pattern that allows simple routing without extra boilerplate - we refer to these routes as simple routes for the purposes of these docs.

However, when you need to define a route that is not so simple, like blog-posts/:slug, or even change the name of a route so that localhost:3000/foo actually routes to a nextjs page under pages/docs/foo, for example, simple routes can't cut it anymore. We refer to these types of routes as complex routes. When dealing with complex routes for dynamic sites, you extend nextjs' express server and augment it with the routing logic you need, which is fairly straightforward since express' routing capabilities are extensive.

For static exports however, you get no routes out of the box at all, not even simple routes, and every page you want to export must be explicitly defined through the exportPathMap function in your config file. Additionally, there is no such thing as a dynamic route in a static site, so if you have, for example, a blog, you must define the route for each post explicitly. If your posts live in the pages folder, perhaps under a blog subfolder, and you wanted to export the site as static, you'd have to require node's fs module within the next config, fetch all the post paths, and export them out of exportPathMap.
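
To make that concrete, here's a rough sketch of what that manual approach might look like, assuming your posts live under pages/blog and render through a hypothetical /blog/post page template:

const fs = require('fs')
const path = require('path')

module.exports = {
  async exportPathMap() {
    // list every post in pages/blog and declare an explicit route for each one
    const files = fs.readdirSync(path.join(__dirname, 'pages', 'blog'))
    const posts = {}
    for (const file of files) {
      const slug = path.basename(file, path.extname(file))
      posts[`/blog/${slug}`] = { page: '/blog/post', query: { slug } }
    }
    return { '/': { page: '/' }, ...posts }
  }
}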

While both of these modes of functioning work nicely for their intended outputs, our aim with this utility is to unify the way routes are handled, such that for any given codebase you can at any time switch between a static export or running it with a server, and require no code changes at all. As such, we combine some of the patterns here and create new expectations for how routes are handled.

The first thing we changed is the fact that static exports do not include simple routes by default. Since we have utilities for defining complex routes, we can conclude that by default any page not pointed to by a complex route should be available as a simple route, so all simple routes will work the same with a static export as they do in dynamic mode. Now let's take a closer look at how we deal with complex routes.

NOTE: The nextjs team is working on a way that this functionality can be achieved without any external plugins. When this functionality ships, we will remove our routing utilities.

Generic Route Definition

To implement the routing utilities, first create a routes.js in the root of your project where you can define your dynamic routes. The file is expected to look something like this:

module.exports = [
  {
    name: 'Blog Post', // human-readable route name
    pattern: 'blog/:slug', // url pattern to match, as it would be fed to express
    page: '/blog_post', // route to your nextjs page template, relative to "pages" folder
    data: async ({ fetchFromDato }) => {
      const resp = await fetchFromDato(
        `query BlogPosts { allBlogPosts(first: 100) { slug } }`
      )
      return { slug: resp.data.allBlogPosts.map(post => post.slug) }
    }
  }
]

The data function is the most involved part. If a route's pattern includes :params, you must define a data function so that when the site is exported as static, next knows exactly which pages to generate for that route. If a route does not have :params in its pattern, you don't need to define a data function.

The data function is expected to return an object with one key per :param in your route, each value being an array of every possible value for that param. So in this example, we return { slug: ['first-post', 'second-post'] }, and so on.
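
To make the expected shape concrete, here is a minimal (purely illustrative) route whose data function hard-codes its values instead of fetching them:

module.exports = [
  {
    name: 'Blog Post',
    pattern: 'blog/:slug',
    page: '/blog_post',
    // one key per :param, each mapped to an array of every possible value
    data: async () => {
      return { slug: ['first-post', 'second-post'] }
    }
  }
]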

The next question is how you fetch the data. Your data fetchers are passed in as parameters to the function, and we'll talk about where you define them next.

Defining Static Routes

To implement your routes as static, jump into your next.config.js file, where next provides the exportPathMap function for defining static routes. There, you can do the following:

const fetch = require('isomorphic-unfetch')
const withHashicorp = require('@hashicorp/next-hashicorp')
const defineStaticRoutes = require('@hashicorp/next-hashicorp/routes/static')
const routes = require('./routes')

module.exports = withHashicorp(/* options */)({
  async exportPathMap() {
    return defineStaticRoutes(routes, { fetchKittens })
  }
})

const fetchKittens = (numberOfKittens) => {
  return fetch(`http://api.kittens.com/${numberOfKittens}`).then(res => res.json())
}

A couple of things are going on here. First, you require your routes and the defineStaticRoutes function from next-hashicorp. Then, within the exportPathMap function, you run defineStaticRoutes, passing in your routes. You can also optionally pass in any number of data fetchers, each of which will be provided as a parameter to each route's data function.

There are some data fetchers that are also included by default for convenience, as they are used across all HashiCorp web properties, listed below:

  • fetchFromDato(graphqlQuery: String) - fetches data from HashiCorp's instance of DatoCMS

It should also be noted that any route params will be exposed via the query object for static routes. So if you take the blog route example with /blog/:slug, each route that is exported statically will get { slug: 'xxx' } in the query object, which is accessible via getInitialProps. This same pattern is followed with dynamic routes.
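
For example, the page template for that blog route might read the param like this (a sketch, using a hypothetical pages/blog_post.jsx):

// pages/blog_post.jsx
function BlogPost({ slug }) {
  return <h1>Viewing post: {slug}</h1>
}

BlogPost.getInitialProps = ({ query }) => {
  // query.slug is populated from the /blog/:slug route definition
  return { slug: query.slug }
}

export default BlogPost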

Defining Dynamic Routes

Ok, now that our static routes are all set, let's make sure that when the site is running as an express server, our routes still work. Jump into your server.js file, or create one. Within this file, you can apply your routes as such:

const next = require('next')
const express = require('express')
const port = parseInt(process.env.PORT, 10) || 3000
const dev = process.env.NODE_ENV !== 'production'
const app = next({ dev })

const defineDynamicRoutes = require('@hashicorp/next-hashicorp/routes/dynamic')
const routes = require('./routes')
const handler = defineDynamicRoutes(routes).getRequestHandler(app)

app.prepare().then(() => {
  const server = express()
  // implement next-routes
  server.use(handler)
  // fire up the server!
  server.listen(port, err => {
    if (err) throw err
    console.log(`> Ready on http://localhost:${port}`)
  })
})

With this in place, all your routes will be exposed to your express server and it will run as expected.
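
You'd then start the site with this file rather than the built-in next start command - for example, assuming the file is named server.js as above:

node server.js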

Using the Link Component

The final step is ensuring that your client-side router is aware of your dynamic routes and able to fetch them correctly. Luckily, this is a small step. Typically, you might have a component like this using nextjs' router:

import Link from 'next/link'

export default function SomeComponent() {
  return (
    <nav>
      <Link href="/about">
        <a>About Page</a>
      </Link>
    </nav>
  )
}

To make your links aware of the dynamic routes, the component changes as follows:

import routes from '../routes'
import defineDynamicRoutes from '@hashicorp/next-hashicorp/routes/dynamic'
const { Link } = defineDynamicRoutes(routes)

export default function SomeComponent() {
  return (
    <nav>
      <Link route="/about">
        <a>About Page</a>
      </Link>
    </nav>
  )
}

A couple of notable differences here. First, we import the routes and the dynamic routes helper again, then pull a custom Link component out of the result. Second, <Link href= changes to <Link route=. That's it!

The Binary

Next-hashicorp ships with a binary that includes a variety of useful tools, which we will go through below. Generally, we recommend using npx or a local install and npm scripts to run the binary, rather than installing globally.

Linting & Formatting

Next-hashicorp provides centrally managed, pre-configured ESLint and Prettier tasks which can be executed via next-hashicorp lint and next-hashicorp format respectively. We recommend installing locally and running them as npm tasks. We prefer to run both of these tasks before any commit can be made -- if you share that preference, you can execute both using the command next-hashicorp precommit.
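
For example, with a local install your package.json scripts might look something like this (a sketch - the precommit entry assumes something like husky is wired up to run it before each commit):

{
  "scripts": {
    "lint": "next-hashicorp lint",
    "format": "next-hashicorp format",
    "precommit": "next-hashicorp precommit"
  }
}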

If you would like to change the configuration or use a different configuration for any of these tasks, we'd recommend forking the project and changing it to match your preferences. The purpose of a controlled, centralized config is to ensure that all projects that implement it are consistent, and allowing per-project config changes eliminates this benefit.

Generators

Next-hashicorp also provides a few generators that can provision templates for common assets. At the moment, this includes:

  • next-hashicorp generate website - creates a new, bare-bones website template that idiomatically implements next-hashicorp tooling
  • next-hashicorp generate component - creates a new component template in your components folder

After running these commands, you will be asked a couple of questions, then your files will be generated.

Analytics

Next-hashicorp includes a command to generate a typed analytics client based on a specific Tracking Plan. All plans are located in the web-tracking-plans repository. Run next-hashicorp analytics and follow the CLI prompts to generate the specific analytics client of your choosing. By default these files will be generated within an analytics/generated directory. Pass the -o or --outputPath flag to specify a different output directory, e.g. next-hashicorp analytics --outputPath ./analytics/typewriter

GraphiQL

We provide a handy bin command that opens Dato's in-browser GraphiQL IDE in your default browser. The URL to this IDE can be a bit annoying to track down because you need to have your API Token handy, but since next-hashicorp hangs on to this, we can avoid that step.

next-hashicorp graphiql

If you're unfamiliar with what GraphiQL provides you, please have a look at the GraphiQL repo.

Markdown Compilation

Next-hashicorp comes pre-configured to compile markdown files using mdx. Any .mdx file included in the pages folder will automatically output as a page. Additionally we ship a custom loader that allows the parsing of yaml front matter, and the ability to render markdown files into layouts. This layout rendering process is a little involved and we'll go into it further below.

The markdown file itself is the easy part - simply create a .mdx file in the pages directory and you're set. If you use front matter, it is parsed using gray-matter and made available within the layout and any loaders used to process the file.

To implement a layout, first create a layouts directory within the pages directory and add a file. Then using the layout property of your front matter, enter the name of the layout file, relative to pages/layouts. Finally, we need to populate the layout file properly using mdx's layout system. Here's a minimal example of how this whole setup might look. First, the markdown file:

---
title: 'Testing Page'
layout: 'test-layout'
---

Hello **world**!

And then your layout, at pages/layouts/test-layout.jsx:

import { MDXProvider } from '@mdx-js/tag'

export default frontmatter => {
  return ({ children }) => (
    <MDXProvider>
      <h1>{frontmatter.title}</h1>
      {children}
    </MDXProvider>
  )
}

A couple of things are going on here. First, we import the mdx provider, which we wrap our layout template with in order to properly render the markdown/react-component contents. Then we simply export a function, which is passed the front matter and returns whatever layout we'd like, as long as it's wrapped with MDXProvider. Wherever you want to render the markdown content of each file that implements the layout, use children in the same way as you might with a higher order component. Nice and easy!

NOTE: Our custom loader adds one special variable, __resourcePath, to the markdown for each file. You can use this variable within your layout to determine the original path of the file that is being shown, relative to /pages.
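
For instance, a layout could surface that path - a minimal sketch, assuming __resourcePath arrives alongside the rest of the front matter:

import { MDXProvider } from '@mdx-js/tag'

export default frontmatter => {
  return ({ children }) => (
    <MDXProvider>
      <h1>{frontmatter.title}</h1>
      {/* where the rendered markdown file lives, relative to /pages */}
      <p>Source: {frontmatter.__resourcePath}</p>
      {children}
    </MDXProvider>
  )
}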

Loading From DatoCMS

We use DatoCMS as an interface through which our non-technical staff can modify content on our websites. Dato is not used on every part of every page; rather, as we build each site we decide which areas to add it to and what to make editable.

There are two different strategies for data loading, and depending on the scenario, you should use different tools and techniques to get it done.

DatoCMS exposes two endpoints. One provides production ready, published content. The other also returns records that are in a saved, but unpublished state for previewing. Setting HASHI_ENV=preview in your environment will use the preview endpoint and return unpublished records. The default is to return production only records to avoid unexpectedly exposing preview content.
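
For example, assuming your project has a standard dev script, you could run the local server against the preview endpoint like so:

HASHI_ENV=preview npm run dev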

Loading Initial Data

If you need to load a set of initial data in order to render a component, and that data does not change at all after the initial load, you should use getInitialProps to do it. Next-hashicorp provides a pre-configured graphql request client that can be used to fetch data from DatoCMS as such:

import client from '@hashicorp/next-hashicorp/dato/client'
import query from './query.graphql'

function SomeComponent({ posts }) {
  return <p>{JSON.stringify(posts)}</p>
}

SomeComponent.getInitialProps = async () => {
  const { posts } = await client.request(query)
  return { posts }
}

export default SomeComponent

This will integrate nicely with nextjs, ensuring that the necessary data is loaded before the page renders for client-side routing, and fetching on the server or at build time for dynamic and static build outputs, respectively.
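
The query.graphql file imported above is just a plain GraphQL document, made importable by the graphql file loader mentioned earlier. A hypothetical example, aliasing a made-up allPosts model so that it matches the destructuring above:

query Posts {
  # "allPosts" is a placeholder - use whatever model exists in your Dato instance
  posts: allPosts(first: 10) {
    title
    slug
  }
}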

Loading Dynamic Data

If you have more complex data fetching needs such as:

  • you want to render the page first then fetch data after for only one portion of the page
  • you want to fetch data in response to user input or client-side timers
  • you want to re-fetch the initial data in response to user input or timers
  • you want to make several different data fetching requests in parallel and render their outputs on the page as soon as they are available

You will need a more powerful tool than a blocking function that loads data only for the initial render. In this situation, you can use Apollo. See the next-hashicorp apollo docs for more information on installation and usage.

Our default template and options do not include any advanced data fetching utilities, as they are more the exception than the norm, and do incur performance overhead in exchange for their increased power.

GraphQL Code Generation

This plugin also has an extra build step that must be used to introspect the graphql schema for DatoCMS in order to make complex queries on fragment types possible. This same logic will soon be used to also generate typescript typings for all of our components. To use this, run next-hashicorp codegen. If you made changes to model structures within dato and are adding queries for these model structures, you will need to re-run the codegen command.

CSS Processing

Out of the box, we give you next-css, with postcss configured as well. By default you get postcss-preset-env and postcss-import, and you can add your own postcss plugins or customize the options for postcss-preset-env through the options.
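
As a quick illustration of those direct imports (a sketch, with a hypothetical stylesheet and class name):

import './footer.css'

export default function Footer() {
  return <footer className="g-footer">Footer content</footer>
}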

NOTE: Our css processing toolchain is very likely to change in the near future, and there will be a major version boost and a more thorough writeup here when that happens.