isomorphic-flux-boilerplate

An ES7 universal ReactJS boilerplate with AltJS

A complete ES7 Isomorphic Universal ReactJS boilerplate with alt as the Flux library.

A wonderful boilerplate for Flux/ReactJS universal applications, running on koajs.

Demo: http://isomorphic.iam4x.fr

Libraries used

  • koa for the server and server-side rendering
  • react and react-router
  • alt as the Flux library, with iso to share store state between server and client
  • react-intl for internationalization
  • webpack for building and bundling

Setup

  • 1. Requirements

    • nodejs@4.4.2
    • npm@3 ($ npm i -g npm)
  • 2. How to / Installation

    • $ git clone -o upstream https://github.com/iam4x/isomorphic-flux-boilerplate.git app
    • $ cd app && npm install

    (Don't forget to add your remote origin: $ git remote add origin git@github.com:xxx/xxx.git)

Run

  • dev

    • $ npm run dev OR
    • $ PORT=xxxx npm run dev
  • test

    • $ npm test OR
    • $ npm test -- --watch
  • build

    • $ NODE_ENV=[production/staging] npm run build
    • $ NODE_ENV=[production/staging] npm run prod

Concepts

Koa is our server for server-side rendering; we use alt for the Flux architecture and react-router for routing in our app.

With iso as a helper, we can populate the alt Flux stores before the first render and get a fully async isomorphic React application.

Run this boilerplate and you will see the server fetch some fake users and populate the UserStore with that data. Koa renders the first markup and serves the JavaScript, and from then on the application runs entirely on the client.
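
To make the mechanism concrete, here is a rough sketch of how iso and an alt instance typically cooperate on the server and on the client. This is not the boilerplate's exact code: Flux and App stand in for the app's own alt instance class and root component, and the import paths are placeholders.

// Rough sketch only: `Flux` and `App` are placeholders for the app's own modules
import React from 'react';
import ReactDOM from 'react-dom';
import ReactDOMServer from 'react-dom/server';
import Iso from 'iso';
import Flux from './flux';
import App from './app';

// --- Server side (inside the koa middleware) ---
const flux = new Flux();                              // one alt instance per request
const markup = ReactDOMServer.renderToString(<App flux={ flux } />);
const iso = new Iso();
iso.add(markup, flux.flush());                        // serialize the store state next to the markup
const html = iso.render();                            // markup + embedded state, sent as the response body

// --- Client side (app/main.js) ---
Iso.bootstrap((state, _, container) => {
  const clientFlux = new Flux();
  clientFlux.bootstrap(state);                        // rehydrate the alt stores from the snapshot
  ReactDOM.render(<App flux={ clientFlux } />, container);
});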

Flux

We use alt instances as the Flux implementation.

We need instances for isomorphic applications, so that each request on the server gets its own stores and actions.

On the client, Flux is initialized in app/main.js and passed to our first React component via context (this.context.flux). Every time you want to use stores or actions in a component, you need to give it access through context.

On the server, it's similar but Flux is initialized in server/router.jsx. The instance is sent to alt-resolver for rendering components with the correct props.

Learn more about alt instances in the alt documentation.
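
For example, a component that only needs to read from a store can declare the flux instance in its contextTypes and query the store through alt's getStore API. This is a minimal sketch, assuming the users store from the data-fetching example further down; UserCount is a hypothetical component name.

import React, { Component, PropTypes } from 'react';

// Minimal sketch: read the `users` store through the flux instance from context
class UserCount extends Component {

  // Declare the flux instance as a required context dependency
  static contextTypes = { flux: PropTypes.object.isRequired };

  render() {
    const { flux } = this.context;
    // `getStore` is alt's API for retrieving a registered store by name
    const { collection } = flux.getStore('users').getState();
    return (<span>{ collection.length } users loaded</span>);
  }
}

export default UserCount;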

alt-devtools is also enabled in development; it's a Chrome extension that you can find here: https://github.com/goatslacker/alt-devtool

Internationalization (i18n)

The boilerplate provides an easy way to handle internationalization: it exposes through context a function called i18n that does not break when translations are missing. No more react-intl/IntlMixin to load everywhere.

import { FormattedRelative } from 'react-intl';
...
static contextTypes = { i18n: PropTypes.func.isRequired }

render() {
  const { i18n } = this.context;
  return (
    <div>
      {/* You can pass values to `i18n` fn like `<FormattedMessage />` component */}
      <h1>{ i18n('some.random.i18n.key', { now: new Date() }) }</h1>

      {/* FormattedRelative, FormattedNumber, etc. work out of the box :+1: */}
      <FormattedRelative value={ Date.now() - (1000 * 60 * 60 * 24) } />
    </div>
  );
}

We use react-intl for internationalization; it relies on the browser's Intl implementation. For older browsers and for node, we load the Intl polyfill.

  • Support localized strings (see data/en.js)
  • Support localized dates, times and currencies.

Lang files and the Intl polyfill are compiled into webpack chunks and lazy-loaded depending on the user's locale.

If the user changes locale, it is saved in a _lang cookie and used by the server to know which locale to render. If there's no _lang cookie, the server falls back to the Accept-Language request header. The server sets <html lang='x'> on rendering.

Thanks to gpbl/react-locale-hot-switch for the implementation example!
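
A minimal sketch of how such locale detection can look in a koa middleware (hypothetical code; the supported locale list and the fallback to 'en' are assumptions, see the server code for the real implementation):

// Hypothetical koa (1.x-style) middleware for picking the rendering locale
const SUPPORTED_LOCALES = [ 'en', 'fr' ];   // assumption: adjust to your data/*.js files

export default function* localeDetection(next) {
  // 1. prefer the `_lang` cookie set when the user switches locale
  // 2. otherwise negotiate from the `Accept-Language` header
  // 3. otherwise fall back to english
  this.locale = this.cookies.get('_lang') ||
    this.acceptsLanguages(SUPPORTED_LOCALES) ||
    'en';

  yield next;
}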

Localized routes

We have a utility to generate several routes for the same component (see app/utils/localized-routes.js).

It uses the same logic as localized strings: declare the localized routes in app/routes.js and in your data/{lang} file.
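
Purely as an illustration of the idea (the real helper lives in app/utils/localized-routes.js and its actual signature may differ), generating localized routes boils down to emitting one react-router <Route> per locale, all pointing at the same component:

// Illustrative sketch only: not the boilerplate's actual helper
import React from 'react';
import { Route } from 'react-router';

// data/en.js might declare: routes: { about: '/about' }
// data/fr.js might declare: routes: { about: '/a-propos' }

// One logical route key -> one <Route> per locale, all rendering the same component
export default function localizedRoutes({ key, component, locales }) {
  return locales.map(({ lang, routes }) => (
    <Route key={ `${ key }-${ lang }` } path={ routes[key] } component={ component } />
  ));
}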

Async data-fetching

Alt-resolver is the magic piece of the boilerplate: it is our tool for resolving promises (data-fetching) before server-side rendering.

Alt-resolver also exposes an ApiClient for making requests to the internal API. It handles cookies on the server and URI matching between server and browser for you.

Wrap data-fetching requests from actions into promises and send them to altResolver like so:

show(id) {
  // You need to return a fn in actions to get the alt instance
  // as second parameter and access `alt-resolver` / the ApiClient
  return (dispatch, { resolve, request }) =>
    // We use `alt-resolver` from the boilerplate to tell the server
    // this data needs to be resolved before server side rendering
    resolve(async () => {
      try {
        const response = await request({ url: '/users' });
        this.actions.indexSuccess(response);
      } catch (error) {
        this.actions.indexFail({ error });
      }
    });
}

Call the fetch action from the component in the componentWillMount method:

import React, { Component, PropTypes } from 'react';
import connect from 'connect-alt';

// connect-alt is a util to connect store state to component props
// you can read more about it here: http://github.com/iam4x/connect-alt
// it handles store changes for you :)
//
// users -> store name
// collection -> `this.collection` in the users store
//
@connect(({ users: { collection } }) => ({ users: collection }))
class Users extends Component {

  static contextTypes = { flux: PropTypes.object.isRequired };
  static propTypes = { users: PropTypes.array.isRequired };

  componentWillMount() {
    const { flux } = this.context;
    return flux.getActions('users').fetch();
  }

  render() {
    const { users } = this.props;
    return (<pre>{ JSON.stringify(users, null, 4) }</pre>);
  }
}

export default Users;

On the browser side, rendering is not blocked; the promise is simply resolved right away.

On the server side, altResolver.render fires a first render to collect all the promises needed for a complete rendering. It then resolves them and re-renders the application to produce the complete markup.
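
Conceptually, the server-side flow looks roughly like this. It is a simplified, hypothetical sketch of the "render twice" idea, not the real alt-resolver code (which lives in app/utils/alt-resolver.js); renderApp stands for a function that renders the application to a string with a given resolver.

// Simplified sketch of the "render twice" idea behind alt-resolver
async function renderWithData(renderApp) {
  const pendingPromises = [];

  // Collect promises instead of awaiting them during the first render
  const resolve = (fetch) => { pendingPromises.push(fetch()); };

  // 1. First render: components fire their fetch actions through `resolve`
  renderApp(resolve);

  // 2. Resolve everything that was collected; the stores get populated
  await Promise.all(pendingPromises);

  // 3. Second render: the stores are now filled, so the markup is complete
  return renderApp(resolve);
}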

Open app/actions/users.js, app/utils/alt-resolver.js, app/stores/users.js for more information about data-fetching.

How to require() images on server side

On the client, with webpack, you can directly require() images for your img DOM elements like:

<img src={require('images/logo.png')} />

Webpack loads them through url-loader: small images are inlined as data URIs, and bigger ones fall through to file-loader. The result is an image with a hashed filename for cache busting.
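
A typical loader entry for this looks like the following sketch (webpack 1.x-era syntax; the file extensions and the 10 kB inline limit are assumptions, check the webpack config in the repo for the actual settings):

// Sketch of a webpack loader entry for images
module.exports = {
  module: {
    loaders: [
      {
        test: /\.(png|jpg|jpeg|gif|svg)$/,
        // Inline files smaller than ~10 kB as data URIs,
        // emit bigger ones through file-loader with a hashed name
        loader: 'url-loader?limit=10000&name=[name]-[hash].[ext]'
      }
    ]
  }
};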

But on node, require()-ing an image would just throw an exception. There's a util for loading images on the server side to achieve this:

import imageResolver from 'utils/image-resolver';

let image;
if (process.env.BROWSER) {
  // On the browser, just require() the image as usual
  image = require('images/logo.png');
} else {
  // On the server, resolve the compiled filename instead
  image = imageResolver('images/logo.png');
}

...
render() {
  return (
    <img src={image} />
  );
}
...

The utils/image-resolver util matches the original image name with the compiled one.

Voilà! You can require() images on server side too.

Run the project in development:

  • $ npm run dev

Open your browser at http://localhost:3002 and you will see the magic happen! Try disabling JavaScript in your browser: you will still be able to navigate between pages of the application. Enjoy the power of isomorphic applications!

(Note: ports 3000-3002 are needed; you can change this with $ PORT=3050 npm run dev and it will run on 3050-3052.)

Update the boilerplate

You can fetch the upstream branch and merge it into your master:

  • $ git checkout master
  • $ git fetch upstream
  • $ git merge upstream/master
  • $ npm install

Run in production

Build the project first:

  • $ npm run build

Then start the koa server:

  • $ NODE_ENV=production node server/index.js

You can also use processes.json to run the application with the PM2 process manager on your production server (customize it for your needs; a sketch is shown below):

  • $ pm2 start processes.json
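
For reference, a minimal processes.json could look like the sketch below. The app name, instance count, and port are assumptions to adapt to your deployment; the file format itself is PM2's standard app declaration.

{
  "apps": [
    {
      "name": "isomorphic-flux-boilerplate",
      "script": "server/index.js",
      "instances": 2,
      "exec_mode": "cluster",
      "env": {
        "NODE_ENV": "production",
        "PORT": 3000
      }
    }
  ]
}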

(OSX) Run into docker for development

You can build and develop the boilerplate inside a docker container; it runs with dinghy.

  • Install dinghy (it supports NFS sharing, which is required for change detection, and it's fast!)
  • $ dinghy up
  • $ docker-compose build (don't kill your terminal, it takes time to install node_modules for dev)
  • $ docker-compose up

Then open http://webapp.docker in your browser. (You can change this URL in docker-compose.yml.)

Learn more