saga-query

Data fetching and caching using a middleware system

Control your data cache on the front-end.

Data fetching and caching using a robust middleware system. Quickly build data loading within your redux application and reduce boilerplate.

This library is undergoing active development. Consider this in a beta state.

Features

  • Write middleware to handle fetching, synchronizing, and caching API requests on the front-end
  • A middleware system familiar to node.js developers (e.g. express.js)
  • Simple recipes to handle complex use-cases like cancellation, polling, optimistic updates, loading states, undo, and React integration
  • Full control over the data fetching and caching layers in your application
  • Fine tune selectors for your specific needs
// api.ts
import { createApi, requestMonitor, requestParser, call } from 'saga-query';

const api = createApi();
api.use(requestMonitor());
api.use(api.routes());
api.use(requestParser());

api.use(function* onFetch(ctx, next) {
  const { url = "", ...options } = ctx.request;
  const apiUrl = `https://api.github.com${url}`;

  const resp: Response = yield call(fetch, apiUrl, options);
  const data = yield call([resp, "json"]);

  ctx.response = { status: resp.status, ok: resp.ok, data };

  yield next();
});

export const fetchRepo = api.get(
  `/repos/neurosnap/saga-query`,
  api.request({ simpleCache: true })
);
// app.tsx
import React, { useEffect } from 'react';
import { useSimpleCache } from 'saga-query';
import { fetchRepo } from './api';

const useRepo = () => {
  const cache = useSimpleCache(fetchRepo());

  useEffect(() => {
    cache.trigger();
  }, []);

  return cache;
}

const App = () => {
  const cache = useRepo();

  if (cache.isInitialLoading) return <div>Loading ...</div>
  if (cache.isError) return <div>{cache.message}</div>

  return <div>{cache.data.name}</div>;
}

Why?

Libraries like react-query, rtk-query, and apollo-client are making it easier than ever to fetch and cache data from an API server. All of them have their unique attributes and I encourage everyone to check them out.

There's no better async flow control system than redux-saga. Treating side-effects as data makes testing dead simple and provides a powerful effect handling system to accommodate any use-case. Features like polling, data loading states, cancellation, racing, parallelization, optimistic updates, and undo are at your disposal when using redux-saga. Other libraries and paradigms can also accomplish the same tasks, but I think nothing rivals the readability and maintainability of redux/redux-saga.

All three libraries above reinvent async flow control and hide it from the end-developer. For the happy path, this works beautifully. Why learn how to cache API data when a library can do it for you? However:

  • What happens when useMemo isn't good enough?
  • What happens when the data syncing library lacks the caching granularity you need?
  • What happens when the data syncing library doesn't cache things in an optimized way for your needs?
  • What happens when you want to reuse your business logic for another platform (e.g. a cli) and can't use react?

This library is built to support both small applications and large-scale, complex flow-control applications that need full control over the data cache layer, while setting good standards for using redux and providing a flexible middleware system to handle all business logic.

Core principles

  • The end-developer should have full control over fetching/caching/querying their server data
  • Fetching and caching data should be separate from the view layer
  • We should treat side-effects as data
  • Sagas are the central processing unit for IO/business logic
  • A minimal API that encourages end-developers to write code instead of configuring objects

saga-query is not

  • A DSL wrapped around data fetching and caching logic
  • A one-line solution to fetch and cache server data automatically

Examples

How does it work?

createApi builds a set of actions and sagas for each create or HTTP method call (e.g. get, post, put). Let's call them endpoints. Each endpoint gets its own action and linked saga. When you call api.saga() it loops through all the endpoints and creates a root saga that is fault tolerant (one saga crashing won't crash the other sagas). By default each endpoint uses takeEvery from redux-saga but, as you'll see in other recipes, this can easily be overridden.

Middleware loaded into the api via .use(...) gets added to an array, and that array becomes a pipeline that each endpoint runs in order. When yield next() is called inside a middleware or an endpoint, it calls the next middleware in the stack. Everything after yield next() runs once all the middleware downstream of the current one have finished executing.

Here's a test that demonstrates the order of execution:

test('middleware order of execution', async (t) => {
  t.plan(1);
  let acc = '';
  const api = createApi();
  api.use(api.routes());

  api.use(function* (ctx, next) {
    yield delay(10);
    acc += 'b';
    yield next();
    yield delay(10);
    acc += 'f';
  });

  api.use(function* (ctx, next) {
    acc += 'c';
    yield next();
    acc += 'd';
    yield delay(30);
    acc += 'e';
  });

  const action = api.create('/api', function* (ctx, next) {
    acc += 'a';
    yield next();
    acc += 'g';
  });

  const store = setupStore(api.saga());
  store.dispatch(action());

  await sleep(60);
  t.assert(acc === 'abcdefg');
});

The actions created by saga-query are JSON serializable; we do not pass middleware functions through our actions. This is a deliberate design decision to support things like inter-process communication.
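
Because an endpoint action is plain data, it can cross a serialization boundary, which is what makes the inter-process use-case possible. A minimal sketch (the fetchUsers endpoint and the worker handle are stand-ins):

// endpoint actions are plain objects, so they survive a JSON round-trip
const action = fetchUsers();
worker.postMessage(JSON.stringify(action));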

Control your data cache

import { createTable, createReducerMap, MapEntity } from 'robodux';
import { createStore, applyMiddleware } from 'redux';
import {
  createApi,
  requestMonitor,
  requestParser,
  prepareStore,
  // FetchCtx is an interface that's built around using window.fetch
  // You don't have to use it if you don't want to.
  FetchCtx,
  put,
  call
} from 'saga-query';

// create a reducer that acts like a SQL database table
// the keys are the id and the value is the record
const users = createTable<User>({ name: 'users' });

// something awesome happens in here
// The default generic value here is `ApiCtx` which includes a `payload`,
// `request`, and `response`.
// The generic passed to `createApi` must extend `ApiCtx` to be accepted.
const api = createApi<FetchCtx>();

// This middleware monitors the lifecycle of the request.  It needs to be
// loaded before `.routes()` so that it wraps everything that comes after it.
// It is composed of other middleware: dispatchActions and loadingMonitor.
// [dispatchActions]  This middleware leverages `redux-batched-actions` to
//  dispatch all the actions stored within `ctx.actions` which get added by
//  other middleware during the lifecycle of the request.
// [loadingMonitor] This middleware will monitor the lifecycle of a request and
//  attach the appropriate loading states to the loader associated with the
//  endpoint.
api.use(requestMonitor());

// This is where all the endpoints (e.g. `.get()`, `.put()`, etc.) you created
// get added to the middleware stack.  It is recommended to put this as close to
// the beginning of the stack so everything after `yield next()`
// happens at the end of the effect.
api.use(api.routes());

// This middleware is composed of other middleware: queryCtx, urlParser, and
// simpleCache
// [queryCtx] sets up the ctx object with `ctx.request` and `ctx.response`
//  required for `createApi` to function properly.
// [urlParser] is a middleware that will take the name of `api.create(name)` and
//  replace it with the values passed into the action.
// [simpleCache] is a middleware that will automatically store the response of
//  endpoints if the endpoint has `request.simpleCache = true`
api.use(requestParser());

// this is where you define your core fetching logic
api.use(function* onFetch(ctx, next) {
  // ctx.request is the object used to make a fetch request when using
  // `queryCtx` and `urlParser`
  const { url = '', ...options } = ctx.request;
  const resp = yield call(fetch, `https://api.com${url}`, options);
  const data = yield call([resp, 'json']);

  // with `FetchCtx` we want to set the `ctx.response` so other middleware can
  // use it.
  ctx.response = { status: resp.status, ok: resp.ok, data };

  // we almost *always* need to call `yield next()` so that the middleware
  // downstream of this one get called. The only time we skip `next` is when
  // we don't want any middleware after this one to run.
  yield next();
});

// This is how you create a function that will fetch an API endpoint.  The
// first parameter is the name of the action type.  When using `urlParser` it
// will also be the URL inside `ctx.request.url`, which you can then use
// however you like.
const fetchUsers = api.get(
  `/users`,
  // Since this middleware is first it has the unique benefit of being in full
  // control of when the other middleware get activated.
  // The generic passed to `FetchCtx` types the response data (`ctx.response.data`)
  function* processUsers(ctx: FetchCtx<{ users: User[] }>, next) {
    // anything before this call can mutate the `ctx` object before it gets
    // sent to the other middleware
    yield next();
    // anything after the above line happens *after* the middleware gets called and
    // and a fetch has been made.

    // using FetchCtx `ctx.response` is a discriminated union based on the
    // boolean `ctx.response.ok`.
    if (!ctx.response.ok) return;

    // data = { users: User[] };
    const { data } = ctx.response;
    const curUsers = data.users.reduce<MapEntity<User>>((acc, u) => {
      acc[u.id] = u;
      return acc;
    }, {});

    // save the data to our redux slice called `users`
    yield put(users.actions.add(curUsers));
  },
);

// This is a helper function, all it does is iterate through all the objects
// looking for a `.reducer` property and create a big object containing all
// the reducers which will then have `combineReducers` applied to it.
const reducers = createReducerMap(users);
// This is a helper function that does a bunch of stuff to prepare redux for
// saga-query.  In particular, it will:
//   - Setup redux-saga
//   - Setup redux-batched-actions
//   - Setup a couple of reducers that saga-query will use: loaders and data
const prepared = prepareStore({
  reducers,
  sagas: { api: api.saga() }
});
const store = createStore(
  prepared.reducer,
  undefined,
  applyMiddleware(...prepared.middleware),
);
// This runs the sagas
prepared.run();

store.dispatch(fetchUsers());

Recipes

Manipulating the request

const createUser = api.post<{ email: string }>(
  `/users`,
  function* onCreateUser(ctx: FetchCtx<User>, next) {
    // here we manipulate the request before it gets sent to our middleware
    ctx.request = {
      body: JSON.stringify({ email: ctx.payload.email }),
    };
    yield next();
    if (!ctx.response.ok) return;

    const curUser = ctx.response.data;
    const curUsers = { [curUser.id]: curUser };

    yield put(users.actions.add(curUsers));
  },
);

store.dispatch(createUser({ email: 'change.me@saga.com' }));

Have some request data that you want to set when creating the endpoint?

const fetchUsers = api.get('/users', api.request({ credentials: 'include' }))

api.request() accepts a (partial) request object, typed by the Ctx the end-developer provides, and sets it on ctx.request for that endpoint.

Simple cache

If you want to have a cache that doesn't enforce strict types and is more of a dumb cache that fetches and stores data for you, then simpleCache will provide that functionality for you.

The following code will mimic what a library like react-query is doing behind-the-scenes. To be clear, react-query does a lot more than this, so I don't want to understate what it does. However, not only can we get a core chunk of the functionality react-query provides in a little over 100 lines of code, we also retain full control over fetching, querying, and caching data, with the ability to customize it using middleware.

// api.ts
import { createStore, applyMiddleware } from 'redux';
import {
  createApi,
  requestMonitor,
  requestParser,
  timer,
  prepareStore,
} from 'saga-query';

const api = createApi();
api.use(requestMonitor());
api.use(api.routes());
api.use(requestParser());

// made up api fetch
api.use(apiFetch);

// this will only activate the endpoint at most once every 5 minutes.
const cacheTimer = timer(5 * 60 * 1000);
export const fetchUsers = api.get(
  '/users',
  { saga: cacheTimer },
  // set `simpleCache=true` to have simpleCache middleware cache response data
  // automatically
  api.request({ simpleCache: true }),
);

const prepared = prepareStore({
  sagas: { api: api.saga() },
});
const store = createStore(
  prepared.reducer,
  undefined,
  applyMiddleware(...prepared.middleware),
);
// This runs the sagas
prepared.run();
// app.tsx
import React from 'react';
import { useQuery } from 'saga-query';

import { fetchUsers } from './api';

interface User {
  id: string;
  name: string;
}

const useUsers = () => {
  const { data: users = [], ...loader } = useQuery<User[]>(
    fetchUsers()
  );
  return { users, ...loader };
}

export const App = () => {
  const { users, isInitialLoading, isError, message } = useUsers();

  if (isInitialLoading) return <div>Loading ...</div>;
  if (isError) return <div>Error: {message}</div>;

  return (
    <div>
      {users.map((user) => <div key={user.id}>{user.name}</div>)}
    </div>
  );
}

Dispatching many actions

Sometimes we need to dispatch a bunch of actions for an endpoint. From loading states to making multiple requests in a single saga, there can be a lot of actions being dispatched. When using prepareStore we automatically set up redux-batched-actions so you don't have to. Anything added to ctx.actions will be batch-dispatched by the dispatchActions middleware.
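
As a sketch of how a middleware can queue actions (assuming the FetchCtx response shape used above; fetchSucceeded is a hypothetical action creator):

api.use(function* trackSuccess(ctx, next) {
  yield next();
  if (!ctx.response.ok) return;
  // anything pushed onto ctx.actions gets batch-dispatched by dispatchActions
  ctx.actions.push(fetchSucceeded({ url: ctx.request.url }));
});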

Dependent queries

Sometimes it's necessary to compose multiple endpoints together. For example we might want to fetch a mailbox and its associated messages. Every endpoint also returns a property on the action creator .run which returns the saga that runs when the action is dispatched.

This allows us to yield to that saga inside another endpoint.

const fetchMailbox = api.get('/mailboxes');

const fetchMessages = api.get<{ id: string }>(
  '/mailboxes/:id/messages',
  function*(ctx, next) {
    // The return value of a `.run` is the entire `ctx` object.
    const mailCtx = yield call(fetchMailbox.run, fetchMailbox());

    if (!mailCtx.response.ok) {
      yield next();
      return;
    }

    ctx.request = {
      url: `/mailboxes/${mailCtx.response.data.id}/messages`
    };

    yield next();
  },
);

Dynamic endpoints

Sometimes a URL needs to be generated from other data. Endpoints must be created before api.saga() is called, so there's a limit to what we can put inside the name of the endpoint, which is the first parameter passed to the HTTP methods api.get(name) or api.post(name). If you need to generate the URL from dynamic content, like a state-derived value, the recommended solution is the following:

api.post<{ message: string }>('create-message', function* onCreateMsg(ctx, next) {
  // made up selector that grabs a mailbox
  const mailbox = yield select(selectMailbox);
  const message = ctx.payload.message;

  ctx.request = {
    url: `/mailboxes/${mailbox.id}/messages`,
    // NOTE: at this point in time, when `ctx.request.url` is present before
    // `urlParser` is activated, then we cannot automatically set the `method`
    // property, you **must** add it yourself.
    method: 'POST',
    body: JSON.stringify({ message }),
  };

  yield next();
})

As you can see, we can put whatever we want for the name parameter passed into api.get(name). The key thing to realize is that the name must be unique across all endpoints since the name is what we use for the action type.

Error handling

Error handling can be accomplished in a bunch of places in the middleware pipeline.

Catch errors from every middleware downstream of a given middleware:

const api = createApi();
api.use(function* upstream(ctx, next) {
  try {
    yield next();
  } catch (err) {
    console.log('error!');
  }
});
api.use(api.routes());

api.use(function* fail() {
  throw new Error('some error');
});

const action = api.create(`/error`);
const store = setupStore(api.saga());
store.dispatch(action());

Catch errors inside the endpoint handler itself:

const api = createApi();
api.use(api.routes());
api.use(function* fail() {
  throw new Error('some error');
});

const action = api.create(`/error`, function* (ctx, next) {
  try {
    yield next();
  } catch (err) {
    console.log('error!');
  }
});

const store = setupStore(api.saga());
store.dispatch(action());

Global error handler:

const api = createApi({
  onError: (err: Error) => { console.log('error!'); },
});
api.use(api.routes());
api.use(function* fail(ctx, next) {
  throw new Error('failure');
});

const action = api.create(`/error`);
const store = setupStore(api.saga());
store.dispatch(action());

Loading state

When using prepareStore in conjunction with dispatchActions, loadingMonitor, and requestParser, loading state is automatically added for all of your endpoints. We also export QueryState, the interface that contains all the state types saga-query provides.

// app.tsx
import React, { useEffect } from 'react';
import { useSelector, useDispatch } from 'react-redux';
import { selectLoaderById, QueryState } from 'saga-query';
import { MapEntity } from 'robodux';

import {
  fetchUsers,
  selectUsersAsList,
} from './api';

interface AppState extends QueryState {
  users: MapEntity<User>;
}

const App = () => {
  const dispatch = useDispatch();
  const users = useSelector(selectUsersAsList);
  const loader = useSelector(
    (s: AppState) => selectLoaderById(s, { id: `${fetchUsers}` })
  );
  useEffect(() => {
    dispatch(fetchUsers());
  }, []);

  if (loader.isInitialLoading) {
    return <div>Loading ...</div>
  }

  if (loader.isError) {
    return <div>Error: {loader.message}</div>
  }

  return (
    <div>{users.map((user) => <div key={user.id}>{user.email}</div>)}</div>
  );
}

React

We built a couple of simple hooks useQuery and useSimpleCache to make interacting with saga-query easier. Having said that, it would be trivial to build your own custom hooks to do exactly what you want.

Let's rewrite the React code used in the previous example (loading state):

// use-query.ts
import { useEffect } from 'react';
import { useQuery } from 'saga-query';

import { fetchUsers, selectUsersAsList } from './api';

export const useQueryUsers = () => {
  const cache = useQuery(fetchUsers, selectUsersAsList);
  useEffect(() => {
    cache.trigger();
  }, []);
  return cache;
}
// app.tsx
import React from 'react';
import { useQueryUsers } from './use-query';

const App = () => {
  const { data, isInitialLoading, isError, message } = useQueryUsers();

  if (isInitialLoading) {
    return <div>Loading ...</div>
  }

  if (isError) {
    return <div>Error: {message}</div>
  }

  return (
    <div>{data.map((user) => <div key={user.id}>{user.email}</div>)}</div>
  );
}

Cache timer

Call the endpoint at most once per interval. We can dispatch the endpoint action as many times as we want, but it will only be activated once every X milliseconds. This effectively refreshes the cache on an interval.

import { timer } from 'saga-query';

const SECONDS = 1000;
const MINUTES = 60 * SECONDS;

const fetchUsers = api.get(
  '/users',
  { saga: timer(5 * MINUTES) }
);

Take leading

If two requests are made:

  • (A) request; then
  • (B) request

While (A) request is still in flight, (B) request would be ignored (it never starts).

import { takeLeading } from 'saga-query';

// this is for demonstration purposes, you can import it using
// import { leading } from 'saga-query';
function* leading(action: string, saga: any, ...args: any[]) {
  yield takeLeading(`${action}`, saga, ...args);
}

const fetchUsers = api.get(
  `/users`,
  { saga: leading },
  function* processUsers(ctx, next) {
    yield next();
    // ...
  },
);

Polling

We built a saga helper for polling. Pass it the polling interval as a number; alternatively, the dispatched action's payload can contain a timer prop:

import { poll } from 'saga-query';

const pollUsers = api.get(
  `/users`,
  { saga: poll(5 * 1000) },
);

action.payload.timer takes precedence.
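
For example, to override the interval from the dispatch site (a sketch; this assumes the endpoint's payload is typed to allow a timer prop):

// poll every second instead of the default five seconds
store.dispatch(pollUsers({ timer: 1 * 1000 }));

The React example below toggles polling by dispatching the same action again: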

import React, { useState, useEffect } from 'react';
import { useDispatch } from 'react-redux';
import { pollUsers } from './api';

const App = () => {
  const dispatch = useDispatch();
  const [polling, setPolling] = useState("init");

  const onClick = () => {
    if (polling === "on") setPolling("off");
    else setPolling("on");
  };

  useEffect(() => {
    if (polling === "init") return;
    dispatch(pollUsers());
  }, [polling]);

  return (
    <div>
      <div>Polling: {polling}</div>
      <button onClick={onClick}>Toggle Polling</button>
    </div>
  );
}

Optimistic UI

Here is the manual, one-off way to handle an optimistic UI:

import { put, select } from 'saga-query';

const updateUser = api.patch<Partial<User> & { id: string }>(
  `/users/:id`,
  function* onUpdateUser(ctx: FetchCtx<User>, next) {
    const { id, email } = ctx.payload;
    ctx.request = {
      body: JSON.stringify({ email }),
    };

    // save the current user record in a variable
    const prevUser = yield select(selectUserById, { id });
    // optimistically update user
    yield put(users.actions.patch({ [id]: { email } }));

    // activate PATCH request
    yield next();

    // oops something went wrong, revert!
    if (!ctx.response.ok) {
      yield put(users.actions.add({ [prevUser.id]: prevUser }));
      return;
    }

    // even though we know what was updated, it's still a good habit to
    // update our local cache with what the server sent us
    const nextUser = ctx.response.data;
    yield put(users.actions.add({ [nextUser.id]: nextUser }));
  },
)

Not too bad, but we built an optimistic middleware for you:

import { MapEntity, PatchEntity } from 'robodux';
import { OptimisticCtx, optimistic, createApi, select } from 'saga-query';

const api = createApi();
api.use(api.routes());
api.use(optimistic);

api.patch<{ id: string; email: string }>(
  `/users/:id`,
  function* (ctx: OptimisticCtx<PatchEntity<User>, MapEntity<User>>, next) {
    const { id, email } = ctx.payload;
    const prevUser = yield select(selectUserById, { id });

    ctx.optimistic = {
      apply: users.actions.patch({ [id]: { email } }),
      revert: users.actions.add({ [id]: prevUser }),
    };

    ctx.request = {
      method: 'PATCH',
      body: JSON.stringify({ email }),
    };

    yield next();
  }
);

Undo

We built a simple undo middleware that waits for one of two actions to be dispatched:

  • doIt() which will call the endpoint
  • undo() which will cancel the endpoint

The middleware accepts three properties:

  • doItType (default: ${doIt}) => action type
  • undoType (default: ${undo}) => action type
  • timeout (default: 30 * 1000) => time in milliseconds before the endpoint gets canceled automatically
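
For example, to shorten the timeout (a sketch, assuming the options are passed as a single object):

api.use(undoer({ timeout: 5 * 1000 }));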

import { createTable, createReducerMap } from 'robodux';
import {
  createApi,
  requestMonitor,
  requestParser,
  undoer,
  undo,
  doIt,
  UndoCtx,
} from 'saga-query';

interface Message {
  id: string;
  archived: boolean;
}

const messages = createTable<Message>({ name: 'messages' });
const api = createApi<UndoCtx>();
api.use(requestMonitor());
api.use(api.routes());
api.use(requestParser());
api.use(undoer());

const archiveMessage = api.patch<{ id: string; }>(
  `message/:id`,
  function* onArchive(ctx, next) {
    ctx.undoable = true;

    // prepare the request
    ctx.request = {
      body: JSON.stringify({ archived: true }),
    };

    // make the API request
    yield next();
  }
)

const reducers = createReducerMap(messages);
const store = setupStore(api.saga(), reducers);

store.dispatch(archiveMessage({ id: '1' }));
// dispatch this within the timeout window to cancel the endpoint
store.dispatch(undo());
// -or- dispatch this to activate the endpoint immediately
store.dispatch(doIt());

This is not the only way to implement an undo mechanism; it's just the one we provide out-of-the-box for a UI that fully controls when to undo.

For example, if you want the endpoint to be called automatically after some timer, you could build a middleware to do that for you:

import { createAction } from 'robodux';
import { race, delay, take, UndoCtx } from 'saga-query';

const undo = createAction('UNDO');
function* undoer<Ctx extends UndoCtx = UndoCtx>(ctx: Ctx, next: () => any) {
  if (!ctx.undoable) {
    yield next();
    return;
  }

  const winner = yield race({
    timer: delay(3 * 1000),
    undo: take(`${undo}`),
  });

  // the user canceled the endpoint before the timer expired
  if (winner.undo) return;
  // otherwise activate the endpoint automatically
  yield next();
}

Performance monitor
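
The performanceMonitor middleware measures how long a thunk or endpoint takes and stores the result on ctx.performance, which any middleware loaded before it can read: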

import { performanceMonitor, createPipe, wrap, PerfCtx, delay } from 'saga-query';

const thunks = createPipe<PerfCtx>();
thunks.use(function* (ctx, next) {
  yield next();
  console.log(`calling ${ctx.name} took ${ctx.performance} ms`);
});
thunks.use(performanceMonitor);
thunks.use(thunks.routes());

function* slowSaga() {
  yield delay(10 * 1000);
}

const slow = thunks.create('something-slow', wrap(slowSaga));

store.dispatch(slow());
// calling something-slow took 10000 ms

A note on robodux

The docs heavily use robodux, which is recommended for use with saga-query, but it is not required. saga-query works fine with other libraries like redux-toolkit, and I didn't want to impose robodux on other developers.

Having said that, I use it for most of my production applications and it will make caching data simple and straight-forward. Even for large scale applications, 100% of my redux state is composed of robodux slice helpers.

I also wrote a redux-saga style guide, which I heavily encourage you to read.

Redux-toolkit

redux-toolkit is a very popular redux library officially supported by the redux team. When used with saga-query, its main responsibility is setting up the redux slice where we cache the API endpoint response data.

import { createStore, applyMiddleware } from 'redux';
import { createReducerMap } from 'robodux';
import {
  prepareStore,
  createApi,
  requestMonitor,
  requestParser,
  put,
} from 'saga-query';
import { createSlice } from '@reduxjs/toolkit';

const users = createSlice({
  name: 'users',
  initialState: {},
  reducers: {
    add: (state, action) => {
      action.payload.forEach((user) => {
        state[user.id] = user;
      });
    }
  }
});

const api = createApi();
api.use(requestMonitor());
api.use(api.routes());
api.use(requestParser());
// made up window.fetch logic
api.use(apiFetch);

const fetchUsers = api.get<{ users: User[] }>('/users', function* (ctx, next) {
  yield next();
  const { data } = ctx.response;
  yield put(users.actions.add(data.users));
});

const reducers = createReducerMap(users);
const prepared = prepareStore({
  reducers,
  sagas: { api: api.saga() },
});
const store = createStore(
  prepared.reducer,
  {},
  applyMiddleware(...prepared.middleware),
);
prepared.run();

store.dispatch(fetchUsers());