A flexible fetcher for GraphQL queries, mutations and subscriptions

Even though a network interface is easy to write in theory, we've observed that most people won't go through the trouble of writing one if they can bend the existing one to their will with middlewares, afterwares, or wrappers. Our aim here is to implement a network interface that is flexible enough to support 80% of use cases out of the box and can handle another 15-20% of real use cases with middlewares, afterwares, or wrappers, so that all users of GraphQL can rely on a really solid network interface and transport implementation.

The goal is to build a very flexible network interface and transport layer for GraphQL that supports some or all of the following:

  1. Sends GraphQL operations over a pluggable transport, either HTTP (GET/POST) or WebSockets
  2. Handles GraphQL queries, mutations, subscriptions, and live queries (@live, @defer, @stream)
  3. Stays flexible by allowing callers to pass a context or metadata field that can be read and written in middlewares
  4. Supports middlewares and afterwares (maybe as part of the transport?)
  5. Supports batching for HTTP requests (maybe as part of HTTP transport middleware?)
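To make the pluggable-transport and middleware ideas above concrete, here is a rough sketch. All names in it (`Operation`, `Middleware`, `applyMiddleware`) are hypothetical, not a settled API:

```typescript
// Hypothetical sketch of a middleware pipeline around a pluggable transport.
// An operation carries the query plus a context object that middlewares
// can read and write (point 3 above).
type Operation = {
  query: string;
  variables?: Record<string, unknown>;
  context: Record<string, unknown>;
};

type Fetch = (op: Operation) => Promise<unknown>;

// A middleware receives the operation and a `next` function that invokes
// the rest of the chain, ending at the base transport.
type Middleware = (op: Operation, next: Fetch) => Promise<unknown>;

// Compose middlewares around a base transport (e.g. an HTTP POST fetcher
// or a WebSocket sender), outermost middleware first.
function applyMiddleware(transport: Fetch, middlewares: Middleware[]): Fetch {
  return middlewares.reduceRight<Fetch>(
    (next, mw) => (op) => mw(op, next),
    transport,
  );
}
```

A middleware that injects an auth token, for example, would write into `op.context` before calling `next(op)`; an afterware could be modeled the same way by acting on the awaited result of `next(op)` before returning it.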

It should support the following features, implemented as middleware:

  1. Persisted queries
  2. Polling
  3. Basic retries on errors
  4. Basic offline handling like queueing up requests and sending them on reconnect
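As an illustration of one item on this list, basic retries could look roughly like the sketch below. It is written as a plain wrapper rather than in any particular middleware API, and `withRetries` is a hypothetical name:

```typescript
// Hypothetical retry wrapper: re-issues a failed operation up to `maxRetries`
// extra times before surfacing the last error to the caller.
type Fetcher = (op: { query: string }) => Promise<unknown>;

function withRetries(fetcher: Fetcher, maxRetries: number): Fetcher {
  return async (op) => {
    let lastError: unknown;
    // One initial attempt plus up to `maxRetries` retries.
    for (let attempt = 0; attempt <= maxRetries; attempt++) {
      try {
        return await fetcher(op);
      } catch (err) {
        lastError = err;
      }
    }
    throw lastError;
  };
}
```

A production version would presumably add backoff between attempts and only retry errors that are safe to retry (e.g. network failures, not GraphQL validation errors).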

To make the API easily consumable, the network interface should return an observable. The client starts GraphQL operations on the network interface and subscribes to them. Every operation is exposed as an observable, so it has the ability to return multiple responses, and the client must be equipped to handle that. Operations that return a single response will call complete right after calling next.
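The next-then-complete contract for single-response operations can be illustrated with a minimal hand-rolled observable. The `Observer`, `Subscription`, and `singleResult` names below are illustrative, not part of any actual API:

```typescript
// Minimal observable sketch for a single-response operation.
type Observer<T> = {
  next?: (value: T) => void;
  error?: (err: unknown) => void;
  complete?: () => void;
};

type Subscription = { unsubscribe: () => void };

// Wraps a one-shot result (e.g. a query or mutation response) as an
// observable that calls `next` once and then `complete` immediately after.
function singleResult<T>(result: Promise<T>) {
  return {
    subscribe(observer: Observer<T>): Subscription {
      let closed = false;
      result.then(
        (value) => {
          if (closed) return;
          observer.next?.(value);
          observer.complete?.();
        },
        (err) => {
          if (!closed) observer.error?.(err);
        },
      );
      return { unsubscribe: () => { closed = true; } };
    },
  };
}
```

A subscription or live query would use the same `subscribe` shape but call `next` many times before (eventually) calling `complete`, which is why the client must be prepared for multiple responses.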

This is a work in progress, so please feel free to chime in with your ideas and use cases!