# BackFire

Ultimate control over backing up and restoring your Firestore data! Use BackFire to import and export data from Firestore, including the Local Firestore Emulator.
## Key features
- Control which collections are imported/exported
- Control which documents are imported/exported based on path
- Control the depth of subcollections to import/export
- Import/export data from local files
- Import/export data from Google Cloud Storage
- (WIP) Import/export data from S3
## ⚠️ Project is a WIP

This project is still under development. It has an unstable API and may contain bugs. It is not yet recommended for production use.
## Installation

Install this program using `yarn` or `npm`.

```sh
# Using yarn
yarn add @benyap/backfire

# Using npm
npm install @benyap/backfire
```
### Optional peer dependencies

If you plan to import/export data from Google Cloud Storage, you must also install `@google-cloud/storage`:

```sh
# Using yarn
yarn add @google-cloud/storage

# Using npm
npm install @google-cloud/storage
```
## CLI Usage

All commands are accessed through `backfire` on the CLI. Options can be provided either as command line arguments or via a configuration file.

```
Usage: backfire [options] [command]

Ultimate control over backing up and restoring your Firestore data

Options:
  -V, --version            output the version number
  --verbose                output verbose logs
  -h, --help               display help for command

Commands:
  export [options] <path>  Export data from Firestore to the given path
  import [options] <path>  Import data to Firestore from the given path
  help [command]           display help for command
```
## Export command

The `export` command exports data from a Firestore instance. The `path` argument must be provided, and it should be one of:

- a local directory where the exported data will be created (e.g. `./data-folder`)
- a path to a GCS bucket where the exported data will be saved (e.g. `gs://my-gcs-bucket`)

All other command options are listed in the shared command options section.
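As a sketch of the two destination types (the project id, keyfile paths, and bucket name below are placeholders, and the flags are those documented in the command reference):

```sh
# Export to a local directory
backfire export ./data-folder -P my-project -K service-account.json

# Export to a GCS bucket (requires @google-cloud/storage and the GCS options)
backfire export gs://my-gcs-bucket -P my-project -K service-account.json \
  --gcs-project my-gcs-project --gcs-keyfile gcs-service-account.json
```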
### Command reference

```
Usage: backfire export [options] <path>

Export data from Firestore to the given path

Options:
  -P, --project <project>         the Firebase project id
  -K, --keyfile <path>            path to Firebase service account credentials JSON file
  -E, --emulator <host>           use the local Firestore emulator
  --collections [collections...]  name of the root collections to export (all collections exported if not specified)
  --patterns [regex...]           regex patterns that a document path must match to be exported
  --depth <number>                subcollection depth to export (default: 100)
  --concurrency <number>          number of concurrent processes allowed (default: 10)
  --json                          outputs data in JSON array format (only applies when exporting to local files)
  --gcs-project <project>         the Google Cloud project id (required if using GCS)
  --gcs-keyfile <path>            path to Google Cloud service account credentials JSON file (required if using GCS)
  -h, --help                      display help for command
```
## Import command

The `import` command imports data to a Firestore instance. The `path` argument must be provided, and it should be one of:

- a local directory where the data should be imported from (e.g. `./data-folder`)
- a path to a GCS bucket where data should be imported from (e.g. `gs://my-gcs-bucket`)

The data should be in the `.snapshot` format (or the JSON version of it).

All other command options are listed in the shared command options section.
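As a sketch of the two source types (the project id, keyfile paths, and bucket name below are placeholders, and the flags are those documented in the command reference):

```sh
# Import from a local directory
backfire import ./data-folder -P my-project -K service-account.json

# Import from a GCS bucket (requires @google-cloud/storage and the GCS options)
backfire import gs://my-gcs-bucket -P my-project -K service-account.json \
  --gcs-project my-gcs-project --gcs-keyfile gcs-service-account.json
```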
### Command reference

```
Usage: backfire import [options] <path>

Import data to Firestore from the given path

Options:
  -P, --project <project>         the Firebase project id
  -K, --keyfile <path>            path to Firebase service account credentials JSON file
  -E, --emulator <host>           use the local Firestore emulator
  --collections [collections...]  name of the root collections to import (all collections imported if not specified)
  --patterns [regex...]           regex patterns that a document path must match to be imported
  --depth <number>                subcollection depth to import (default: 100)
  --concurrency <number>          number of concurrent processes allowed (default: 10)
  --json                          import data from JSON array format (only applies when importing from local files)
  --gcs-project <project>         the Google Cloud project id (required if using GCS)
  --gcs-keyfile <path>            path to Google Cloud service account credentials JSON file (required if using GCS)
  -h, --help                      display help for command
```
## Shared command options

The following options are shared between the `import` and `export` commands.

### `-P, --project <project>`

The Firebase project to import/export data from.

### `-K, --keyfile <path>`

The path to service account credentials for connecting to Firestore.

For example, to connect to `my-project` using the service account credentials file `service-account.json` in the current directory:

```sh
backfire export my-folder -P my-project -K service-account.json
```
### `-E, --emulator <host>`

Provide the emulator host to connect to using the `--emulator` option.

For example, to connect to the emulator at `http://localhost:8080`:

```sh
backfire export my-folder -P my-project -E localhost:8080
```

The `-E, --emulator` option takes precedence over the `-K, --keyfile` option. This means that if both options are provided, the emulator will be used.
### `--collections [collections...]`

You can specify which root collections to import/export by using the `--collections` option. Provide a list of space-separated collection names. If not specified, all available collections will be imported/exported.

```sh
backfire export my-folder -P my-project -K service-account.json --collections users settings
```

The above command will export data from the `users` and `settings` collections, including all of their subcollections.
### `--patterns [regex...]`

You can provide a list of patterns in the form of regular expressions to filter which documents to import/export. If more than one pattern is provided, a document must match at least one pattern to be imported/exported. Multiple patterns should be space-separated. You may need to wrap your patterns in quotes if they include special characters, such as an asterisk (`*`).

Regular expressions are parsed by `regex-parser`.

```sh
backfire export my-folder -P my-project -K service-account.json --patterns '^logs\/[^/]*F
```
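As another hedged example (the collection names here are placeholders): since a document only needs to match at least one pattern, passing several space-separated patterns exports the union of the documents they match.

```sh
# Export documents whose paths start with either 'users' or 'settings'
backfire export my-folder -P my-project -K service-account.json \
  --patterns '^users' '^settings'
```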