# 🔥 Firestore BackFire

Ultimate control over importing and exporting data from Firestore and the Firestore Emulator.
## Key features

- Control which documents are imported/exported by specifying paths or through pattern matching
- Control the depth of subcollections to import/export
- Import and export data as JSON to a variety of storage backends:
  - local files
  - Google Cloud Storage
  - AWS S3
Please see the changelog for the latest updates.
## Installation

Install the package and its peer dependencies using `yarn` or `npm`.

```shell
# Using yarn
yarn add firestore-backfire @google-cloud/firestore

# Using npm
npm install firestore-backfire @google-cloud/firestore
```
### Optional peer dependencies

If you plan to import/export data from Google Cloud Storage, you must install the following peer dependency:

- `@google-cloud/storage`

If you plan to import/export data from AWS S3, you must install the following peer dependencies:

- `@aws-sdk/client-s3`
- `@aws-sdk/lib-storage`

Additionally, if you want to use a credential profile from `~/.aws/credentials` to run this program, you should also install:

- `@aws-sdk/credential-provider-ini`
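
For convenience, the optional peer dependencies listed above can be installed in one step (shown here with `yarn`; the `npm install` equivalents work the same way):

```shell
# Google Cloud Storage support
yarn add @google-cloud/storage

# AWS S3 support, plus credential profiles from ~/.aws/credentials
yarn add @aws-sdk/client-s3 @aws-sdk/lib-storage @aws-sdk/credential-provider-ini
```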
## CLI Usage

```
Usage: backfire [options] [command]

Ultimate control over importing and exporting Firestore data

Options:
  -V, --version        output the version number
  -c, --config <path>  specify the config file to use
  -h, --help           display help for command

Commands:
  export [options] [path]  export data from Firestore
  import [options] [path]  import data into Firestore
  help [command]           display help for command
```

Run `backfire` in your shell by directly calling the script file in `node_modules`, or by running it through `yarn`. Options can be provided as command line arguments or via a configuration file.
```shell
# Run the script file
./node_modules/.bin/backfire import

# Using yarn
yarn backfire import
```
Alternatively, you can also use it in your `package.json` scripts.

```json
// package.json
{
  "scripts": {
    "import-my-data": "backfire import"
  }
}
```
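
Options can also come from a config file passed with `-c, --config`. The exact file format and key names are not documented here, so treat the following as a hypothetical sketch in which the keys simply mirror the CLI option names:

```json
// backfire.config.json (hypothetical — key names assumed to mirror the CLI flags)
{
  "project": "my-project",
  "keyfile": "./service-account.json",
  "paths": ["users", "settings/users"],
  "logLevel": "info"
}
```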
## Export command (default)

The `export` command will export data from a Firestore database. The `path` argument must be provided either in the command or from a config file, and should be one of:

- a local directory where the exported data will be created (e.g. `./data-folder`)
- a path to a Google Cloud Storage bucket where the exported data will be saved (e.g. `gs://my-gs-bucket`)
- a path to an AWS S3 bucket where the exported data will be saved (e.g. `s3://my-s3-bucket`)
### Command reference

```
Usage: backfire export [options] <path>

export data from Firestore

Options:
  -p, --project <project_id>    the Firebase project to export data from
  -k, --keyfile <path>          path to Firebase service account credentials JSON file
  -e, --emulator <host>         export data from Firestore emulator if provided
  --paths <path...>             specify paths to export (all paths exported if not specified)
  --patterns <pattern...>       specify regex patterns that a document path must match to be exported
  --depth <number>              subcollection depth to export (root collection has depth of 0, all subcollections exported if not specified)
  --workers <number>            number of worker threads to use (determines number of export chunks, defaults to number of logical CPU cores available)
  --logLevel <level>            specify the logging level (choices: "silent", "info", "debug", "verbose")
  --prettify                    prettify the output JSON
  --force                       overwrite any existing data in the write location
  --gcpProject <project_id>     the Google Cloud project to use for Google Cloud Storage
  --gcpKeyfile <path>           path to Google Cloud service account credentials JSON file
  --awsRegion <region>          the AWS region to use
  --awsProfile <profile>        the AWS profile to use
  --awsAccessKeyId <value>      the AWS access key id
  --awsSecretAccessKey <value>  the AWS secret access key
  -h, --help                    display help for command
```
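
As an aside on how `--depth` is counted (the root collection has depth 0, its subcollections depth 1, and so on), the depth of a document or collection path follows from its segment count. The helper below is purely illustrative and not part of the package:

```js
// Illustrative only: compute the subcollection depth of a Firestore path.
// A path alternates collection/document segments, so a root collection
// ("users") or its documents ("users/alice") sit at depth 0, while the
// first level of subcollections ("users/alice/posts") sits at depth 1.
function collectionDepth(path) {
  const segments = path.split("/").filter(Boolean);
  return Math.ceil(segments.length / 2) - 1;
}

console.log(collectionDepth("users")); // 0
console.log(collectionDepth("users/alice")); // 0
console.log(collectionDepth("users/alice/posts")); // 1
console.log(collectionDepth("users/alice/posts/p1/comments")); // 2
```

With this counting, `--depth 1` would include root collections and their immediate subcollections, but nothing deeper.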
## Import command

The `import` command will import data into a Firestore database. The `path` argument must be provided either in the command or from a config file, and should be one of:

- a local directory where the data should be imported from (e.g. `./data-folder`)
- a path to a Google Cloud Storage bucket where data should be imported from (e.g. `gs://my-gs-bucket`)
- a path to an AWS S3 bucket where data should be imported from (e.g. `s3://my-s3-bucket`)
### Command reference

```
Usage: backfire import [options] <path>

import data into Firestore

Options:
  -p, --project <project_id>    the Firebase project to import data to
  -k, --keyfile <path>          path to Firebase service account credentials JSON file
  -e, --emulator <host>         import data into Firestore emulator if provided
  --paths <path...>             specify paths to import (all paths imported if not specified)
  --patterns <pattern...>       specify regex patterns that a document path must match to be imported
  --depth <number>              subcollection depth to import (root collection has depth of 0, all subcollections imported if not specified)
  --workers <number>            number of worker threads to use (defaults to number of data chunks to read)
  --logLevel <level>            specify the logging level (choices: "silent", "info", "debug", "verbose")
  --mode <write_mode>           specify whether importing an existing document should throw an error, skip it, merge it, or overwrite it (choices: "create", "create-and-skip-existing", "merge", "overwrite")
  --gcpProject <project_id>     the Google Cloud project to use for Google Cloud Storage
  --gcpKeyfile <path>           path to Google Cloud service account credentials JSON file
  --awsRegion <region>          the AWS region to use
  --awsProfile <profile>        the AWS profile to use
  --awsAccessKeyId <value>      the AWS access key id
  --awsSecretAccessKey <value>  the AWS secret access key
  -h, --help                    display help for command
```
## Programmatic Usage

You can import this package into your Node.js program from the `firestore-backfire` package and run the import or export commands.

```ts
import { exportFirestoreData } from "firestore-backfire";

async function main() {
  await exportFirestoreData({ /* export options */ });
}
```
This package provides first-class TypeScript support. Import and export options are fully typed and documented.
## Options

The following configuration options are the same in both CLI and programmatic usage.
### Firestore connection options

#### `-p, --project <project_id>`

The Firebase project to import/export data from.

#### `-k, --keyfile <path>`

The path to service account credentials for connecting to Firestore.

For example, to connect to `my-project` using the service account credentials file `service-account.json` in the current directory:

```shell
backfire export my-folder -p my-project -k service-account.json
```
#### `-e, --emulator <host>`

Provide the emulator host to connect to using the `--emulator` option.

For example, to connect to the emulator at `localhost:8080`:

```shell
backfire export my-folder -p my-project -e localhost:8080
```

The `-e, --emulator` option takes precedence over the `-k, --keyfile` option. This means that if both options are provided, the emulator will be used.
### Data options

#### `--paths <path...>`

You can specify which document or collection paths to import/export using the `--paths` option. Provide a list of space-separated paths. If not specified, all available paths will be explored for documents to import/export.

For example, the command below exports data from the `users` collection, as well as the `settings/users` document. The subcollections of each document are also exported.

```shell
backfire export my-folder -p my-project -k service-account.json --paths users settings/users
```
#### `--patterns <pattern...>`

You can provide a list of patterns in the form of regular expressions to filter which document paths are imported/exported. If more than one pattern is provided, a document's path must match at least one pattern to be imported/exported. Multiple patterns should be space-separated. You may need to wrap your patterns in quotes if they include special characters, such as an asterisk (`*`). Regular expressions passed on the CLI are parsed by `regex-parser`.

For example, the command below will only export documents from the `logs` collection whose document id ends with "F", in addition to any documents (and documents in subcollections) within the `settings` collection.

```shell
backfire export my-folder -p my-project -k service-account.json --patterns '^logs\/[^/]*F
```
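
To illustrate how `--patterns` filtering behaves (a path is kept when it matches at least one pattern), here is a sketch using plain JavaScript `RegExp` objects with illustrative patterns of our own; the actual CLI parses your pattern strings with `regex-parser`:

```js
// Illustrative only: how a document path is tested against --patterns.
// A path is kept when it matches at least one of the provided patterns.
const patterns = [/^logs\/[^/]*F$/, /^settings(\/|$)/];

function shouldExport(path) {
  return patterns.some((pattern) => pattern.test(path));
}

console.log(shouldExport("logs/abcF")); // true  (logs doc id ends with "F")
console.log(shouldExport("logs/abc")); // false
console.log(shouldExport("settings/users/prefs/dark")); // true
```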