:floppy_disk: Node s3

Utility to use S3 storage from Node.js. It can upload content to S3 storage to use as a backup service.

:eyes: Project status

Build status, code coverage, code quality, dependency, and npm version badges.

:key: Create keys

Access key and secret key should be defined in environment variables. Check example.env for more info about environment variables.

:boat: Docker usage

You can use the CLI app in Docker.

:rowboat: Build docker image

docker build -t s3 .
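
Once the image is built, you can run it by passing the required environment variables with -e. A minimal sketch (endpoint, keys, and bucket values are placeholders):

docker run --rm \
  -e ENDPOINT=http://s3.eu-central-1.amazonaws.com \
  -e ACCESS_KEY=myAccessKey \
  -e SECRET_KEY=mySecretKey \
  -e BUCKET=sample \
  s3 -l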

:beginner: Environment variables

  • ENDPOINT: Endpoint to connect to; can be any S3-compatible endpoint (it can also be defined on the command line with --endpoint or -e, example: http://s3.eu-central-1.amazonaws.com:9000)
  • ACCESS_KEY: Access key to use (required)
  • SECRET_KEY: Secret key to use (required)
  • BUCKET: Bucket name to connect to (it can be created using -c, and it can also be defined on the command line with --bucket)
  • MAX_RETRIES: Maximum number of connection retries on failure (optional, example: 3)
  • FORCE_PATH_STYLE: Force path-style addressing (optional, example: true)
  • SSL_ENABLED: Enable SSL connection to the endpoint (optional, example: false)
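
A minimal example.env-style configuration could look like this (all values are placeholders):

ENDPOINT=http://s3.eu-central-1.amazonaws.com:9000
ACCESS_KEY=myAccessKey
SECRET_KEY=mySecretKey
BUCKET=sample
MAX_RETRIES=3
FORCE_PATH_STYLE=true
SSL_ENABLED=false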

:checkered_flag: CLI help output

Docker usage

Using the Docker image from hub.docker.com

docker run --rm d0whc3r/s3 --help

NPM Usage

npx @d0whc3r/node-s3 -h

or

npm install -g @d0whc3r/node-s3
node-s3 -h
Help for node-s3

  Usage of node-s3 from the command line.

Options

  -e, --endpoint url                             Destination url (can be defined by $ENDPOINT env variable)                    
  --bucket bucket                                Destination bucket (can be defined by $BUCKET env variable)                   
  -l, --list                                     List all files                                                                
  -b, --backup file*                             Backup files                                                                  
  -z, --zip zipname.zip                          Zip backup files                                                              
  -r, --replace                                  Replace files that already exist when uploading a backup
  -c, --create                                   Create destination upload bucket                                              
  -f, --folder foldername                        Folder name to upload file/s                                                  
  -d, --delete foldername=duration OR duration   Clean files older than duration in foldername                                 
  -m, --mysql                                    Mysql backup using environment variables to connect to the mysql server
                                                 ($MYSQL_USER, $MYSQL_PASSWORD, $MYSQL_DATABASE, $MYSQL_HOST, $MYSQL_PORT)     
  -h, --help                                     Print this usage guide.                                                       

Examples

  1. List files in "sample" bucket.
     $ node-s3 -e http://s3.eu-central-1.amazonaws.com --bucket sample -l
  2. Backup multiple files to "backupFolder" folder.
     $ node-s3 -e http://s3.eu-central-1.amazonaws.com --bucket sample -b src/index.ts -b images/logo.png -f backupFolder
  3. Backup files using wildcard to "backup" folder.
     $ node-s3 -e http://s3.eu-central-1.amazonaws.com --bucket sample -b src/* -b images/* -f backup
  4. Backup files using wildcard and zip into "zipped" folder; the bucket will be created if it doesn't exist.
     $ node-s3 -e http://s3.eu-central-1.amazonaws.com --bucket sample -b src/* -b images/* -z -f zipped.zip -c
  5. Backup files using wildcard and zip using "allfiles.zip" as filename into "zipped" folder; the bucket will be created if it doesn't exist and the zip file will be replaced if it exists.
     $ node-s3 -e http://s3.eu-central-1.amazonaws.com --bucket sample -b src/* -b images/* -z allfiles.zip -f zipped -c -r
  6. Delete files in "uploads" folder older than 2 days and files in "monthly" folder older than 1 month.
     $ node-s3 -e http://s3.eu-central-1.amazonaws.com --bucket sample -d uploads=2d -d monthly=1M
  7. Delete files in "uploads" folder older than 1 minute.
     $ node-s3 -e http://s3.eu-central-1.amazonaws.com --bucket sample -f uploads -d 1m
  8. Generate a mysql dump file, zip it, and upload it to "mysql-backup" folder.
     $ node-s3 -e http://s3.eu-central-1.amazonaws.com --bucket sample -f mysql-backup -m -z
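
For the mysql backup option (-m), connection details are read from environment variables. An illustrative run (all values are placeholders):

export MYSQL_HOST=localhost
export MYSQL_PORT=3306
export MYSQL_USER=root
export MYSQL_PASSWORD=secret
export MYSQL_DATABASE=mydb
node-s3 -e http://s3.eu-central-1.amazonaws.com --bucket sample -f mysql-backup -m -z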

Alternatives

The same interface/API implemented in Go (golang): go-s3