s3-sync

A streaming upload tool for Amazon S3, taking input from a readdirp stream, and outputting the resulting files.

s3-sync is also optionally backed by a level database to use as a local cache for file uploads. This way, you can minimize the frequency you have to hit S3 and speed up the whole process considerably.

You can use this to sync complete directory trees with S3 when deploying static websites. It's a work in progress, so expect occasional API changes and additional features.

Installation

npm install s3-sync

Usage

require('s3-sync').createStream([db, ]options)

Creates an upload stream. Passes its options to knox, so at a minimum you'll need:

  • key: Your AWS access key.
  • secret: Your AWS secret.
  • bucket: The bucket to upload to.

The following are also specific to s3-sync:

  • concurrency: The maximum number of files to upload concurrently.
  • retries: The maximum number of times to retry uploading a file before failing. Defaults to 7.
  • headers: Additional headers to include on each file.
  • hashKey: By default, file hashes are stored based on the file's absolute path. This doesn't work very nicely with temporary files, so you can pass this function in to map the file object to a string key for the hash.
  • acl: Use a custom ACL header. Defaults to public-read.
  • force: Force s3-sync to overwrite any existing files.
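As a sketch of the hashKey option: a hypothetical hashKey function that keys the cache by a root-relative path instead of the absolute path (the `path` property used here is an assumption about the shape of the file objects, based on what readdirp entries expose):

```javascript
// Hypothetical hashKey: key cache entries by the file's path relative
// to the readdirp root, so cache entries survive the tree being moved,
// e.g. when each deploy builds into a fresh temporary directory.
function hashKey(file) {
  return file.path
}

// It would then be passed alongside the other options, e.g.
// s3sync(db, { key: key, secret: secret, bucket: bucket, hashKey: hashKey })
console.log(hashKey({ path: 'css/site.css' })) // → css/site.css
```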

You can also store your local cache in S3, provided you pass the following options, and use getCache and putCache (see below) before/after uploading:

  • cacheDest: the path to upload your cache backup to in S3.
  • cacheSrc: the local, temporary, text file to stream to before uploading to S3.
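A sketch of what that configuration might look like, reusing the bucket and credentials from the examples below (the two cache paths here are illustrative, not defaults):

```javascript
var level = require('level')
  , s3sync = require('s3-sync')

var db = level(__dirname + '/cache')

var stream = s3sync(db, {
    key: process.env.AWS_ACCESS_KEY
  , secret: process.env.AWS_SECRET_KEY
  , bucket: 'sync-testing'
  , cacheDest: '/.sync-cache.txt'        // illustrative: where the cache backup lives on S3
  , cacheSrc: __dirname + '/cache.txt'   // illustrative: local scratch file streamed to S3
})
```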

If you want more control over which files you upload and where they end up, you can write file objects directly to the stream, e.g.:

var stream = s3sync({
    key: process.env.AWS_ACCESS_KEY
  , secret: process.env.AWS_SECRET_KEY
  , bucket: 'sync-testing'
})

stream.write({
    src: __filename
  , dest: '/uploader.js'
})

stream.end({
    src: __dirname + '/README.md'
  , dest: '/README.md'
})

Where src is the absolute local file path, and dest is the location to upload the file to on the S3 bucket.

db is an optional argument: pass it a level database and s3-sync will maintain a local cache of file hashes, keeping S3 requests to a minimum.

stream.putCache(callback)

Uploads your level cache, if available, to the S3 bucket. This means that your cache only needs to be populated once.

stream.getCache(callback)

Streams a previously uploaded cache from S3 to your local level database.
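Taken together, a deploy that persists its cache remotely might be ordered like this. This is a sketch, assuming `db`, `files` and the uploader options from the example below, with cacheSrc and cacheDest set as described above:

```javascript
var uploader = s3sync(db, { /* knox + s3-sync options, plus cacheSrc/cacheDest */ })

// Pull the previous run's cache down before uploading...
uploader.getCache(function(err) {
  if (err) throw err
  files.pipe(uploader)
})

// ...and push the refreshed cache back up once the sync finishes.
uploader.on('end', function() {
  uploader.putCache(function(err) {
    if (err) throw err
  })
})
```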

stream.on('fail', callback)

Emitted when a file has failed to upload. This fires on each failed attempt, so a single file may trigger it several times before the retry limit is reached.

Example

Here's an example using level and readdirp to upload a local directory to an S3 bucket:

var level = require('level')
  , s3sync = require('s3-sync')
  , readdirp = require('readdirp')

// To cache the S3 HEAD results and speed up the
// upload process. Usage is optional.
var db = level(__dirname + '/cache')

var files = readdirp({
    root: __dirname
  , directoryFilter: ['!.git', '!cache']
})

// Takes the same options arguments as `knox`,
// plus some additional options listed above
var uploader = s3sync(db, {
    key: process.env.AWS_ACCESS_KEY
  , secret: process.env.AWS_SECRET_KEY
  , bucket: 'sync-testing'
  , concurrency: 16
  , prefix: 'mysubfolder/' // optional prefix to files on S3
}).on('data', function(file) {
  console.log(file.fullPath + ' -> ' + file.url)
})

files.pipe(uploader)

You can find another example which includes remote cache storage at example.js.
