# google-cloud-storage-bulk

Configurable bulk uploads to Google Cloud Storage.
## Install

npm:

```sh
npm install google-cloud-storage-bulk
```

yarn:

```sh
yarn add google-cloud-storage-bulk
```
## Usage

```js
const GCS = require('google-cloud-storage-bulk');

const gcs = new GCS({
  projectId: 'my-project',          // your Google Cloud project id
  bucketName: 'my-bucket',
  concurrency: 100,
  hashStrategy: 'file',
  retries: 3,
  subdirectory: 'application-name',
  uploadOptions: {}
});

// ... in an async function
const someDirectory = './directory_to_upload';
await gcs.uploadFiles(someDirectory);
```
## Options

- `projectId` (required) - Google Cloud project id (see the project reference for more details)
- `bucketName` (required) - Cloud Storage bucket name
- `concurrency` (optional, default: `250`) - upload concurrency limit
- `retries` (optional, default: `3`) - upload retry attempts
- `hashStrategy` (required, options: `none`, `file`, `subdirectory`) - described in detail below
- `subdirectory` (optional) - subdirectory within the Cloud Storage bucket to push content into
- `uploadOptions` (optional) - extends `@google-cloud/storage` upload options (see the sketch below)
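Since `uploadOptions` is passed through to `@google-cloud/storage`, any of its upload options can be supplied. A minimal sketch (the option values here are illustrative assumptions, not defaults of this library):

```js
const GCS = require('google-cloud-storage-bulk');

const gcs = new GCS({
  projectId: 'my-project',   // placeholder project id
  bucketName: 'my-bucket',
  hashStrategy: 'file',
  uploadOptions: {
    // Standard @google-cloud/storage upload options:
    gzip: true,                                // gzip-encode objects on upload
    metadata: {
      cacheControl: 'public, max-age=31536000' // long-lived caching suits hashed filenames
    }
  }
});
```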
## hashStrategy

### `none`

No hashing is applied to the file structure; content is pushed as is. Google Cloud Storage does handle metageneration, which you can take advantage of for versioning (a sketch follows the example below).
Before:

```
|_directory_to_upload
  |_a.js
  |_b.js
  |_c.css
```

After:

```
|_subdirectory
  |_a.js
  |_b.js
  |_c.css
```
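For example, object generations and metagenerations can be inspected directly with `@google-cloud/storage` (a minimal sketch, independent of this library; the bucket and object names are placeholders):

```js
const { Storage } = require('@google-cloud/storage');

const storage = new Storage({ projectId: 'my-project' });

async function printVersionInfo() {
  // Each overwrite of an object bumps its generation; metadata-only
  // changes bump its metageneration. Either can drive versioning logic.
  const [metadata] = await storage
    .bucket('my-bucket')
    .file('subdirectory/a.js')
    .getMetadata();
  console.log(metadata.generation, metadata.metageneration);
}

printVersionInfo().catch(console.error);
```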
### `file`

Hash per file, currently using the `sha1` algorithm (a sketch of this naming scheme follows the example below).
Before:

```
|_directory_to_upload
  |_a.js
  |_b.js
  |_c.css
```

After:

```
|_subdirectory
  |_a.a9993e364706816aba3e25717850c26c9cd0d89d.js
  |_b.924f61661a3472da74307a35f2c8d22e07e84a4d.js
  |_c.bcb8c41b803b91661b5e6ee45362f47df368a731.css
  |_asset-manifest.json <-- maps each original file to its hashed counterpart
```
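To illustrate the naming scheme, here is a minimal sketch of per-file `sha1` hashing with Node's built-in `crypto` module (an illustration of the idea, not this library's internal code):

```js
const crypto = require('crypto');
const fs = require('fs');
const path = require('path');

// Returns e.g. 'a.a9993e364706816aba3e25717850c26c9cd0d89d.js' for 'a.js',
// inserting the sha1 digest of the file's content before the extension.
function hashedName(filePath) {
  const content = fs.readFileSync(filePath);
  const hash = crypto.createHash('sha1').update(content).digest('hex');
  const ext = path.extname(filePath);
  const base = path.basename(filePath, ext);
  return `${base}.${hash}${ext}`;
}
```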
### `subdirectory`

Hash the content of the directory being uploaded and use this directory hash as the bucket subdirectory (a sketch follows the example below).
Before:

```
|_directory_to_upload
  |_a.js
  |_b.js
  |_c.css
```

After:

```
|_subdirectory
  |_924f61661a3472da74307a35f2c8d22e07e84a4d
    |_a.js
    |_b.js
    |_c.css
```
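A minimal sketch of one way to derive such a directory hash — hashing each file and then hashing the combined digests (an assumption about the approach, not this library's exact implementation):

```js
const crypto = require('crypto');
const fs = require('fs');
const path = require('path');

// Combine the sha1 of every file (in a stable, sorted order) into one
// directory-level hash. Assumes a flat directory for brevity.
function directoryHash(dir) {
  const combined = crypto.createHash('sha1');
  for (const name of fs.readdirSync(dir).sort()) {
    const content = fs.readFileSync(path.join(dir, name));
    combined.update(crypto.createHash('sha1').update(content).digest('hex'));
  }
  return combined.digest('hex');
}
```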
## Contributors

| Name | Website |
| ---- | ------- |
| Shaun Warman | https://shaunwarman.com |

## License

MIT