Writing a single file to multiple S3 buckets with gulp-awspublish

mattcooney

I have a simple single-page app that is deployed to an S3 bucket using gulp-awspublish. We use inquirer.js (via gulp-prompt) to ask the developer which bucket to deploy to.

Sometimes the app may be deployed to several S3 buckets. Currently we only allow one bucket to be selected, so the developer has to run gulp deploy for each bucket in turn. This is dull and error-prone.

I'd like to be able to select multiple buckets and deploy the same content to each. It's simple to select multiple buckets with inquirer.js/gulp-prompt, but not so simple to generate an arbitrary number of S3 destinations from a single stream.

Our deploy task is based upon generator-webapp's S3 recipe. The recipe suggests using gulp-rename to rewrite the path so that files are written to a specific bucket. Currently our task looks like this:

gulp.task('deploy', ['build'], () => {
  // get AWS creds
  if (typeof(config.awsCreds) !== 'object') {
    return console.error('No config.awsCreds settings found. See README');
  }

  var dirname;
  const publisher = $.awspublish.create({
    key: config.awsCreds.key,
    secret: config.awsCreds.secret,
    bucket: config.awsCreds.bucket
  });

  return gulp.src('dist/**/*.*')
    .pipe($.prompt.prompt({
      type: 'list',
      name: 'dirname',
      message: 'Using the ‘' + config.awsCreds.bucket + '’ bucket. Which hostname would you like to deploy to?',
      choices: config.awsCreds.dirnames,
      default: config.awsCreds.dirnames.indexOf(config.awsCreds.dirname)
    }, function (res) {
      dirname = res.dirname;
    }))
    .pipe($.rename(function(path) {
      // prefix each file's path with the selected dirname
      path.dirname = dirname + '/dist/' + path.dirname;
    }))
    .pipe(publisher.publish())
    .pipe(publisher.cache())
    .pipe($.awspublish.reporter());
});

It's hopefully obvious, but config.awsCreds might look something like this (key, secret and bucket are shown here as placeholders):

awsCreds: {
  key: 'AWS_ACCESS_KEY_ID',         // placeholder
  secret: 'AWS_SECRET_ACCESS_KEY',  // placeholder
  bucket: 'my-s3-bucket',           // placeholder
  dirname: 'default-bucket',
  dirnames: ['default-bucket', 'other-bucket', 'another-bucket']
}

Gulp-rename rewrites the destination path to use the correct bucket.
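To make that concrete, here's a purely illustrative trace of the rename step for a single file, assuming the developer picks 'default-bucket' (the file path is made up):

// Illustrative trace only: gulp.src('dist/**/*.*') picks up
// dist/js/foo.js with base 'dist', so the rename callback sees:
//   path.dirname  === 'js'
//   path.basename === 'foo'
//   path.extname  === '.js'
// With dirname = 'default-bucket', the callback sets
//   path.dirname = 'default-bucket' + '/dist/' + 'js'
// and gulp-awspublish uploads the file under the S3 key:
//   default-bucket/dist/js/foo.js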

We can select multiple buckets by using "checkbox" instead of "list" for the gulp-prompt options, but I'm not sure how to then deliver the same content to multiple buckets.

In a nutshell, if $.prompt returns an array of strings instead of a string, how can I write the source to multiple destinations (buckets) instead of a single bucket?

Please keep in mind that gulp.dest() is not used (only gulp-awspublish is), and we don't know how many buckets might be selected.

Sven Schoenung

I've never used S3, but if I understand your question correctly, a file js/foo.js should be renamed to default-bucket/dist/js/foo.js and other-bucket/dist/js/foo.js when the default-bucket and other-bucket checkboxes are selected?

Then this should do the trick:

// additionally required modules
var path = require('path');
var through = require('through2').obj;

gulp.task('deploy', ['build'], () => {
  if (typeof(config.awsCreds) !== 'object') {
    return console.error('No config.awsCreds settings found. See README');
  }

  var dirnames = []; // array for selected buckets
  const publisher = $.awspublish.create({
    key: config.awsCreds.key,
    secret: config.awsCreds.secret,
    bucket: config.awsCreds.bucket
  });

  return gulp.src('dist/**/*.*')
    .pipe($.prompt.prompt({
      type: 'checkbox', // use checkbox instead of list
      name: 'dirnames', // use different result name
      message: 'Using the ‘' + config.awsCreds.bucket + 
               '’ bucket. Which hostname would you like to deploy to?',
      choices: config.awsCreds.dirnames,
      default: config.awsCreds.dirnames.indexOf(config.awsCreds.dirname)
    }, function (res) {
      dirnames = res.dirnames; // store array of selected buckets
    }))
    // use through2 instead of gulp-rename
    .pipe(through(function(file, enc, done) {
      // push one renamed clone of this file for every selected bucket;
      // the original file itself is never pushed downstream
      dirnames.forEach((dirname) => {
        var f = file.clone();
        f.path = path.join(f.base, dirname, 'dist',
                           path.relative(f.base, f.path));
        this.push(f);
      });
      done();
    }))
    .pipe(publisher.publish())
    .pipe(publisher.cache())
    .pipe($.awspublish.reporter());
});

Notice the comments where I made changes from the code you posted.

What this does is use through2 to clone each file passing through the stream. Each file is cloned once for every bucket checkbox that was selected, and each clone is renamed so that it ends up in a different bucket.
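If you want to sanity-check the cloning behaviour without touching S3, here is a minimal standalone sketch of the same through2 technique that writes the clones to disk with gulp.dest() instead of publishing them (the clone-demo task name, the hard-coded dirnames and the build output directory are all made up for the demo):

var gulp = require('gulp');
var path = require('path');
var through = require('through2').obj;

// Hypothetical demo task: clones every file under dist/ once per
// "bucket" name and writes the results to build/ for inspection.
gulp.task('clone-demo', () => {
  var dirnames = ['default-bucket', 'other-bucket']; // pretend checkbox result

  return gulp.src('dist/**/*.*')
    .pipe(through(function (file, enc, done) {
      dirnames.forEach((dirname) => {
        var f = file.clone();
        f.path = path.join(f.base, dirname, 'dist',
                           path.relative(f.base, f.path));
        this.push(f); // the arrow function keeps `this` bound to the stream
      });
      done(); // the original file is not pushed, only the renamed clones
    }))
    // an input dist/js/foo.js comes out as build/default-bucket/dist/js/foo.js
    // and build/other-bucket/dist/js/foo.js
    .pipe(gulp.dest('build'));
});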
