Processing an array of promises sequentially in node.js

This post describes how you can process a series of promises sequentially, one after another, using Kris Kowal’s Q library in node.js. If you’re just interested in how to do this, and not in the other examples, scroll down to the last code snippet.

Imagine you have an array of filenames, and you want to upload those files to a server. Normally, you’d just fire off the uploads asynchronously and wait for all of them to finish. But what if the server you upload to has restrictions, for example a maximum of one concurrent upload, or a bandwidth limit?

In this post, we’ll be using the Q library to create and process promises, and log4js to get log lines with timestamps easily. First we create an uploader.js module with a function uploadFile that takes a filename and uploads it to a server. For demonstration purposes, this function doesn’t actually upload a file, but simply waits for a random time and then fulfills the promise.

var Q = require('q');
var log4js = require('log4js');
var logger = log4js.getLogger('uploader');

exports.uploadFile = function(filename) {
    var deferred = Q.defer();
    Q.fcall(function() {
        // Simulate an upload: wait between 3 and 7 seconds, then fulfill the promise.
        var delay = Math.random() * 4000 + 3000;
        logger.info("Starting upload: " + filename);
        setTimeout(function() {
            logger.info("Completed upload: " + filename);
            deferred.resolve();
        }, delay);
    });
    return deferred.promise;
};

The following code uploads a single file:

var log4js = require('log4js');
var logger = log4js.getLogger('upload-example-1');
var uploader = require('./uploader');

var filename = 'file1.jpg';

uploader.uploadFile(filename)
    .then(function(result) {
        logger.info("The file has been uploaded.");
    })
    .catch(function(error) {
        logger.error(error);
    });

The output of this script is:

[14:15:28.128] [INFO] uploader - Starting upload: file1.jpg
[14:15:32.169] [INFO] uploader - Completed upload: file1.jpg
[14:15:32.170] [INFO] upload-example-1 - The file has been uploaded.

That’s uploading a single file. This is how you use Q to upload multiple files in parallel:

var Q = require('q');
var log4js = require('log4js');
var logger = log4js.getLogger('upload-example-2');
var uploader = require('./uploader');

var filenames = ['file1.jpg', 'file2.txt', 'file3.pdf'];
var promises = filenames.map(uploader.uploadFile);

Q.allSettled(promises)
    .then(function(results) {
        logger.info("All files uploaded. Results:");
        logger.info(results.map(function(result) { return result.state }));
    })
    .catch(function(error) {
        logger.error(error);
    });

Here, we have an array of filenames, which we turn into an array of promises using the map() method. After that, we use Q.allSettled() to wait until all promises have either been fulfilled or rejected. We don’t use Q.all() here, because that would reject the combined promise as soon as one of the promises is rejected, while we want to wait for all of them to settle; a small illustration of the difference follows below.

This leads to the following output:

[14:49:09.598] [INFO] uploader - Starting upload: file1.jpg
[14:49:09.601] [INFO] uploader - Starting upload: file2.txt
[14:49:09.602] [INFO] uploader - Starting upload: file3.pdf
[14:49:13.788] [INFO] uploader - Completed upload: file3.pdf
[14:49:14.489] [INFO] uploader - Completed upload: file2.txt
[14:49:15.014] [INFO] uploader - Completed upload: file1.jpg
[14:49:15.014] [INFO] upload-example-2 - All files uploaded. Results:
[14:49:15.014] [INFO] upload-example-2 - [ 'fulfilled', 'fulfilled', 'fulfilled' ]

As you can see, the three uploads are started immediately after each other, and run in parallel.
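
To make the difference between Q.all() and Q.allSettled() concrete, here is a small standalone sketch (not part of the upload example), using one promise that is rejected on purpose:

var Q = require('q');

var promises = [
    Q.resolve('ok'),
    Q.reject(new Error('boom')),
    Q.delay(100)
];

// Q.all() rejects the combined promise as soon as any input promise rejects.
Q.all(promises).catch(function(error) {
    console.log('Q.all rejected early: ' + error.message);
});

// Q.allSettled() waits for every promise and reports the state of each one.
Q.allSettled(promises).then(function(results) {
    console.log(results.map(function(result) { return result.state; }));
    // [ 'fulfilled', 'rejected', 'fulfilled' ]
});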

To turn an array of filenames into a chain of sequentially processed promises, we can use reduce(). If you’ve never used reduce() before (see the Array.prototype.reduce() documentation on MDN), this will look a bit weird.

var Q = require('q');
var log4js = require('log4js');
var logger = log4js.getLogger('upload-example-3');
var uploader = require('./uploader');

var filenames = ['file1.jpg', 'file2.txt', 'file3.pdf'];

var lastPromise = filenames.reduce(function(promise, filename) {
    return promise.then(function() {
        return uploader.uploadFile(filename);
    });
}, Q.resolve());

lastPromise
    .then(function() {
        logger.info("All files uploaded.");
    })
    .catch(function(error) {
        logger.error(error);
    });

Let’s go through this. We call filenames.reduce() with two arguments: a callback and an initial value. Because we passed an initial value, reduce() calls the callback once for every element of filenames. The callback receives two arguments. On the first call, the first argument is the initial value and the second argument is the first element of filenames. On every following call, the first argument is the return value of the previous callback invocation, and the second argument is the next element of filenames.

In other words, the first argument of the callback is always a promise, and the second is an array element. We use an already-fulfilled “empty” promise, Q.resolve(), as the seed for this chain.
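
If reduce() is new to you, here is the same mechanism with plain numbers instead of promises (just an illustration, not part of the upload code):

// The accumulator starts at the seed value 0; each call receives the
// previous return value and the next array element.
var sum = [1, 2, 3, 4].reduce(function(accumulator, value) {
    return accumulator + value;
}, 0);
// sum === 10

In the upload example, the accumulator is a promise rather than a number: each callback waits for the previous promise and then returns the next upload’s promise.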

With the upload code above, each step in the reduce() chain only starts when the previous step has completed, as can be seen in the output:

[14:22:35.935] [INFO] uploader - Starting upload: file1.jpg
[14:22:39.814] [INFO] uploader - Completed upload: file1.jpg
[14:22:39.815] [INFO] uploader - Starting upload: file2.txt
[14:22:45.293] [INFO] uploader - Completed upload: file2.txt
[14:22:45.293] [INFO] uploader - Starting upload: file3.pdf
[14:22:48.657] [INFO] uploader - Completed upload: file3.pdf
[14:22:48.658] [INFO] upload-example-3 - All files uploaded.

The code does exactly what we want. The same reduce() pattern can be used to run any series of promise-returning operations sequentially, by putting the right code in the callback to reduce().
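
For example, the pattern can be wrapped in a small reusable helper. The name runSequentially below is just an illustration, not an existing API:

var Q = require('q');

// Calls fn(item) for each item in turn, starting the next call only after
// the promise returned by the previous call has been fulfilled.
// Returns a promise for the whole chain.
function runSequentially(items, fn) {
    return items.reduce(function(promise, item) {
        return promise.then(function() {
            return fn(item);
        });
    }, Q.resolve());
}

// Usage with the uploader from above:
// runSequentially(filenames, uploader.uploadFile)
//     .then(function() { logger.info("All files uploaded."); });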

However, there is one thing left to add: error handling. What if one of the promises is rejected? Let’s try that. We’ll make a failing-uploader.js, which we’ll rig to fail for one specific file.

var Q = require('q');
var log4js = require('log4js');
var logger = log4js.getLogger('failing-uploader');

exports.uploadFile = function(filename) {
    var deferred = Q.defer();
    Q.fcall(function() {
        // Simulate an upload that always fails for file2.txt and succeeds otherwise.
        var delay = Math.random() * 4000 + 3000;
        logger.info("Starting upload: " + filename);
        setTimeout(function() {
            if (filename === 'file2.txt') {
                logger.error("Timeout while uploading: " + filename);
                deferred.reject("Timeout while uploading: " + filename);
            }
            else {
                logger.info("Completed upload: " + filename);
                deferred.resolve();
            }
        }, delay);
    });
    return deferred.promise;
};

It turns out that an error stops the entire chain. When we modify Example 3 by changing require('./uploader') to require('./failing-uploader') and run it, we get:

[18:04:47.576] [INFO] failing-uploader - Starting upload: file1.jpg
[18:04:53.896] [INFO] failing-uploader - Completed upload: file1.jpg
[18:04:53.903] [INFO] failing-uploader - Starting upload: file2.txt
[18:04:59.701] [ERROR] failing-uploader - Timeout while uploading: file2.txt
[18:04:59.702] [ERROR] upload-example-3 - Timeout while uploading: file2.txt
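
If you do want the chain to stop at the first error, but still see which file caused it, one option (a sketch, not code from the examples above) is to catch the rejection inside the reduce() callback and re-reject with the filename attached:

var lastPromise = filenames.reduce(function(promise, filename) {
    return promise.then(function() {
        return uploader.uploadFile(filename)
            .catch(function(error) {
                // Throwing inside the handler rejects the chain again,
                // but now the rejection reason includes the filename.
                throw new Error("Upload of " + filename + " failed: " + error);
            });
    });
}, Q.resolve());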

This might be what you want, or maybe you want to just record the error and continue uploading the other files. In that case, you need to modify the callback to reduce(), for example like this:

var Q = require('q');
var log4js = require('log4js');
var logger = log4js.getLogger('upload-example-4');
var uploader = require('./failing-uploader');

var filenames = ['file1.jpg', 'file2.txt', 'file3.pdf'];
var results = [];

var lastPromise = filenames.reduce(function(promise, filename) {
    return promise.then(function() {
        results.push(true);
        return uploader.uploadFile(filename);
    })
    .catch(function(error) {
        results.push(false);
        logger.error("Caught an error but continuing with the other uploads.");
    });
}, Q.resolve());

lastPromise
    .then(function() {
        // Remove the first result, which is the <true> pushed when the
        // seed promise Q.resolve() fulfilled, not the result of an upload.
        // This is a clumsy way of storing and retrieving the results.
        // Suggestions for improvement welcome!
        results.splice(0, 1);
        logger.info("All files uploaded. Results:");
        logger.info(results);
    })
    .catch(function(error) {
        logger.error("Not all files uploaded: " + error);
    });

This will catch the rejection a level deeper, so the chain can continue. The output of this code is:

[18:15:55.659] [INFO] failing-uploader - Starting upload: file1.jpg
[18:15:59.883] [INFO] failing-uploader - Completed upload: file1.jpg
[18:15:59.884] [INFO] failing-uploader - Starting upload: file2.txt
[18:16:05.279] [ERROR] failing-uploader - Timeout while uploading: file2.txt
[18:16:05.279] [ERROR] upload-example-4 - Caught an error but continuing with the other uploads.
[18:16:05.279] [INFO] failing-uploader - Starting upload: file3.pdf
[18:16:10.600] [INFO] failing-uploader - Completed upload: file3.pdf
[18:16:10.601] [INFO] upload-example-4 - All files uploaded. Results:
[18:16:10.601] [INFO] upload-example-4 - [ true, false, true ]

And indeed, the second upload fails, but the chain continues, and at the end, you know what succeeded and what failed.
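
As the comments in the code admit, collecting results through a shared array and a splice() is a bit clumsy. One possible refinement (a sketch, not the code the output above was produced with) is to record each file’s outcome in the same step that performs its upload, so there is no extra entry for the seed promise:

var lastPromise = filenames.reduce(function(promise, filename) {
    return promise.then(function() {
        // Attach the bookkeeping to this file's own upload promise,
        // so every entry in results corresponds to exactly one file.
        return uploader.uploadFile(filename)
            .then(function() {
                results.push(true);
            })
            .catch(function(error) {
                results.push(false);
                logger.error("Caught an error but continuing with the other uploads.");
            });
    });
}, Q.resolve());

With this version, the final then() can log the results directly, without the splice().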

Our promises have now been processed sequentially, with errors caught and the chain continuing in spite of them.

