Processing an array of promises sequentially in node.js

This post describes how you can process an array of promises sequentially – one after another – using Kriskowal’s Q library in node.js. If you’re just interested in the final solution, and not in the intermediate examples, scroll down to the last code snippet.

Imagine you have an array of filenames, and you want to upload those files to a server. Normally, you’d just fire off the uploads asynchronously and wait for all of them to finish. However, what if the server you upload to has restrictions, for example a maximum of one concurrent upload, or a bandwidth limit?

In this post, we’ll be using the Q library to create and process promises, and log4js to easily get log lines with timestamps. First we create an uploader.js module with a function uploadFile that takes a filename and uploads it to a server. For demonstration purposes, this function doesn’t actually upload a file; it simply waits for a random time and then fulfills the promise.
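A minimal sketch of such a module, using native promises and console.log in place of Q and log4js (the delay range and log messages are assumptions):

```javascript
// uploader.js (sketch) -- "uploads" a file by waiting a random time,
// then fulfilling the returned promise with the filename.
function uploadFile(filename) {
  return new Promise(function (resolve) {
    var delay = Math.floor(Math.random() * 200); // assumed delay range
    console.log('started upload of ' + filename);
    setTimeout(function () {
      console.log('finished upload of ' + filename);
      resolve(filename);
    }, delay);
  });
}

module.exports = { uploadFile: uploadFile };
```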

The following code uploads a single file:
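A sketch of that script, with a stand-in uploadFile inlined so the example is self-contained (filenames and messages are assumptions):

```javascript
// Stand-in for uploader.js's uploadFile: resolves after a short random delay.
function uploadFile(filename) {
  return new Promise(function (resolve) {
    setTimeout(function () { resolve(filename); }, Math.random() * 100);
  });
}

uploadFile('file1.txt')
  .then(function (name) {
    console.log('uploaded ' + name);
  })
  .catch(function (err) {
    console.error('upload failed: ' + err.message);
  });
```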

The output of this script is:

That’s uploading a single file. This is how you use Q to upload multiple files in parallel:
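A sketch of the parallel version, using the native Promise.allSettled() in place of Q.allSettled() (the filenames and stand-in uploader are assumptions):

```javascript
// Stand-in uploader: resolves after a short random delay.
function uploadFile(filename) {
  return new Promise(function (resolve) {
    setTimeout(function () { resolve(filename); }, Math.random() * 100);
  });
}

var filenames = ['file1.txt', 'file2.txt', 'file3.txt'];

// map() turns the filenames into promises; all three uploads start at once.
var promises = filenames.map(uploadFile);

// Wait until every promise has either been fulfilled or rejected.
var settled = Promise.allSettled(promises).then(function (results) {
  results.forEach(function (result, i) {
    console.log(filenames[i] + ': ' + result.status);
  });
  return results;
});
```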

Here, we have an array of filenames, which we turn into an array of promises using the map method. After that, we use Q.allSettled() to wait until all promises have either been fulfilled or rejected. We don’t use Q.all() here, because that would stop processing as soon as one of the promises is rejected.

This leads to the following output:

As you can see, the three uploads are started immediately after each other, and run in parallel.

To turn an array of filenames into a sequentially processed array of promises, we can use reduce. If you’ve never used reduce before (here is the documentation), this will look a bit weird.
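A sketch of the sequential version, with Promise.resolve() standing in for Q.resolve() as the seed; the per-file delays are rigged so that a parallel run would finish out of order, which makes the sequential ordering visible:

```javascript
var delays = { 'file1.txt': 150, 'file2.txt': 100, 'file3.txt': 50 };
var finished = [];

// Stand-in uploader: resolves after the delay configured for each file.
function uploadFile(filename) {
  return new Promise(function (resolve) {
    setTimeout(function () {
      finished.push(filename);
      resolve(filename);
    }, delays[filename]);
  });
}

var filenames = ['file1.txt', 'file2.txt', 'file3.txt'];

var done = filenames.reduce(function (promise, filename) {
  // Start the next upload only after the previous promise is fulfilled.
  return promise.then(function () {
    return uploadFile(filename);
  });
}, Promise.resolve());

done.then(function () {
  console.log('finished in order: ' + finished.join(', '));
  // → finished in order: file1.txt, file2.txt, file3.txt
});
```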

Let’s go through this. We call filenames.reduce() with two arguments: a callback and an initial value. Because we passed a second argument to reduce(), it will call the given callback for each element in filenames. The callback gets two arguments. The first time, the first argument is the initial value and the second argument is the first element of filenames. Each subsequent time, the first argument is the return value of the previous callback invocation, and the second argument is the next element of filenames.

In other words, the first argument of the callback is always a promise, and the second is an array element. We use an “empty promise”, Q.resolve(), as the “seed” for this chain.
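The same reduce() mechanics can be seen with plain numbers, ignoring promises for a moment:

```javascript
// The seed 0 plays the role of Q.resolve(); each call receives the previous
// return value as its first argument and the next element as its second.
var sum = [1, 2, 3, 4].reduce(function (acc, x) {
  return acc + x;
}, 0);
console.log(sum); // → 10
```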

Using this code, each next step in the reduce() chain is only started when the previous step has completed, as can be seen in the output:

The code turns out to do exactly what we want. The above reduce() solution can be used to perform all kinds of promises sequentially, by inserting the right code in the callback to reduce().

However, there is one thing left to add: error handling. What if one of the promises is rejected? Let’s try that. We’ll make a failing-uploader.js, which we’ll rig to fail sometimes.
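A sketch of such a module; for reproducibility this version fails deterministically for one specific file rather than at random, as the post describes:

```javascript
// failing-uploader.js (sketch) -- like uploadFile, but rejects for one
// specific file, so the failure is reproducible here.
function uploadFile(filename) {
  return new Promise(function (resolve, reject) {
    setTimeout(function () {
      if (filename === 'file2.txt') {
        reject(new Error('error uploading ' + filename));
      } else {
        resolve(filename);
      }
    }, 50);
  });
}

module.exports = { uploadFile: uploadFile };
```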

It turns out that an error stops the entire chain. When we modify Example 3 by changing require('uploader') to require('failing-uploader') and run it, we get:
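The stop-on-error behaviour can be reproduced with a self-contained sketch (native promises, with a deterministic failure for file2.txt assumed):

```javascript
var started = [];

// Failing uploader: file2.txt always rejects.
function uploadFile(filename) {
  started.push(filename); // record which uploads were ever started
  return new Promise(function (resolve, reject) {
    setTimeout(function () {
      if (filename === 'file2.txt') {
        reject(new Error('error uploading ' + filename));
      } else {
        resolve(filename);
      }
    }, 50);
  });
}

var filenames = ['file1.txt', 'file2.txt', 'file3.txt'];

// Same sequential chain as before, with no error handling inside the steps:
// the rejection from file2.txt propagates, and file3.txt is never started.
var done = filenames.reduce(function (promise, filename) {
  return promise.then(function () {
    return uploadFile(filename);
  });
}, Promise.resolve()).catch(function (err) {
  console.log('chain stopped: ' + err.message);
  console.log('started: ' + started.join(', ')); // → started: file1.txt, file2.txt
});
```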

This might be what you want, or maybe you want to just register the error and continue uploading the other files. In that case, you need to modify the callback to reduce(), for example like this:
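A sketch of that modified callback, again with native promises and a deterministically failing uploader standing in for Q and failing-uploader.js:

```javascript
// Failing uploader: file2.txt always rejects.
function uploadFile(filename) {
  return new Promise(function (resolve, reject) {
    setTimeout(function () {
      if (filename === 'file2.txt') {
        reject(new Error('error uploading ' + filename));
      } else {
        resolve(filename);
      }
    }, 50);
  });
}

var filenames = ['file1.txt', 'file2.txt', 'file3.txt'];
var results = [];

var done = filenames.reduce(function (promise, filename) {
  return promise.then(function () {
    // Handle the rejection on the uploadFile() promise itself, one level
    // deeper than the chain, so the outer promise is always fulfilled.
    return uploadFile(filename).then(function () {
      results.push(filename + ': ok');
    }, function (err) {
      results.push(filename + ': failed (' + err.message + ')');
    });
  });
}, Promise.resolve()).then(function () {
  console.log(results.join('\n'));
});
```

Because the inner fulfillment/rejection handlers always return a fulfilled promise, the outer chain never sees the rejection and keeps going.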


This will catch the rejection a level deeper, so the chain can continue. The output of this code is:

And indeed, the second upload fails, but the chain continues, and at the end, you know what succeeded and what failed.

Our promises have been processed sequentially, with errors caught and the chain continuing despite them.


This entry was posted in JavaScript, node.js and software development.

6 Responses to Processing an array of promises sequentially in node.js

  1. dc says:

Thanks man, it helped me a lot

  2. Anonymous says:

    I find this to be simpler:

    var funs = data.map(function(x) {
        return function() {
            return asyncFn(x);
        };
    });

    return funs.reduce(q.when, q.resolve());

  3. Adrian G says:

    Joost, thanks! This was quite helpful.

    May I ask what the purpose is of passing through the return value of deferred.resolve(), at the end of the async function within uploader.js?

    https://github.com/kriskowal/q/blob/v1/design/q7.js#L14-L25

    Looking at the source, resolve() seems to return undefined, no? And setTimeout discards its callback’s return value AFAIK.

    Hopefully I’m right, because that would mean I’m finally starting to wrap my head around promises. :)

    Thanks again!

    • Joost says:

      Dear Adrian,

      you are correct. The return statements on lines 13 and 17 of failing-upload.js are useless.

      The reason I put them there is force of habit. Often, in such deferred pieces of code, you verify an important condition, and if it’s false, you want both to stop execution and reject the promise. That is then done by return deferred.reject(errorMsg);. However, here that keyword is not needed.

  4. Zsolt says:

    It was a godsend that I’ve found this article. I’ve applied it to call a stored procedure multiple times with knex.js. Before it always caused a MySQL deadlock so I had to make sure that knex calls the stored procedure only when the previous call finished. It saved the day for me.
