Using Promises With Express JS

A variety of tactics developers use in the wild to promisify their Express apps

2022-01-02

Promisifying Express apps

I firmly believe that the best way to master promises, or any technical skill for that matter, is to practice it repeatedly in a realistic setting. Considering Express is the most popular framework for backend web development by a large margin, what better context in which to sharpen one’s mastery of asynchronous JavaScript?

Caveat: This article assumes beginner-level familiarity with both promisifying JavaScript code and the basics of creating Express APIs.

In this article, we’ll look at a variety of tactics developers use in the wild to promisify their Express apps, give recommendations for when to use this or that approach, and explore complications that can arise in the quest to merge the efficiency of promises with the elegant simplicity of Express.

To begin, let’s start with what is probably the most common intersection of these two skillsets - serving the result of an async operation via an Express endpoint. There are three basic patterns that developers widely apply to this situation. Below, we’ll examine all three and point out the pros and cons of each, using a simple endpoint that reads a file.

Three patterns for async APIs

A simple solution is to write code that takes a callback function, and pass res.end as our callback. This has the benefit of working the same whether the code consuming the callback is asynchronous or not. In fact, looking at the code, we’ll barely notice anything asynchronous is happening:


const fs = require('fs');
const express = require('express');

// Note: despite taking a callback, nothing here is a promise yet -
// and errors are silently dropped, a problem we'll fix shortly
const readFileCallback = (filename, cb) =>
    fs.readFile(filename, 'utf-8', (err, data) => cb(data));

const app = express();

app.post('/get-file', (req, res) => {
    // bind res.end so it keeps its `this` when called as a bare callback
    readFileCallback('important_file.txt', res.end.bind(res));
});

app.listen(3000, () => {
  console.log('Example app listening on port 3000');
});

But hiding the async nature of this code is itself a drawback, because the code is now less explicit. Even worse, we open the door to the infamous ‘callback hell’ antipattern that plagued JS before promises were integrated into the language. And finally, our code now assumes that a callback will be passed, making that the only way to access the data it serves.

So let’s redo this, by wrapping it all in a promise. And we’ll add error handling while we’re at it.


const readFilePromise = (filename) => new Promise((resolve, reject) => {
    fs.readFile(filename, 'utf-8', (err, data) => {
        if (err) {
            reject(err);
        } else {
            resolve(data);
        }
    });
});

const app = express();

app.post('/get-file', (req, res) => {
    readFilePromise('important_file.txt')
        .then(data => res.end(data))
        .catch(err => res.end(`could not serve data: ${err}`))
});

Much cleaner, easier to follow, and simpler to immediately identify as asynchronous code. But can we make it even cleaner with the async/await syntax introduced in ES2017?


app.post('/get-file', async (req, res) => {
    const text = await readFilePromise('important_file.txt');
    res.end(text);
});

Great, we can pass anonymous functions that return a promise, so our endpoint handler can be async with no problem. Be careful when doing this, however: any middleware that runs after this handler via next() is now dealing with a promise, not a conventional sync function. And that, in fact, brings us to what used to be the biggest drawback of mixing promisified functions with Express endpoints.

Error handing with Express promises

Before Express 5, no error was triggered by Express upon a promise rejection. Express’s error-handling middleware (the four-argument (err, req, res, next) functions that run after the endpoint callback itself) never saw the rejection, making debugging Express promise callbacks a massive pain. However, with the release of Express 5, this is no longer an issue: rejected promises in async handlers are automatically forwarded to the error-handling middleware. This can confuse newbies because so much writing on the topic is outdated. In modern Express, you can confidently use promises without special accommodations for debugging and error handling.

That said, in case you ever have to write code that is backwards compatible with older versions of Express, or support a legacy project pinned to an old Express release, you can work around this by catching the error yourself and passing it to next(), like so:


app.post('/get-file', async (req, res, next) => {
    try {
        const text = await readFilePromise('important_file.txt');
        res.end(text);
    } catch(err) {
        return next(err);
    }
});

The need for this kind of hacky construct is becoming less and less necessary every day, but it’s good to be aware of its existence. It makes debugging vastly less headachy when working on legacy Express APIs. Even if you know your code well, you can always make life easier for the next engineer who comes along and has to modify or debug the code you write.

If you are stuck with a legacy Express project and want to learn way more about dealing with those scenarios, check out the article Using Promises with Express by Ben Lugavere. It’s outdated in almost all respects, but if you’re working on a legacy project it might be just what you need.

To learn about more advanced error handling strategies in modern Express apps, there are many high quality examples on the Express 5 Error Handling docs.

Performance

Promises add overhead. New functions are defined and called that wouldn’t exist in purely synchronous code, and the event loop is used in more complex ways. All of this means that, unless you will be doing something else while awaiting your promise, it’s often best to skip them entirely. (Keep in mind, though, that synchronous I/O blocks Node’s event loop, so in a server this trade-off only makes sense for operations that are genuinely fast.)

For example, our exciting file-serving API above could be written to run quicker, and with less code, like so:


app.post('/get-file', (req, res) => {
    const fileContents = fs.readFileSync('important_file.txt');
    res.end(fileContents);
});

And we can prove it! Let’s compare the performance of reading a file synchronously with the same operation done asynchronously. First, we need to write some code that reads a file both ways, timing each operation:


const fs = require('fs');

// Node doesn't have top level await outside of modules,
// (unlike Chrome and Firefox) so we have to wrap
// everything in an async function
(async () => {
    // Let's time the async version first
    console.time('async');
    await new Promise((resolve, reject) =>
        fs.readFile('examples.js', (err, data) => {
            if (err) reject(err);
            else resolve(data);
        })
    );
    console.timeEnd('async')
    
    // Now we'll time the synchronous version
    console.time('sync');
    fs.readFileSync('examples.js');
    console.timeEnd('sync');
})()

Let’s run it in our terminal and see how the times come out:

$ node examples.js 
async: 1.075ms
sync: 0.074ms

Look at that: the synchronous code not only wins, it’s more than ten times faster!

So in other words, we only want to promisify the parts of a program that will really benefit from it, and keep most of the logic synchronous when it’s not otherwise beneficial to move it into promises.

There are other tricks to optimizing promises for peak performance. Pending promises in particular are killer for performance, and can even cause memory leaks if they never resolve. One hacky way to get around this is to have a setInterval'd daemon run in the background, periodically killing off pending promises that haven’t resolved after some obscene amount of time has passed. But a better way to implement this is within the promise itself, by rejecting the promise after a certain number of seconds using setTimeout. To demonstrate this, we can develop a proof of concept class, called TimeoutPromise, which implements promises that auto-reject after a certain amount of time passes.


class TimeoutPromise extends Promise {
    constructor(timeout, callback) {
        super((resolve, reject) => {
            // setTimeout takes the callback first, then the delay
            setTimeout(() => reject(new Error('Promise timed out.')), timeout);
            callback(resolve, reject);
        });
    }

    // Let .then()/.catch() return plain Promises, since those internal
    // calls can't use our two-argument constructor
    static get [Symbol.species]() {
        return Promise;
    }
}

But wait, won’t this call reject after timeout milliseconds no matter what? If the promise resolves successfully, we’ll still call reject some amount of time later! That’s actually totally fine. A promise can only settle once, so if it resolves and then later rejects, the rejection will be ignored. Only the first settlement counts; any further calls to resolve() or reject() will have no effect whatsoever.

Just for the sake of clarity, let’s see how this promise would look integrated with our API:


class TimeoutPromise extends Promise {
    constructor(timeout, callback) {
        super((resolve, reject) => {
            // setTimeout takes the callback first, then the delay
            setTimeout(() => reject(new Error('Promise timed out.')), timeout);
            callback(resolve, reject);
        });
    }

    // Let .then()/.catch() return plain Promises, since those internal
    // calls can't use our two-argument constructor
    static get [Symbol.species]() {
        return Promise;
    }
}

const readFilePromise = (filename) => {
    return new TimeoutPromise(1200, (resolve, reject) => {
        fs.readFile(filename, 'utf-8', (err, data) => {
            if (err) {
                reject(err);
            } else {
                resolve(data);
            }
        });
    })
};

const app = express();

app.post('/get-file', async (req, res) => {
    res.end(await readFilePromise('important_file.txt'));
});

Isn’t it marvelous how easily even fundamental object classes in JavaScript lend themselves to subtle and creative extensions? Given the current trend towards purely functional programming, it’s easy to forget how powerful object oriented features can be in terms of extending and mutating existing object types.

However, in a real production environment, a class like this will quickly accrete new functionality. The reason is feature creep: there are so many small ways pending promises can be subtly optimized for peak performance.

For example, some pending promises have identical input, so it’s arbitrary which finishes first and the results might as well be identical, right? How cool would it be if we could recycle pending promises running simultaneously, reducing the number of pending promises just hanging out in the background sucking up memory? Well, allow me to introduce you to the wonderful world of…

Async Express Libraries

Pending Promise Recycler detects identical simultaneous promises and merges them into a single promise, thus reducing your code’s memory footprint and improving the response time of your API. Wonderful. This is pretty much mandatory if you have a sufficiently large-scale Express API project that you intend to keep up for a considerable amount of time. Which, let’s face it, in the world of backend is pretty much every API ever written.

But when it comes to libraries that hold your hand into the spiderweb of complexity that is asynchronous Express programming, Pending Promise Recycler is the tip of the iceberg. Here is a brief overview of the major relevant dependencies you can consider, and how they make life easier:

  • Express Async Errors. Brings Express 5-style automatic promise rejection handling to older versions of Express.
  • Express Async Handler. Similar to Express Async Errors, this library catches rejected promises behind the scenes and converts them into proper Express errors for easier debugging.
  • Express Promise. Integrates a promise-based domain-specific language for querying databases directly into Express.
  • Express Async Router. Implements a promise-based alternative to the popular Express Router package.

As always, we should expect the JavaScript dependency scene to be crowded, even for very specific use cases. Make sure any dependencies you use are still actively updated for security vulnerabilities, and run npm audit to review dependencies for possible issues. Often security alerts are for vulnerable subdependencies, and you can review the code yourself to see if the vulnerability is in a library that’s actually being used in an exploitable way, or fork the library temporarily to use a nonvulnerable, newer version of the affected dependency. Good luck!

Conclusion

Due to the modular, simple software development interface of Express, and the clean approach ES6 took when implementing promises, combining the two is easy. Promisifying Express code is not substantially more advanced than promisifying any complex JavaScript project. But some aspects of Express do make it at least slightly different than just another JavaScript app. For example:

  1. It’s backend. This means we sacrifice certain functionality, like the browser’s top-level await, and have to think about compatibility with the specific system we’re running on. File permissions, for example, do not work identically on Windows and Unix-based systems.
  2. It stays running for a long time. Often, APIs are intended to run indefinitely. This makes memory leaks a massive concern: if pending promises are created faster than they are destroyed, we will eventually gobble up the system’s entire memory capacity. And even worse, the slower we do so, the harder it will be to detect and debug (as the proverb says - fail fast, fail early).
  3. Express does a lot of ‘magic’ behind the scenes. As an opinionated framework, Express expects your use case to fall within certain common patterns. When new approaches are introduced before Express can integrate them properly, awkwardness can ensue. The difficulty of using promises with good error handling in older versions of Express is a perfect example of this tendency.

Nevertheless, while these caveats are good to keep in mind to avoid overconfidence and stay skeptical towards your own code, promisifying Express is mostly a solved problem that should be a joy as long as you follow proper guidance and best practices, such as those we’ve outlined here.

Promises and Express are two of the joys of using Node and JS in general. So have fun, and stay async!