From Promises to Async/Await: Navigating the Evolution of Asynchronicity in JavaScript
From callbacks to async/await - a journey through the evolution of asynchronicity in JavaScript.
Note: this piece was originally written as part of my interview for SuperTokens.
Let’s face it - thinking asynchronously about code can be challenging. It certainly doesn’t come as naturally as thinking about code linearly, where each line produces or executes something at the point where it’s written.
Still, asynchronous programming, particularly in JavaScript, comes with real benefits. Most of them relate to performance and user experience, the most important being efficient resource utilization.
For example, imagine a fleet of ships where each ship explores a different island simultaneously, rather than waiting for one to return before sending out the next. This parallel exploration makes the discovery process far more efficient. Put practically, when we avoid blocking the main thread (the one rendering the user interface, in the context of JavaScript) with long, intensive tasks, we provide a better user experience. We can do more than one thing at a time without waiting for a processing-heavy task to complete, resulting in better perceived performance.
Intro
In JavaScript, we have plenty of tools for making code run asynchronously and, by extension, more efficiently. One could easily argue that today’s most popular method of running code asynchronously in JavaScript is via async/await.
We’ll need some historical context first to get to async/await, how it works, and why it works that way.
First came Callbacks
In the “olden” days, we had callbacks. A callback is a function that you pass into another function as an argument (or, sometimes, a value), which is then invoked inside the outer function to complete some task. Think of it as saying, “Once you’re done with your task, call me back with the results.” This mechanism allows you to execute code after an asynchronous operation, such as downloading data from the internet or reading files from the disk. This ensures that specific tasks are performed at the right time in your program’s flow. It’s like asking a friend to call you back once they have the information you need so you can decide what to do next based on that information.
Callbacks are probably best explained via a practical example:
// Define a callback function that we want to run after a delay
function greet() {
console.log("Hello, World!");
}
// Use setTimeout to delay the execution of the callback function 'greet' by 2000 milliseconds (2 seconds)
setTimeout(greet, 2000);
// This line will execute immediately, showing that the rest of your code doesn't have to wait for the timeout
console.log("Waiting to greet...");
// Output order: 'Waiting to greet...' -> 'Hello, World!'.
MDN Link for reference: https://developer.mozilla.org/en-US/docs/Glossary/Callback_function
This approach worked pretty well until we started putting callbacks into callbacks into callbacks. The developer experience of that approach left a lot to be desired. Enter “callback hell,” AKA “the pyramid of doom”:
setTimeout(function () {
console.log("First task done!");
setTimeout(function () {
console.log("Second task done!");
setTimeout(function () {
console.log("Third task done!");
setTimeout(function () {
console.log("Fourth task done!");
// This nesting can continue, making the code harder to read and manage
}, 1000); // Fourth task delay
}, 1000); // Third task delay
}, 1000); // Second task delay
}, 1000); // First task delay
// Output order: 'First task done!' -> 'Second task done!' -> 'Third task done!' -> 'Fourth task done!'
We had to do better. Imagine having to find a bug in the 6th nested callback in a chain of `fetch` calls, where each one processes its data separately and passes it on to the next callback. Deep nesting generally makes code more challenging to read than code written linearly.
We found part of the answer to that problem in `Promise`s, the next stop in our async journey.
Then came Promises
A `Promise` in JavaScript is like a contract for asynchronous operations, guaranteeing that, eventually, you’ll get a result: either a successful outcome (resolved) or an error (rejected). Promises made chaining asynchronous operations more straightforward and readable, with a theoretically well-defined route for error handling.
Translated into code, a `Promise` is a built-in object in virtually every JavaScript runtime out there (browsers, Node, Deno, Bun) that has the following shape and arguments:
// Creating a new Promise
const myPromise = new Promise((resolve, reject) => {
// Asynchronous operation
const condition = true; // This can be any async condition, e.g., fetching data
if (condition) {
resolve("Promise resolved!"); // Successfully complete the promise
} else {
reject("Promise rejected!"); // Fail the promise
}
});
MDN Link for reference: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Promise
We addressed the “pyramid of doom” with Promises - ironically, with callbacks. Standardized ones, that is. The `Promise` constructor takes a single executor function as its argument; that executor, in turn, receives two callbacks, `resolve` and `reject`, representing a successful operation and an error, respectively.
On the outside of a `Promise`, i.e., when we consume it, we get an object with three methods: `.then`, `.catch`, and `.finally`.
- `.then` fires after `resolve`, receiving the value we passed to `resolve`.
- `.catch` relates to `reject` similarly - it lets us react to the error that caused the `Promise` to be rejected.
- `.finally` fires when the `Promise` is settled. Settled means either resolved or rejected, so it lets us avoid writing duplicate code in both `.then` and `.catch`.
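Put together, here’s a minimal sketch wiring up all three handlers (the condition is made up for illustration):

```javascript
// A Promise that either resolves or rejects based on a condition
const flaky = new Promise((resolve, reject) => {
  const ok = true; // flip to false to see the .catch branch instead
  if (ok) {
    resolve("it worked");
  } else {
    reject(new Error("it failed"));
  }
});

flaky
  .then((value) => console.log("then:", value)) // runs on resolve
  .catch((error) => console.error("catch:", error.message)) // runs on reject
  .finally(() => console.log("finally: settled either way")); // runs in both cases
```

With `ok` set to `true`, this logs `then: it worked` followed by `finally: settled either way`.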
To continue with the `setTimeout` examples from before:
// Creating a new Promise that resolves after 2 seconds
const waitForIt = new Promise((resolve, reject) => {
setTimeout(() => {
resolve("Promise fulfilled!");
}, 2000);
});
// Using the Promise
waitForIt
.then((message) => console.log(message))
.catch((error) => console.error(error));
console.log("The promise is made...");
// Output order: 'The promise is made...' -> 'Promise fulfilled!'
But wait, you might be thinking: this snippet of code is more verbose than the initial callback example. And you’re absolutely correct! The most significant benefit of using promises is the ability to chain them without falling into “callback hell.” In effect, a promise handler can return another promise, which you can then chain via a call to `.then(...)`:
function fetchData() {
// Simulate fetching data from an API with a promise
return new Promise((resolve) =>
setTimeout(() => resolve("Data from API"), 1000)
);
}
function processData(data) {
// Simulate processing data with a promise
return new Promise((resolve) =>
setTimeout(() => resolve(data + " processed"), 1000)
);
}
function saveData(data) {
// Simulate saving data with a promise
return new Promise((resolve) =>
setTimeout(() => resolve(data + " and saved"), 1000)
);
}
// Start the chain by fetching data
fetchData() // fetchData returns a promise
.then((data) => {
console.log(data); // Logs "Data from API"
return processData(data); // Continue the chain by processing the fetched data via another promise
})
.then((processedData) => {
console.log(processedData); // Logs "Data from API processed"
return saveData(processedData); // Continue the chain by saving the processed data promise
})
.then((savedData) => {
console.log(savedData); // Logs "Data from API processed and saved"
})
.catch((error) => {
console.error("An error occurred:", error);
});
console.log("Asynchronous operations started...");
It is undoubtedly an improvement over callbacks. It’s still a bit verbose, sure, but at least it reads linearly, and we get to chain multiple sync or async operations while letting, for example, the render thread in the browser work uninterrupted.
Honorable mention - thenables
Before we move on, let’s take a slight detour. Promises weren’t exactly standardized from the get-go. Library authors sometimes had different ideas about how Promises should work before Promises (as we know them today) became a standard part of the JavaScript language. However, most did agree that a Promise should at least be chainable via `.then`. Thus, the idea of “thenables” was born.
You can think of a thenable as a Promise-like object: any object that implements a `.then` method callable with two input arguments (callbacks), `onFulfilled` and `onRejected`.
A thenable looks like this:
const aThenable = {
then(onFulfilled, onRejected) {
onFulfilled({
// The thenable is fulfilled with another thenable
then(onFulfilled, onRejected) {
onFulfilled(42);
},
});
},
};
Promise.resolve(aThenable); // A promise fulfilled with 42
MDN link for reference: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Promise#thenables
Even though thenables are different, they are still callable and chainable, much like we discussed in the examples above.
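Since thenables are chain-compatible, we can pass one through `Promise.resolve` and consume it with `.then` like any native Promise. A minimal sketch:

```javascript
// A minimal thenable that immediately fulfills with 42
const simpleThenable = {
  then(onFulfilled, onRejected) {
    onFulfilled(42);
  },
};

// Promise.resolve "adopts" the thenable, handing us a real Promise
Promise.resolve(simpleThenable).then((value) => {
  console.log(value); // 42
});
```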
But we can still do better.
The path to async / await
Phew, `async`/`await` at last. Historically speaking, the biggest reason `async` and `await` were introduced (and became popular) in JavaScript is that even though they help you run asynchronous code, they read like synchronous code. To drive that point home, let’s consider `async` and `await` separately.
Async
When we talk about `async` in the context of JavaScript, we’re talking about the declaration: `async function` (alternatively, `const f = async () => {...}`).
Declaring a function with the `async` keyword creates an `AsyncFunction` object (reference here). Each time our async function is called, it returns a new `Promise`.
In practical terms:
async function foo() {
return 1;
}
is similar to:
function foo() {
return new Promise(function (resolve, reject) {
return resolve(1);
});
}
There are some subtle differences between the two (link for the curious), but for the purposes of understanding async/await, those would be out of the scope of this article.
So, it produces a `Promise` (which is asynchronous) but reads like synchronous code.
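The same wrapping applies to errors: throwing inside an `async` function behaves like returning a rejected `Promise`. A small sketch:

```javascript
async function boom() {
  throw new Error("boom"); // behaves like: return Promise.reject(new Error("boom"))
}

// The thrown error surfaces through the returned Promise's rejection path
boom().catch((error) => console.log("caught:", error.message)); // caught: boom
```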
Next, let’s have a look at `await`.
Await
To quote MDN:
The await operator is used to wait for a Promise and get its fulfillment value. It can only be used inside an async function or at the top level of a module (reference here).
In effect, using `await` on a `Promise` “unpacks” its value, similar to getting a return value from a regular function call. But there’s a catch - we can’t just put the `await` operator wherever we want. It has to be either:
- in the body of an `async` function,
- or at the top level of a module.
Put practically, we can easily call both versions of `foo()` defined in the examples above like this:
async function main() {
await foo(); // Returns 1 for both versions of foo defined above
}
// assuming we're not running inside a module;
// if we were, just `await foo();` would have sufficed.
main();
Async and Await combined
Let’s convert our (admittedly lengthy) example from plain `Promise`s to `async`/`await`:
function fetchData() {
// Simulate fetching data from an API with a promise
return new Promise((resolve) =>
setTimeout(() => resolve("Data from API"), 1000)
);
}
function processData(data) {
// Simulate processing data with a promise
return new Promise((resolve) =>
setTimeout(() => resolve(data + " processed"), 1000)
);
}
function saveData(data) {
// Simulate saving data with a promise
return new Promise((resolve) =>
setTimeout(() => resolve(data + " and saved"), 1000)
);
}
async function run() {
console.log("Asynchronous operations started...");
try {
const data = await fetchData();
const processedData = await processData(data);
const savedData = await saveData(processedData);
  } catch (error) {
    console.error("An error occurred:", error);
}
}
run();
Much easier to follow, right? Without resorting to nesting functions inside functions, or nesting `Promise`s inside `.then` chains, we have neat-looking, linear code that reads much the same way synchronous code would. In fact, you can treat it as such for most use cases out there.
Converting the functions to async functions
We could go a step further and convert the `function`s to `async function`s:
async function fetchData() {
// Simulate fetching data from an API with a promise
const data = await new Promise((resolve) =>
setTimeout(() => resolve("Data from API"), 1000)
);
return data;
}
async function processData(data) {
// Simulate processing data with a promise
const processedData = await new Promise((resolve) =>
setTimeout(() => resolve(data + " processed"), 1000)
);
return processedData;
}
async function saveData(data) {
// Simulate saving data with a promise
const savedData = await new Promise((resolve) =>
setTimeout(() => resolve(data + " and saved"), 1000)
);
return savedData;
}
async function run() {
console.log("Asynchronous operations started...");
try {
const data = await fetchData();
const processedData = await processData(data);
const savedData = await saveData(processedData);
  } catch (error) {
    console.error("An error occurred:", error);
}
}
run();
But it wouldn’t change how the functions work beyond making the code more verbose - we were already returning a `Promise`.
Note that, sometimes, it might be better to return a plain `Promise`. `async`/`await` works just fine with those.
Practical examples and pitfalls
Now, with all that theory behind us, let’s see two practical usage examples of our newly acquired `async`/`await` skills.
Fetch
One of the most common situations where you’ll see the `async`/`await` combo is fetching data from an API. For the purposes of having a test API to call, we’ll use jsonplaceholder.
Let’s imagine we’re trying to fetch some user data:
async function fetchUserData(userId) {
// Fetch the data
const response = await fetch(
`https://jsonplaceholder.typicode.com/users/${userId}`
);
// Assuming our API returns JSON, parse the response as JSON
const userData = await response.json();
return userData;
}
async function run() {
// Feel free to change these to experiment and see how it works
const user = await fetchUserData(1);
console.log(user);
}

run();
The way this works is similar to what we discussed in the theoretical examples above:
- `fetch` already returns a `Promise` (source), which resolves with a `Response` object.
- `response.json()` returns another `Promise`, which resolves with the parsed body. These two promises are a common source of confusion, but the double `await` is not a mistake: first we wait for the response, then we wait for the response to be parsed so we can consume the API’s data.
- Once we’ve parsed it, we return the JSON data.
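One refinement worth noting: `fetch` only rejects on network failures, not on HTTP error statuses like 404 or 500, so it’s common to check `response.ok` before parsing. A sketch against the same test endpoint:

```javascript
async function fetchUserDataSafe(userId) {
  const response = await fetch(
    `https://jsonplaceholder.typicode.com/users/${userId}`
  );
  // fetch resolves even for 404/500 responses; only network errors reject
  if (!response.ok) {
    throw new Error(`HTTP error: ${response.status}`);
  }
  return response.json();
}
```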
Reading a file
Another common usage you might encounter is file operations (most often in node). Instead of having to deal with a callback, we can use the Promisified version of the file system utilities to, for example, read files:
import { readFile } from "node:fs/promises";

const filePath = "./data.json"; // example path

try {
  const data = await readFile(filePath, "utf8");
  console.log(JSON.parse(data));
} catch (error) {
  console.error("there was an error:", error.message);
}
Note: This example uses the ES module syntax, and top-level `await` is only available in ES modules. To convert it to CommonJS, you’d change the `import` statement to `require` and wrap the `await` call in an `async` function. More here.
But `async`/`await` comes with its own common set of errors and easily avoidable mistakes. Let’s have a look at some of those:
Forgetting await
While it may seem obvious, it can be a source of a lot of weird behavior - and I’ve seen it often (my code included ☺️). Consider the following example:
async function fetchData() {
const data = fetch("https://jsonplaceholder.typicode.com/todos"); // Missing 'await'
console.log(data); // Logs a Promise, not the actual data
}
async function fetchData2() {
  const data = await fetch("https://jsonplaceholder.typicode.com/todos");
  console.log(data); // Still not the actual data - this logs the Response object; we'd need to await data.json()
}
// Correct usage
async function fetchDataCorrectly() {
const response = await fetch("https://jsonplaceholder.typicode.com/todos"); // Correctly using 'await'
const data = await response.json(); // Assuming the response is JSON
console.log(data); // Logs the actual data
}
Worst of all, the wrong variations won’t produce an error at all. They’ll quietly hand you a pending `Promise` (or an unparsed `Response`) instead of the expected data.
Improper Error Handling
Using `async`/`await` without try-catch blocks can lead to unhandled `Promise` rejections.
Consider this:
async function fetchData() {
const response = await fetch("https://jsonplaceholder.typicode.com/todos"); // This might fail
const data = await response.json();
console.log(data);
}
// fetchData(); If the request fails, this will result in an unhandled promise rejection
// Correct usage
async function fetchDataCorrectly() {
try {
const response = await fetch("https://jsonplaceholder.typicode.com/todos");
const data = await response.json();
console.log(data);
} catch (error) {
console.error("Failed to fetch data:", error);
}
}
When handling errors, it’s helpful to give our users some idea of what went wrong. In the first case, the error only shows up in the developer console - the user sees nothing. In more complex cases, this gets worse, especially when we nest multiple levels of promises in `async`/`await` calls. Make sure you’re extra careful with error handling, particularly in async land.
Improper loop usage
However, the most confusing corner of `async`/`await` is how they behave in loops. Let’s start with `forEach`:
array.forEach(async (item) => {
await someAsyncFunction(item); // This will not wait
console.log(`Processed ${item}`);
});
A common mistake is to use `await` directly inside `.forEach`, which won’t work as expected because `.forEach` doesn’t wait for the asynchronous operations to complete.
However, this should be fine if the goal is to perform side effects without caring about the order or completion. We can find similar behaviors with other iteration functions as well.
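If you do need to wait for every item to finish (and the per-item work can safely run concurrently), a common pattern is to map the array to an array of Promises and `await` them together. A sketch with a stand-in async function:

```javascript
// Stand-in for any per-item async work
const someAsyncFunction = (item) =>
  new Promise((resolve) => setTimeout(() => resolve(item * 2), 100));

async function processAll(array) {
  // .map fires all the async calls at once; Promise.all waits for every one
  const results = await Promise.all(array.map((item) => someAsyncFunction(item)));
  console.log(results); // results keep the original array order
  return results;
}

processAll([1, 2, 3]); // logs [ 2, 4, 6 ] once everything settles
```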
As a rule of thumb, it’s usually better to go with the classical ways of iterating in JS - `for`, `for..of`, `while`, etc.:
for (let i = 0; i < array.length; i++) {
await someAsyncFunction(array[i]);
console.log(`Processed ${array[i]}`);
}
// -- or, for..of --
for (const item of array) {
await someAsyncFunction(item);
console.log(`Processed ${item}`);
}
These will work as expected, with an important caveat: they block on each iteration. The next item won’t get processed until the current iteration’s `Promise` resolves. This has implications for perceived speed, and it naturally leads us to a discussion about performance, perceived performance, and parallel versus serial processing.
Performance, order of execution, and parallelism
One commonly overlooked aspect of `await` is that it blocks the lines that follow from executing until the `Promise` it awaits is resolved (or rejected). Used carelessly, this defeats the purpose of asynchronicity - wasn’t the idea to make things seem smoother and faster for our users?
The practical answer concerns our strategy when approaching asynchronous problems with `async`/`await`. Half of it lies in the order of execution (i.e., at which point we make the code `await`), and the other half in running things in parallel when possible.
Order of execution
Imagine this (reasonably common) scenario: before rendering your UI, you have to wait for a big request to be resolved from the API. Or, maybe even more than one request from more than one API.
If we approach this in a straightforward order, we’d probably do something like:
- `await` a `fetch` (or fetches)
- render the entire UI once the data is back
This approach risks our UI feeling too slow, or even broken, to our end-users. A better approach would be:
- render what can be rendered without the data
- show a loading message and placeholder elements where the API data should go once fetched
- `await` a `fetch` (or multiple fetches)
  - in the case of multiple `fetch` requests, render the pieces as they arrive in their appropriate UI elements
- render the rest of the UI
Using this approach, we provide a better experience with better perceived speed. The time a request takes to resolve matters less when the UI shows that something is happening. That’s the true power of `async`/`await` - controlling when the linear execution blocks, and for what reason. Compared to the developer experience we got with callbacks, and even promises, this certainly feels better.
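The steps above can be sketched in code. All the `render*` helpers and `fetchUsers` here are hypothetical placeholders for your UI and data layers:

```javascript
// Hypothetical data source; swap in a real fetch call
const fetchUsers = () =>
  new Promise((resolve) => setTimeout(() => resolve([{ id: 1 }, { id: 2 }]), 500));

// Hypothetical rendering helpers
const renderShell = () => console.log("static shell rendered");
const renderSkeleton = () => console.log("loading skeleton shown");
const renderUsers = (users) => console.log(`rendered ${users.length} users`);

async function startUp() {
  renderShell(); // 1. render what needs no data
  renderSkeleton(); // 2. show placeholders while we wait
  const users = await fetchUsers(); // 3. block only on the data itself
  renderUsers(users); // 4. fill the placeholders once the data arrives
}

startUp();
```

The key point is that the only `await` sits after the cheap, data-free rendering has already happened.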
But we still have to cover the multi-request scenario. In cases where we are waiting for responses from multiple data endpoints, we can add yet another weapon to our async repertoire - parallelism.
Parallelism
Let’s imagine we need to request users, photos, and posts to render the important parts of an imaginary app. Using what we talked about so far, we’d probably come up with something like this:
async function fetchData() {
try {
// URLs for the JSONPlaceholder API endpoints
const usersUrl = "https://jsonplaceholder.typicode.com/users";
const photosUrl = "https://jsonplaceholder.typicode.com/photos";
const postsUrl = "https://jsonplaceholder.typicode.com/posts";
const usersRequest = await fetch(usersUrl);
const photosRequest = await fetch(photosUrl);
const postsRequest = await fetch(postsUrl);
const users = await usersRequest.json();
const photos = await photosRequest.json();
const posts = await postsRequest.json();
return {
users,
photos,
posts,
};
} catch (error) {
console.error("Failed to fetch data:", error);
}
}
// RENDER BASIC UI
// RENDER LOADING SKELETON
const data = await fetchData(); // assuming a module context, so top-level await is available
// RENDER THE DATA-BOUND UI
This snippet will work fine, but we’re doing things inefficiently - we block on the users request, then on the photos request, then on the posts request... you get the idea. A total of six `await` calls with no mutual interdependence. Not ideal.
Luckily, we can fix that. On the one hand, a `Promise`, like most other things in JavaScript, can be put inside an array. On the other hand, the `Promise` object comes with some neat built-in static methods - namely, `Promise.all` (reference here). Combining those two things allows us to fire the `fetch` requests in parallel, as opposed to awaiting them one by one:
async function fetchData() {
try {
// URLs for the JSONPlaceholder API endpoints
const usersUrl = "https://jsonplaceholder.typicode.com/users";
const photosUrl = "https://jsonplaceholder.typicode.com/photos";
const postsUrl = "https://jsonplaceholder.typicode.com/posts";
// Create an array of promises using fetch() for each URL
const promises = [
fetch(usersUrl).then((response) => response.json()),
fetch(photosUrl).then((response) => response.json()),
fetch(postsUrl).then((response) => response.json()),
];
// Use Promise.all to wait for all promises to resolve
const [users, photos, posts] = await Promise.all(promises);
return {
users,
photos,
posts,
};
} catch (error) {
console.error("Failed to fetch data:", error);
}
}
// RENDER BASIC UI
// RENDER LOADING SKELETON
const data = await fetchData(); // assuming a module context, so top-level await is available
// RENDER THE DATA-BOUND UI
One neat trick is that `.then` and `async`/`await` aren’t mutually exclusive. We can mix them, as shown in the `promises` array - chaining `.then` onto each `fetch` saves us from writing two `await` calls per request: one for the request itself and one for parsing the response.
With our promises neatly tucked into an array, we can call `Promise.all`, await once instead of six times, and run the requests in parallel instead of one by one, resulting in data being fetched and rendered faster. All while making our code less verbose and easier to follow. It’s a win for everyone.
Some caveats when it comes to parallelization
Of course, there’s no one-size-fits-all. The parallel approach only works when there’s no data interdependence between the requests. Additionally, it’s worth looking into `Promise.any` and `Promise.allSettled` - depending on your use case, they may work better than `Promise.all`. More here and here.
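For instance, `Promise.allSettled` never short-circuits on a rejection - it reports each outcome individually, which is handy when partial results are still useful. A small sketch with made-up tasks:

```javascript
async function loadEverything() {
  // Simulated requests: one of them fails
  const outcomes = await Promise.allSettled([
    Promise.resolve("users loaded"),
    Promise.reject(new Error("photos failed")),
    Promise.resolve("posts loaded"),
  ]);
  for (const outcome of outcomes) {
    if (outcome.status === "fulfilled") {
      console.log("ok:", outcome.value); // fulfilled entries carry .value
    } else {
      console.warn("failed:", outcome.reason.message); // rejected entries carry .reason
    }
  }
  return outcomes;
}

loadEverything();
```

With `Promise.all`, the single rejection above would have discarded the two successful results; here we keep them.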
Takeaways and the journey ahead
While `async`/`await` is undoubtedly a developer experience improvement over plain `Promise.then`, choosing the right tool for the job is essential. Using a “regular” `Promise.then` where appropriate is certainly not a mistake.
Additionally, the execution control flow afforded by `async`/`await` lets us ship better user experiences - if we’re careful about where we block, and run things in parallel where appropriate.
Be extra careful with errors - unhandled rejected Promises are bad for the user experience. They can also be hard to pinpoint and debug, especially in nested `async`/`await` blocks.
One part of the async tool evolution in JavaScript that often gets overlooked (and misunderstood) is generators. They deserve an article of their own but are a valuable tool to have in your asynchronous toolbelt, especially when it comes to asynchronous iteration.
The journey in asynchronicity in JavaScript is undoubtedly far from over. Embrace the tools at your disposal, experiment with different approaches, and happy hacking!