By saadq


2016-06-01 18:55:58 8 Comments

Are there any issues with using async/await in a forEach loop? I'm trying to loop through an array of files and await on the contents of each file.

import fs from 'fs-promise'

async function printFiles () {
  const files = await getFilePaths() // Assume this works fine

  files.forEach(async (file) => {
    const contents = await fs.readFile(file, 'utf8')
    console.log(contents)
  })
}

printFiles()

This code does work, but could something go wrong with it? Someone told me that you're not supposed to use async/await in a higher-order function like this, so I just wanted to ask whether there is any issue with doing so.

15 comments

@Antonio Val 2017-07-10 08:15:51

The p-iteration module on npm implements the Array iteration methods so they can be used in a very straightforward way with async/await.

An example with your case:

const { forEach } = require('p-iteration');
const fs = require('fs-promise');

(async function printFiles () {
  const files = await getFilePaths();

  await forEach(files, async (file) => {
    const contents = await fs.readFile(file, 'utf8');
    console.log(contents);
  });
})();

@mikemaccana 2018-11-14 11:49:08

I like this as it has the same functions / methods as JS itself - in my case I needed some rather than forEach. Thanks!

@jgmjgm 2019-10-14 18:35:52

To see how that can go wrong, add a console.log at the end of the method.

Things that can go wrong in general:

  • Arbitrary order.
  • printFiles can finish running before printing files.
  • Poor performance.

These are not always wrong but frequently are in standard use cases.

Generally, using forEach will result in all but the last. It calls each function without awaiting it, meaning it tells all of the functions to start and then finishes without waiting for them to finish.

import fs from 'fs-promise'

async function printFiles () {
  const files = (await getFilePaths()).map(file => fs.readFile(file, 'utf8'))

  for(const file of files)
    console.log(await file)
}

printFiles()

This is an example in native JS that will preserve order, prevent the function from returning prematurely and in theory retain optimal performance.

This will:

  • Initiate all of the file reads to happen in parallel.
  • Preserve the order via the use of map to map file names to promises to wait for.
  • Wait for each promise in the order defined by the array.

With this solution the first file will be shown as soon as it is available without having to wait for the others to be available first.

It will also be loading all files at the same time rather than having to wait for the first to finish before the second file read can be started.

The only drawback of this and the original version is that if multiple reads are started at once, it's more difficult to handle errors, because more errors can happen at the same time.

Versions that read one file at a time will stop on a failure without wasting time trying to read any more files. Even with an elaborate cancellation system it can be hard to avoid failing on the first file while having already read most of the other files as well.

Performance is not always predictable. While many systems will be faster with parallel file reads, some will prefer sequential reads. Some are dynamic and may shift under load; optimisations that lower latency do not always yield good throughput under heavy contention.

There is also no error handling in that example. If something requires them to either all be successfully shown or not at all it won't do that.
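To make the failure modes concrete, here is a minimal sketch with invented stubs (promise-returning functions instead of real file reads) contrasting how the sequential and parallel shapes behave on an error:

```javascript
// Invented stubs standing in for file reads.
const ok = v => Promise.resolve(v);
const boom = () => Promise.reject(new Error('boom'));

// Sequential: stops at the first failure; later reads are never started.
async function sequential(fns) {
  const out = [];
  for (const fn of fns) {
    out.push(await fn()); // throws here on the failing read
  }
  return out;
}

// Parallel: Promise.all rejects on the first failure, but the other
// reads were already started and keep running in the background.
const parallel = fns => Promise.all(fns.map(fn => fn()));
```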

In-depth experimentation is recommended, with console.log at each stage and fake file-read solutions (a random delay instead). Although many solutions appear to do the same thing in simple cases, all have subtle differences that take some extra scrutiny to squeeze out.

Use this mock to help tell the difference between solutions:

(async () => {
  const start = +new Date();
  const mock = () => {
    return {
      fs: {readFile: file => new Promise((resolve, reject) => {
        // Instead of this just make three files and try each timing arrangement.
        // IE, all same, [100, 200, 300], [300, 200, 100], [100, 300, 200], etc.
        const time = Math.round(100 + Math.random() * 4900);
        console.log(`Read of ${file} started at ${new Date() - start} and will take ${time}ms.`)
        setTimeout(() => {
          // Bonus material here if random reject instead.
          console.log(`Read of ${file} finished, resolving promise at ${new Date() - start}.`);
          resolve(file);
        }, time);
      })},
      console: {log: file => console.log(`Console Log of ${file} finished at ${new Date() - start}.`)},
      getFilePaths: () => ['A', 'B', 'C', 'D', 'E']
    };
  };

  const printFiles = (({fs, console, getFilePaths}) => {
    return async function() {
      const files = (await getFilePaths()).map(file => fs.readFile(file, 'utf8'));

      for(const file of files)
        console.log(await file);
    };
  })(mock());

  console.log(`Running at ${new Date() - start}`);
  await printFiles();
  console.log(`Finished running at ${new Date() - start}`);
})();

@chharvey 2018-02-23 00:47:27

In addition to @Bergi’s answer, I’d like to offer a third alternative. It's very similar to @Bergi’s 2nd example, but instead of awaiting each readFile individually, you create an array of promises and await them all at the end.

import fs from 'fs-promise';
async function printFiles () {
  const files = await getFilePaths();

  const promises = files.map((file) => fs.readFile(file, 'utf8'))

  const contents = await Promise.all(promises)

  contents.forEach(content => console.log(content));
}

Note that the function passed to .map() does not need to be async, since fs.readFile returns a Promise object anyway. Therefore promises is an array of Promise objects, which can be sent to Promise.all().

In @Bergi’s answer, the console may log file contents in the order they’re read. For example if a really small file finishes reading before a really large file, it will be logged first, even if the small file comes after the large file in the files array. However, in my method above, you are guaranteed the console will log the files in the same order as the provided array.

@Venryx 2019-10-11 00:52:26

I'm pretty sure you're incorrect: I'm pretty sure your method also can read the files out of order. Yes, it will log the output in the correct order (due to the await Promise.all), but the files may have been read in a different order, which contradicts your statement "you are guaranteed the console will log the files in the same order as they are read".

@chharvey 2019-10-11 00:54:11

@Venryx You're right, thanks for the correction. I've updated my answer.

@Francisco Mateo 2018-06-15 11:17:50

With ES2018, you are able to greatly simplify all of the above answers to:

async function printFiles () {
  const files = await getFilePaths()

  for await (const file of fs.readFile(file, 'utf8')) {
    console.log(contents)
  }
}

See spec: proposal-async-iteration


2018-09-10: This answer has been getting a lot of attention recently; please see Axel Rauschmayer's blog post for further information about asynchronous iteration: ES2018: asynchronous iteration

@saadq 2018-06-15 16:40:28

Upvoted, would be great if you could put a link to the spec in your answer for anyone who wants to know more about async iteration.

@ma11hew28 2018-09-03 07:12:33

Where is file defined in your code?

@Francisco Mateo 2018-09-08 14:13:21

Unsure where file is defined. This is ported 1:1 from the question above.

@zeion 2018-09-18 21:11:58

Shouldn't it be contents instead of file in the iterator

@Yevhenii Herasymchuk 2019-01-08 16:27:39

Why are people upvoting this answer? Take a closer look at the answer, the question, and the proposal. After the of there should be an async iterable, which is not what this code provides. It doesn't work, as Francisco said himself.

@Antonio Val 2019-01-09 10:30:17

I don't think this answer addresses the initial question. for-await-of with a synchronous iterable (an array, in our case) doesn't cover the case of iterating over an array concurrently with asynchronous operations in each iteration. If I'm not mistaken, using for-await-of with a synchronous iterable over non-promise values is the same as using a plain for-of.
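For what it's worth, for-await-of does become useful when the synchronous iterable contains promises; a minimal sketch with an invented fakeRead stub (a timer instead of real file I/O):

```javascript
// Invented stub: resolves with fake file contents after `ms` milliseconds.
const fakeRead = (name, ms) =>
  new Promise(resolve => setTimeout(() => resolve(`contents of ${name}`), ms));

async function printAll(names) {
  // Start every read up front, then consume the promises in array order.
  const promises = names.map((name, i) => fakeRead(name, (names.length - i) * 10));
  const results = [];
  for await (const contents of promises) {
    results.push(contents);
  }
  return results; // array order, even though the last read finished first
}
```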

@Yevhenii Herasymchuk 2019-01-09 10:58:10

Totally agree with @AntonioVal. It's not an answer.

@Vadim Shvetsov 2019-01-17 13:34:51

How do we delegate the files array to fs.readFile here? Does it come from the iterable?

@Robert Molina 2019-01-20 05:59:45

While I agree it's not an answer, upvoting a proposal is a way to increase its popularity, potentially making it available to use earlier.

@Ira Herman 2019-09-09 22:57:25

Thank you! This helped so much. I was having a crazy difficult time with the timing using a forEach loop. Changed to for await (const item of items) {} and now it's working completely as expected!

@Rafi Henig 2019-09-11 01:07:59

With this solution, each iteration awaits the previous one; if an operation involves long calculations or reading a long file, it blocks execution of the next, as opposed to mapping all the operations to promises and waiting for them all to complete.

@jgmjgm 2019-10-14 18:23:05

This code doesn't work. Needs editing.

@ColinWa 2019-11-07 07:59:07

The question brought me here ... this answer gave me a solution. This is a true async / await For iteration.

@Bergi 2016-06-01 19:02:09

Sure, the code does work, but I'm pretty sure it doesn't do what you expect it to do. It just fires off multiple asynchronous calls, but the printFiles function returns immediately after that.

If you want to read the files in sequence, you cannot use forEach indeed. Just use a modern for … of loop instead, in which await will work as expected:

async function printFiles () {
  const files = await getFilePaths();

  for (const file of files) {
    const contents = await fs.readFile(file, 'utf8');
    console.log(contents);
  }
}

If you want to read the files in parallel, you cannot use forEach indeed. Each of the async callback function calls returns a promise, but you're throwing them away instead of awaiting them. Just use map instead, and you can await the array of promises that you'll get with Promise.all:

async function printFiles () {
  const files = await getFilePaths();

  await Promise.all(files.map(async (file) => {
    const contents = await fs.readFile(file, 'utf8')
    console.log(contents)
  }));
}

@Demonbane 2016-08-15 18:04:34

Could you please explain why for ... of ... works?

@Demonbane 2016-08-15 19:21:57

OK, I know why... Using Babel transforms async/await into generator functions, and using forEach means that each iteration has an individual generator function, which has nothing to do with the others. So they will be executed independently and have no next() context with the others. Actually, a simple for() loop also works, because the iterations are all in one single generator function.

@Bergi 2016-08-15 23:28:34

@Demonbane: In short, because it was designed to work :-) await suspends the current function evaluation, including all control structures. Yes, it is quite similar to generators in that regard (which is why they are used to polyfill async/await).

@voscausa 2016-11-15 00:46:14

Great example. Thank you. I used Array.from to search and replace a DOM NodeList (HTML table rows). map needs an array-type collection; for array-like objects like a DOM NodeList you can use Array.from.

@arve0 2017-03-29 11:13:16

So files.map(async (file) => ... is equivalent to files.map((file) => new Promise((rej, res) => { ...?

@Bergi 2017-03-29 16:25:17

@arve0 Not really, an async function is quite different from a Promise executor callback, but yes the map callback returns a promise in both cases.

@Félix Gagnon-Grenier 2017-05-16 21:04:00

When you come to learn about JS promises, but instead use half an hour translating latin. Hope you're proud @Bergi ;)

@DKebler 2017-05-28 23:55:52

So as regards await, in which camp does for( in ) reside? Thumbs down with forEach, or thumbs up with for of?

@Bergi 2017-05-29 00:32:46

@DKebler It's a syntactic loop and will work with await, but you rarely will need to enumerate object properties.

@davidsonsns 2017-11-16 12:27:49

I also had this doubt; I found this link useful: Iteration_protocols

@Adi Sivasankaran 2018-02-23 05:20:29

This answer is the best one: await Promise.all(_.map(arr, async (val) => {...})); solved my issue. Of course, each async callback returns a promise that I was not awaiting.

@Doug 2018-03-02 12:47:38

For those who don't know, Promise.all returns a single promise that resolves when all the promises in the array are resolved. (basically waits for all promises to conclude)
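A small sketch of that behaviour, with stubbed promises: results come back in input order rather than completion order, and a single rejection rejects the combined promise.

```javascript
// `slow` settles after `fast`, but the result order follows the input array.
const slow = new Promise(resolve => setTimeout(() => resolve('slow'), 20));
const fast = Promise.resolve('fast');

const inOrder = Promise.all([slow, fast]); // resolves to ['slow', 'fast']

// One rejection makes the combined promise reject.
const failing = Promise.all([fast, Promise.reject(new Error('nope'))])
  .catch(err => `caught: ${err.message}`);
```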

@doubleOrt 2018-03-20 13:17:46

I think the second paragraph should be "If you want to read the files in parallel, you cannot use for..of"? Because you can read the files in parallel with forEach; it's just that it will look very ugly, because you cannot await it.

@Bergi 2018-03-20 13:24:21

@Taurus If you don't intend to await them, then for…of would work equally to forEach. No, I really mean that paragraph to emphasise that there is no place for .forEach in modern JS code.

@doubleOrt 2018-03-20 13:26:09

@Bergi Isn't that bad for performance ? If you need to make 3 requests, you can't make them all in parallel, so if each takes one second, your program will take 3 seconds to run. But if you use map or forEach, the requests will run in parallel.

@Bergi 2018-03-20 13:28:30

@Taurus I don't understand. What is bad for performance in comparison to what?

@doubleOrt 2018-03-20 13:30:14

for..of is bad for performance in comparison to forEach/map, because for..of will stop in each iteration, so if I make 3 requests, each request will have to wait for the preceding requests to complete. But with forEach/map, all the requests will be made in parallel.

@Bergi 2018-03-20 13:33:06

@Taurus No. queries.forEach(q => asyncRequest(q)); does exactly the same as for (const q of queries) asyncRequest(q);. There is no difference in performance, and both will run the requests in parallel. Of course, in neither you can wait for anything - for how to do that, see my answer.

@doubleOrt 2018-03-20 13:45:50

@Bergi 2018-03-20 13:53:17

@Taurus Yes, that's basically the two approaches from my answer. Neither of them uses forEach. Where's the problem?

@doubleOrt 2018-03-20 13:55:32

@Bergi I could have used forEach (would work the same as map regarding execution time). But as you can see from my previous comments, I was talking about either forEach or map versus for..of. I said: for..of is bad for performance in comparison to forEach/map.

@doubleOrt 2018-03-20 13:57:42

However, how is this true (from your answer): "If you want to read the files in parallel, you cannot use forEach indeed." I think you can do that, the same as your map example, except with forEach it will look ugly and unfit for async/await.

@Bergi 2018-03-20 14:05:29

@Taurus But that's wrong: for…of is not bad for performance in comparison to forEach if used in the same unfit way without awaiting anything. And yes, "cannot be used" means "is unfit".

@doubleOrt 2018-03-20 14:07:57

@Bergi Are you suggesting a case where you don't await whatever request you make inside of a for..of ? If so, alright but you do agree that your second example is way more performant than your first example ?

@Bergi 2018-03-20 14:14:39

@Taurus Yes, that's what I wrote in my comment above. Of course it doesn't make any sense and is unfit to solve the OP's problem, regardless of whether it's done with forEach or for…of. And no, I would not compare the two (for…of+await vs Promise.all+map) in terms of "performance" at all; they are just different in many other, more important regards. Choosing sequential vs parallel execution is a decision that involves many other factors, and of course parallel typically finishes faster.

@doubleOrt 2018-03-20 14:18:03

@Bergi Of course there are cases where the for..of might be the only way to go (e.g if each request has a dependency on the previous request), but a beginner might always opt for the for..of solution because it is simpler.

@doubleOrt 2018-03-20 14:18:23

@Bergi However, I still don't understand how this is correct: If you want to read the files in parallel, you cannot use forEach indeed.

@Bergi 2018-03-20 14:23:58

@Taurus You cannot use it because you cannot await the result. Of course if you used it - like the OP did - then they would run in parallel, but forEach is absolutely unfit as you said yourself.

@doubleOrt 2018-03-20 14:29:03

@Bergi Oh, now I understand what you mean by that part. But it is really vague, because technically you can use it but that doesn't mean you should. Your answer basically conveys that it is not possible to run requests in parallel using forEach.

@Bergi 2018-03-20 14:33:14

@Taurus I thought "and wait for it to finish" was implied. But still, if "cannot" conveys the same as "absolutely should not" I'm fine. Keep it simple for the beginners: they should never use forEach, that's all they need to know.

@Donald E. Foss 2018-04-03 18:49:55

Another "couldn't you...", couldn't one use a .join with this as well to wait for all promises to be complete? Then you can iterate through the files in the list, etc. I really like @Babak's Task and traversable list. I'm going to find an excuse to use that somewhere as soon as I can.

@Bergi 2018-04-03 19:42:59

@DonaldE.Foss join is a method on arrays of strings, it doesn't wait for promises. What you can do is var string = (await Promise.all(…)).join(…).

@Bergi 2018-12-09 15:57:10

@Rick const declarations work perfectly fine in loops. The variables are never reassigned.

@Ashutosh Chamoli 2019-01-18 04:12:09

@Bergi Solution works for me. Though I still don't understand why forEach with async doesn't work. Since we have specified await inside the loop, shouldn't it wait for the response?

@Bergi 2019-01-18 09:34:33

@AshutoshChamoli Only the async function call waits and returns a promise. forEach doesn't care for that promise.
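A minimal sketch (stubbed with Promise.resolve rather than real I/O) makes that visible: forEach discards the promises the async callbacks return, so the surrounding function cannot wait for them.

```javascript
async function withForEach(items) {
  const seen = [];
  items.forEach(async (item) => {
    await Promise.resolve(); // suspend: the rest runs as a microtask
    seen.push(item);
  });
  // forEach has already returned; none of the callbacks has finished yet.
  return seen.length; // 0 at this point
}
```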

@Patrick Roberts 2019-05-08 16:50:07

@Bergi perhaps, since this is a canonical dupe target, you could add an alternative that avoids async / await syntax and just returns the promise to map() directly? I understand that the timing of console.log() would no longer be the same but there seems to be confusion about the fact that async isn't somehow required in order to use Promise.all() and map().

@Patrick Roberts 2019-05-08 16:53:28

or you could be so kind as to suggest alternate dupe targets that don't do ugly things like var promises = []; array.forEach(value => { promises.push(someAsync(value)); }); return Promise.all(promises); because I'm seeing a lot of those, or similar questions with better answers, but code that is too complicated to be reusable as a duplicate target.

@Bergi 2019-05-08 17:42:27

@PatrickRoberts I'm keeping a list of those even, for when one day I will write a new canonical question with detailed answers...

@Scaramouche 2019-09-17 06:26:49

@Demonbane; hi, for others coming here, I wanted to use await inside the loop and the oldie for loop didn't work for me, for..of did. :)

@steampowered 2019-11-06 15:42:55

The maintainer of Mongoose has a good discussion on the topic here: thecodebarbarian.com/…

@myDoggyWritesCode 2019-05-26 22:08:49

Bergi's solution works nicely when fs is promise-based. You can use bluebird, fs-extra or fs-promise for this.

However, a solution for node's native fs library is as follows:

const result = await Promise.all(filePaths
    .map( async filePath => {
      const fileContents = await getAssetFromCache(filePath, async function() {

        // 1. Wrap fs.readFile in a Promise
        // 2. Return the result of the Promise (rejecting on read errors)
        return new Promise((res, rej) => {
          fs.readFile(filePath, 'utf8', function(err, data) {
            if (err) {
              rej(err);
            } else {
              res(data);
            }
          });
        });
      });

      return fileContents;
    }));

Note: fs.readFile from require('fs') requires a callback function as the 3rd argument, otherwise it throws an error:

TypeError [ERR_INVALID_CALLBACK]: Callback must be a function

@Matt 2018-03-22 15:11:16

Here are some forEachAsync prototypes. Note you'll need to await them:

Array.prototype.forEachAsync = async function (fn) {
    for (let t of this) { await fn(t) }
}

Array.prototype.forEachAsyncParallel = async function (fn) {
    await Promise.all(this.map(fn));
}

Note while you may include this in your own code, you should not include this in libraries you distribute to others (to avoid polluting their globals).

@DaniOcean 2018-03-28 13:55:33

Although I'd hesitate to add things directly to the prototype, this is a nice async forEach implementation

@mikemaccana 2018-04-03 13:29:27

As long as the name is unique in the future (like I'd use _forEachAsync) this is reasonable. I also think it's the nicest answer as it saves a lot of boilerplate code.

@Estus Flask 2018-11-05 06:25:21

They should be standalone functions. We've got modules to not pollute globals with our personal things.

@mikemaccana 2019-04-26 09:13:02

@estus That is to avoid polluting other people's code. If the code belongs to our personal organisation, and the globals are in a well identified file (globals.js would be good) we can add globals as we like.

@Estus Flask 2019-04-26 09:30:58

@mikemaccana That's to avoid generally accepted bad practices. That's true, this can be done as long as you use only first-party code, which happens rarely. The problem is that when you use third-party libs, there can be some other guy that feels the same way and modifies the same globals, just because it seemed a good idea at the time when a lib was written.

@mikemaccana 2019-04-26 09:34:10

@estus The practices are considered bad due to the reasons listed in the comment you're replying to. A third party who read my comment would not include prototype extras in library code they distribute to others.

@Estus Flask 2019-04-26 09:43:38

@mikemaccana That's correct. There's a lot of legacy code around that already does this and can potentially clash with your own globals, I see 'prototype pollution' warning with npm audit here and there. My point was to not recommend the approach listed in the answer to anyone who isn't fully aware of the consequences, especially in 2018.

@mikemaccana 2019-04-26 09:48:59

@estus Sure. I've added a warning to the question to save the (not particularly productive) discussion here.

@Beau 2019-03-12 23:31:29

Currently the Array.prototype.forEach method doesn't support async operations, but we can create our own polyfill to meet our needs.

// Example of an asyncForEach Array polyfill for NodeJs
// file: asyncForEach.js
// Define the asyncForEach function
async function asyncForEach(iteratorFunction){
  let indexer = 0
  for(let data of this){
    await iteratorFunction(data, indexer)
    indexer++
  }
}
// Append it as an Array prototype property
Array.prototype.asyncForEach = asyncForEach
module.exports = {Array}

And that's it! You now have an async forEach method available on any arrays that are defined after these two operations.

Let's test it...

// Nodejs style
// file: someOtherFile.js

const readline = require('readline')
Array = require('./asyncForEach').Array
const log = console.log

// Create a stream interface
function createReader(options={prompt: '>'}){
  return readline.createInterface({
    input: process.stdin
    ,output: process.stdout
    ,prompt: options.prompt !== undefined ? options.prompt : '>'
  })
}
// Create a cli stream reader
async function getUserIn(question, options={prompt:'>'}){
  log(question)
  let reader = createReader(options)
  return new Promise((res)=>{
    reader.on('line', (answer)=>{
      process.stdout.cursorTo(0, 0)
      process.stdout.clearScreenDown()
      reader.close()
      res(answer)
    })
  })
}

let questions = [
  `What's your name`
  ,`What's your favorite programming language`
  ,`What's your favorite async function`
]
let responses = {}

async function getResponses(){
// Notice we have to prepend await before calling the async Array function
// in order for it to function as expected
  await questions.asyncForEach(async function(question, index){
    let answer = await getUserIn(question)
    responses[question] = answer
  })
}

async function main(){
  await getResponses()
  log(responses)
}
main()
// Should prompt user for an answer to each question and then 
// log each question and answer as an object to the terminal

We could do the same for some of the other array functions like map...

async function asyncMap(iteratorFunction){
  let newMap = []
  let indexer = 0
  for(let data of this){
    newMap[indexer] = await iteratorFunction(data, indexer, this)
    indexer++
  }
  return newMap
}

Array.prototype.asyncMap = asyncMap

... and so on :)

Some things to note:

  • Your iteratorFunction must be an async function or a function that returns a promise
  • Any arrays created before Array.prototype.<yourAsyncFunc> = <yourAsyncFunc> will not have this feature available

@Timothy Zorn 2018-03-26 19:48:58

Instead of Promise.all in conjunction with Array.prototype.map (which does not guarantee the order in which the asynchronous operations complete), I use Array.prototype.reduce, starting with a resolved Promise:

async function printFiles () {
  const files = await getFilePaths();

  await files.reduce(async (promise, file) => {
    // This line will wait for the last async function to finish.
    // The first iteration uses an already resolved Promise
    // so, it will immediately continue.
    await promise;
    const contents = await fs.readFile(file, 'utf8');
    console.log(contents);
  }, Promise.resolve());
}

@parrker9 2018-03-28 20:48:58

This works perfectly, thank you so much. Could you explain what is happening here with Promise.resolve() and await promise;?

@GollyJer 2018-06-09 00:24:33

This is pretty cool. Am I right in thinking the files will be read in order and not all at once?

@Timothy Zorn 2018-06-17 15:00:08

@parrker9 Promise.resolve() returns an already resolved Promise object, so that reduce has a Promise to start with. await promise; will wait for the last Promise in the chain to resolve. @GollyJer The files will be processed sequentially, one at a time.

@Shay Yzhakov 2019-05-30 12:54:47

Very cool use of reduce, thanks for the comment! I'll just denote that, in contrast to some of the other methods mentioned in the comments, this one is synchronous, meaning that the files are read one after another and not in parallel (since the next iteration of reduce function relies on the previous iteration, it must be synchronous).

@Timothy Zorn 2019-05-30 16:51:53

@Shay, You mean sequential, not synchronous. This is still asynchronous - if other things are scheduled, they will run in between the iterations here.

@Scott Rudiger 2018-06-21 16:55:47

Similar to Antonio Val's p-iteration, an alternative npm module is async-af:

const AsyncAF = require('async-af');
const fs = require('fs-promise');

function printFiles() {
  // since AsyncAF accepts promises or non-promises, there's no need to await here
  const files = getFilePaths();

  AsyncAF(files).forEach(async file => {
    const contents = await fs.readFile(file, 'utf8');
    console.log(contents);
  });
}

printFiles();

Alternatively, async-af has a static method (log/logAF) that logs the results of promises:

const AsyncAF = require('async-af');
const fs = require('fs-promise');

function printFiles() {
  const files = getFilePaths();

  AsyncAF(files).forEach(file => {
    AsyncAF.log(fs.readFile(file, 'utf8'));
  });
}

printFiles();

However, the main advantage of the library is that you can chain asynchronous methods to do something like:

const aaf = require('async-af');
const fs = require('fs-promise');

const printFiles = () => aaf(getFilePaths())
  .map(file => fs.readFile(file, 'utf8'))
  .forEach(file => aaf.log(file));

printFiles();

async-af

@Babakness 2018-02-28 04:41:08

Using Task, futurize, and a traversable List, you can simply do

async function printFiles() {
  const files = await getFiles();

  List(files).traverse( Task.of, f => readFile( f, 'utf-8'))
    .fork( console.error, console.log)
}

Here is how you'd set this up

import fs from 'fs';
import { futurize } from 'futurize';
import Task from 'data.task';
import { List } from 'immutable-ext';

const future = futurize(Task)
const readFile = future(fs.readFile)

Another way to have structured the desired code would be

const printFiles = files => 
  List(files).traverse( Task.of, fn => readFile( fn, 'utf-8'))
    .fork( console.error, console.log)

Or perhaps even more functionally oriented

// 90% of encodings are utf-8, making that use case super easy is prudent

// handy-library.js
export const readFile = f =>
  future(fs.readFile)( f, 'utf-8' )

export const arrayToTaskList = list => taskFn => 
  List(list).traverse( Task.of, taskFn ) 

export const readFiles = files =>
  arrayToTaskList( files )( readFile )

export const printFiles = files => 
  readFiles(files).fork( console.error, console.log)

Then from the parent function

async function main() {
  /* awesome code with side-effects before */
  printFiles( await getFiles() );
  /* awesome code with side-effects after */
}

If you really wanted more flexibility in encoding, you could just do this (for fun, I'm using the proposed Pipe Forward operator )

import { curry, flip } from 'ramda'

export const readFile = fs.readFile 
  |> future
  |> curry
  |> flip

export const readFileUtf8 = readFile('utf-8')

PS - I didn't try this code on the console, might have some typos... "straight freestyle, off the top of the dome!" as the 90s kids would say. :-p

@Zachary Ryan Smith 2018-02-04 16:03:47

I would use the well-tested (millions of downloads per week) pify and async modules. If you are unfamiliar with the async module, I highly recommend you check out its docs. I've seen multiple devs waste time recreating its methods, or worse, making difficult-to-maintain async code when higher-order async methods would simplify code.

const async = require('async')
const fs = require('fs-promise')
const pify = require('pify')

async function getFilePaths() {
    return Promise.resolve([
        './package.json',
        './package-lock.json',
    ]);
}

async function printFiles () {
  const files = await getFilePaths()

  await pify(async.eachSeries)(files, async (file) => {  // <-- run in series
  // await pify(async.each)(files, async (file) => {  // <-- run in parallel
    const contents = await fs.readFile(file, 'utf8')
    console.log(contents)
  })
  console.log('HAMBONE')
}

printFiles().then(() => {
    console.log('HAMBUNNY')
})
// ORDER OF LOGS:
// package.json contents
// package-lock.json contents
// HAMBONE
// HAMBUNNY

@jbustamovej 2018-02-20 06:24:16

This is a step in the wrong direction. Here's a mapping guide I created to help get folks stuck in callback hell into the modern JS era: github.com/jmjpro/async-package-to-async-await/blob/master/…

@Zachary Ryan Smith 2018-02-21 01:54:20

As you can see here, I am interested in and open to using async/await instead of the async lib. Right now, I think each has a time and place. I'm not convinced that the async lib == "callback hell" and async/await == "the modern JS era". IMO, the async lib > async/await when you need: 1. complex flow (e.g. queue, cargo, even auto when things get complicated) 2. concurrency 3. supporting arrays/objects/iterables 4. error handling

@Jay Edwards 2017-09-22 23:03:30

It's pretty painless to pop a couple of methods into a file that will handle asynchronous data in a serialized order and give a more conventional flavour to your code. For example:

module.exports = function () {
  var self = this;

  this.each = async (items, fn) => {
    if (items && items.length) {
      // Process the items one at a time so the order is truly serialized
      // (Promise.all + map would fire every callback at once, which breaks
      // order-sensitive helpers like reduce below).
      for (const item of items) {
        await fn(item);
      }
    }
  };

  this.reduce = async (items, fn, initialValue) => {
    await self.each(
      items, async (item) => {
        initialValue = await fn(initialValue, item);
      });
    return initialValue;
  };
};

now, assuming that's saved at './myAsync.js' you can do something similar to the below in an adjacent file:

...
/* your server setup here */
...
var MyAsync = require('./myAsync');
var Cat = require('./models/Cat');
var Doje = require('./models/Doje');
var example = async () => {
  var myAsync = new MyAsync();
  var doje = await Doje.findOne({ name: 'Doje', noises: [] }).save();
  var cleanParams = [];

  // FOR EACH EXAMPLE
  await myAsync.each(['bork', 'concern', 'heck'], 
    async (elem) => {
      if (elem !== 'heck') {
        await doje.update({ $push: { 'noises': elem }});
      }
    });

  var cat = await Cat.findOne({ name: 'Nyan' });

  // REDUCE EXAMPLE
  var friendsOfNyanCat = await myAsync.reduce(cat.friends,
    async (catArray, friendId) => {
      var friend = await Friend.findById(friendId);
      if (friend.name !== 'Long cat') {
        catArray.push(friend.name);
      }
      return catArray; // the accumulator must be returned on every pass
    }, []);
  // Assuming Long Cat was a friend of Nyan Cat...
  assert(friendsOfNyanCat.length === (cat.friends.length - 1));
}

@Jay Edwards 2017-09-26 09:08:40

Minor addendum: don't forget to wrap your awaits in try/catch blocks!
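A minimal sketch of that addendum: a rejected promise inside an async function surfaces as a throw at the await, so try/catch is how you contain it. safeRead and readFn are illustrative names, not part of any library.

```javascript
// Catch a rejection at the await instead of letting it
// propagate to the caller of the async function.
async function safeRead (readFn, file) {
  try {
    return await readFn(file)
  } catch (err) {
    // Without this try/catch, the rejection would reject
    // safeRead's own returned promise.
    return null
  }
}
```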

@LeOn - Han Li 2017-09-24 20:00:43

One important caveat: the await + for...of method and the forEach + async way actually have different effects.

Having await inside a real for loop ensures the async calls execute one by one. The forEach + async way fires off all the promises at the same time, which is faster but can be overwhelming (e.g. if you're querying a DB or hitting a web service with volume restrictions and don't want to fire 100,000 calls at once).

You can also use reduce + promise (less elegant) if you don't use async/await and want to make sure the files are read one after another:

files.reduce((lastPromise, file) =>
  lastPromise
    .then(() => fs.readFile(file, 'utf8'))
    .then(contents => console.log(contents)),
  Promise.resolve()
)

Or you can create a forEachAsync to help but basically use the same for loop underlying.

Array.prototype.forEachAsync = async function(cb){
    for(let x of this){
        await cb(x);
    }
}

@Bergi 2017-11-16 13:57:13

Have a look at How to define method in javascript on Array.prototype and Object.prototype so that it doesn't appear in for in loop. Also you probably should use the same iteration as native forEach - accessing indices instead of relying on iterability - and pass the index to the callback.
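A minimal sketch of Bergi's two suggestions combined: define the helper as a non-enumerable property so it doesn't leak into for...in loops, and pass the index (and array) to the callback like native forEach does. forEachAsync remains a made-up name, not a standard method.

```javascript
// Non-enumerable so `for (const k in arr)` does not pick it up.
Object.defineProperty(Array.prototype, 'forEachAsync', {
  enumerable: false,
  writable: true,
  configurable: true,
  value: async function (cb) {
    for (let i = 0; i < this.length; i++) {
      await cb(this[i], i, this) // same callback shape as native forEach
    }
  }
})
```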

@Timothy Zorn 2018-03-26 19:54:17

You can use Array.prototype.reduce in a way that uses an async function. I've shown an example in my answer: stackoverflow.com/a/49499491/2537258

@Hooman Askari 2017-08-26 10:47:21

Both of the solutions above work; however, Antonio's does the job with less code. Here is how it helped me resolve data from my database, from several different child refs, pushing it all into an array and resolving a promise once everything is done:

const allItems = [] // collect results across all child refs

Promise.all(PacksList.map((pack) => {
    return fireBaseRef.child(pack.folderPath).once('value', (snap) => {
        snap.forEach(childSnap => {
            const file = childSnap.val()
            file.id = childSnap.key
            allItems.push(file)
        })
    })
})).then(() => store.dispatch(actions.allMockupItems(allItems)))
