#### [SOLVED] Is there a better way to do partial sums of array items in JavaScript?

I wonder if there is a better-performing way to generate the partial sums of an array.

Given an array, say x = [ 0, 1, 2, 3, 4, 5 ], I generated the sub-arrays of leading items and then computed the sum of each, which gives:

[ 0, 1, 3, 6, 10, 15 ]

So the full code is:

x.map((y, i) => x.filter((t, j) => j <= i))
 .map(ii => ii.reduce((x, y) => x + y, 0))

I wonder if flatMap or some other array method offers a solution that does not require expanding each sub-array.
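For comparison, a single pass with a running total (the approach several answers below take) avoids expanding any sub-array; a minimal sketch:

```javascript
const x = [0, 1, 2, 3, 4, 5];

// Carry the running total forward instead of re-summing each prefix.
const partialSums = x.reduce((acc, n) => {
  acc.push((acc.length ? acc[acc.length - 1] : 0) + n);
  return acc;
}, []);

console.log(partialSums); // [ 0, 1, 3, 6, 10, 15 ]
```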

#### @weegee 2019-06-10 17:55:01

One way is to use forEach, slice the array to get the leading elements, and then sum them with Array.prototype.reduce. You can do this like:

let x = [0, 1, 2, 3, 4, 5]
let sum = []
x.forEach((_, index) => {
  index++;
  sum.push(x.slice(0, index).reduce((a, b) => a + b))
})
console.log(sum)

We get [0], then [0,1], then [0,1,2], then [0,1,2,3], and via [0,1,2].reduce((a, b) => a + b) we get 3. Pushing each of these sums to a new array gives your answer.

We can go even shorter by doing this. To me, this seems a very optimized solution.

let ar = [0, 1, 2, 3, 4, 5]
let s = 0
let arr = []
ar.forEach((n, i) => arr.push(s += n))
console.log(arr)

Or with .map, you can

let ar = [0, 1, 2, 3, 4, 5], s = 0
console.log(ar.map((n, i) => s += n))

#### @Ry- 2019-06-10 18:27:43

“To me, this seems a very optimized solution.” It isn’t. “You can also use .map instead of forEach like” But why?

#### @weegee 2019-06-10 18:47:22

Also, my last solution is better than yours: jsben.ch/BAg36. You can even check it there @Ry-

#### @Ry- 2019-06-10 18:49:42

I mean, if we’re comparing last solutions: jsben.ch/y56yW

#### @weegee 2019-06-10 18:52:42

That's total cheating: you are taking long inputs for my test and shorter ones for yours. Look at this now, with the same inputs: jsben.ch/PzJ8s @Ry-

#### @Ry- 2019-06-10 18:56:16

Sorry, leftover. You’ll notice it doesn’t change the outcome.

#### @weegee 2019-06-10 18:58:07

Well, it does. Run the test multiple times @Ry-. It turns out that the time varies from run to run (although, most of the time, my code block comes first).

#### @Ry- 2019-06-10 19:02:51

No matter how many times I run it, yours never comes first. This is only tangentially relevant to my original comment, though (in that discarding .map’s return value wastes time as well).

#### @weegee 2019-06-10 19:05:38

What I said is that the time varies from run to run. Here, my solution almost always comes first; yours comes first only a negligible number of times. @Ry-

#### @weegee 2019-06-10 19:11:08

Also, there's no difference between my last approach and yours. Comparing the times is useless there. @Ry-

#### @Ry- 2019-06-10 19:14:48

There’s a difference in that it pointlessly uses map for a push side effect and discards the result. (Well, I guess it’s not pointless, because it means you’re not technically duplicating my answer.)

#### @weegee 2019-06-10 19:23:52

Also, .map is faster than .forEach. That's why I used .map most of the time. @Ry-

#### @Ry- 2019-06-10 19:26:38

You mean faster to type? ’cause it’s not faster to run. It makes a new array. That’s what it’s for.

#### @weegee 2019-06-10 19:29:23

@Ry- yes it is for that. Read this amazing article and I quote from it "In reality, you shouldn’t be using .forEach() unless other methods such as .map() can’t do the trick. .map() is actually slightly faster than .forEach()"

#### @Ry- 2019-06-10 19:30:35

You misinterpreted it. The article is talking about .map used correctly, i.e. arr.map(x => x * 2) vs. let result = []; arr.forEach(x => { result.push(x * 2) }). let result = []; arr.map(x => { result.push(x * 2) }) is just the worst of both worlds.
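The three variants being contrasted here can be sketched as follows (illustrative only; the variable names are mine):

```javascript
const arr = [1, 2, 3];

// 1. map used correctly: the mapped array IS the result
const viaMap = arr.map(x => x * 2);

// 2. forEach pushing into an external result array
const viaForEach = [];
arr.forEach(x => { viaForEach.push(x * 2); });

// 3. map abused for a push side effect: builds and then discards
//    a second array full of undefined values
const viaAbusedMap = [];
arr.map(x => { viaAbusedMap.push(x * 2); });

console.log(viaMap, viaForEach, viaAbusedMap); // each is [ 2, 4, 6 ]
```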

#### @weegee 2019-06-10 19:37:31

I agree, but .map is faster and that's the only reason I'm using it. I will use .map where I can, to reduce run time as much as possible. Also, anyone can use it however they want, right? @Ry-

#### @Ry- 2019-06-10 19:50:11

No, I just explained why it’s not faster in the way you seem to think that it is. Try a benchmark. Sure, you can use it anyway, and it’ll be bad practice anyway.

#### @weegee 2019-06-10 19:57:59

@Ry- okay, I'll accept your approach as I'm a junior, and .map returns an array, which can also lead to some array waste. Edited my answer.

#### @Joe23 2019-06-14 07:14:45

It's possible to use map directly, if you keep an external accumulator variable:

const x = [ 0, 1, 2, 3, 4, 5 ];

let acc = 0;
const prefixSum = x.map(x => acc += x);

console.log(prefixSum);

#### @LiuXiMin 2019-06-15 08:53:27

Clever trick with assignment expression.

#### @Joe23 2019-07-10 12:10:19

Why the downvote? Honestly curious.

#### @LiuXiMin 2019-07-11 00:21:34

I upvoted. I guess those who downvoted are not familiar with assignment expressions, or are not comfortable with x => acc += x.

#### @user633183 2019-06-10 01:44:27

Below, scan takes a combining function f and an initial accumulator r -

const scan = (f, r, [ x, ...xs ]) =>
  x === undefined
    ? [ r ]
    : [ r, ...scan (f, f (r, x), xs) ]

const add = (x, y) =>
  x + y

const print = (...vs) =>
  vs .forEach (v => console .log (v))

const data =
  [ 0, 1, 2, 3, 4, 5 ]

print
  ( scan (add, 0, data)
  , scan (Math.max, 3, data)
  , scan (add, 0, [])
  )

// [ 0, 0, 1, 3, 6, 10, 15 ]
// [ 3, 3, 3, 3, 3, 4, 5 ]
// [ 0 ]

If you need a program that does not take an initial accumulator, the first element of the input array can be used instead. This variation is called scan1 -

const scan = (f, r, [ x, ...xs ]) =>
  x === undefined
    ? [ r ]
    : [ r, ...scan (f, f (r, x), xs) ]

const scan1 = (f, [ x, ...xs ]) =>
  x === undefined
    ? []
    : scan (f, x, xs)

const add = (x, y) =>
  x + y

const print = (...vs) =>
  vs .forEach (v => console .log (v))

const data =
  [ 0, 1, 2, 3, 4, 5 ]

print
  ( scan1 (add, data)
  , scan1 (Math.max, data)
  , scan1 (Math.min, data)
  , scan1 (add, [])
  )

// [ 0, 1, 3, 6, 10, 15 ]
// [ 0, 1, 2, 3, 4, 5 ]
// [ 0, 0, 0, 0, 0, 0 ]
// []

Performance optimisations can be made and stack-overflow woes can be remedied, if necessary, all without sacrificing functional style -

const scan = (f, init, xs) =>
  loop
    ( ( r = []
      , a = init
      , i = 0
      ) =>
        i >= xs.length
          ? push (a, r)
          : recur
              ( push (a, r)
              , f (a, xs[i])
              , i + 1
              )
    )

Now let's run it with a big input -

// BIG data!
const data =
  Array .from (Array (10000), (_, x) => x)

// fast and stack-safe
console .time ("scan")
const result = scan (add, 0, data)
console .timeEnd ("scan")
// scan: 8.07 ms

console .log (result)
// [ 0, 0, 1, 3, 6, 10, 15, ..., 49995000 ]

This depends on the following generic functional procedures -

const recur = (...values) =>
  ({ recur, values })

const loop = f =>
{ let r = f ()
  while (r && r.recur === recur)
    r = f (...r.values)
  return r
}

const push = (x, xs) =>
  ( xs .push (x)
  , xs
  )

Expand the snippet below to verify the results in your own browser -

const recur = (...values) =>
  ({ recur, values })

const loop = f =>
{ let r = f ()
  while (r && r.recur === recur)
    r = f (...r.values)
  return r
}

const push = (x, xs) =>
  ( xs .push (x)
  , xs
  )

const scan = (f, init, xs) =>
  loop
    ( ( r = []
      , a = init
      , i = 0
      ) =>
        i >= xs.length
          ? push (a, r)
          : recur
              ( push (a, r)
              , f (a, xs[i])
              , i + 1
              )
    )

const add = (x, y) =>
  x + y

const data =
  Array .from (Array (10000), (_, x) => x)

console .time ("scan")
const result = scan (add, 0, data)
console .timeEnd ("scan")

console .log (result)
// [ 0, 0, 1, 3, 6, 10, 15, ..., 49995000 ]

#### @Ry- 2019-06-10 15:48:23

scan is good. Implementing scan in JavaScript as if it were Haskell is not good, because it gives you quadratic time complexity and stack overflows on arrays that aren’t even particularly big.

#### @user633183 2019-06-10 16:34:59

@Ry-, Thanks but I don't see the quadratic time complexity. The intermediate arrays are definitely wasteful, but the simple programs show precisely what needs to be done. Optimisation with this kind of program is easy so I usually leave them for the end-user who knows their requirements best. An update to the answer shows a stack-safe scan. Your comment is appreciated.

#### @Ry- 2019-06-10 16:41:39

[ r, ...scan (f, f (r, x), xs) ] spread – copy of 0 + 1 + … + n−1 items, quadratic. [ x, ...xs ] destructuring – copy of n−1 + n−2 + … + 0 items, quadratic. Thanks for adding the other version.

#### @user633183 2019-06-10 16:48:10

That's more like O(n*(n+1)/2), which is almost half the complexity of O(n^2), yeah? The optimised revision runs in O(n) though.

#### @Ry- 2019-06-10 16:50:34

Yes, but O(n*(n+1)/2) = O(n²). The “half” is a constant factor.

#### @Scott Sauyet 2019-06-10 16:52:37

Of course the universality of reduce means that we can also choose to implement it on reduce. One way: const scan2 = (f, init, xs) => xs .reduce ( (as, x) => [ ...as, f (as.slice(-1)[0], x) ], [init] ) .slice (1)
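Scott's inline scan2 from this comment, reformatted into a runnable snippet (add supplied for the example):

```javascript
// Build the scan on reduce, carrying the growing result array;
// the seed [init] is dropped at the end with .slice(1).
const scan2 = (f, init, xs) =>
  xs.reduce(
    (as, x) => [...as, f(as.slice(-1)[0], x)],
    [init]
  ).slice(1);

const add = (x, y) => x + y;

console.log(scan2(add, 0, [0, 1, 2, 3, 4, 5])); // [ 0, 1, 3, 6, 10, 15 ]
```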

#### @Scott Sauyet 2019-06-10 17:14:58

Another way to remove that additional time complexity is to replace [ r, ...scan (...) ] with [ r ] .concat (scan (...)), assuming that concat is an atomic operation.

#### @user633183 2019-06-10 17:16:13

@Ry-, thanks for clarification.

#### @user633183 2019-06-10 17:20:11

@ScottSauyet you can avoid those intermediate arrays and right-side lookups by using a compound accumulator in reduce; ie, using push from the update, scan2 = (f, init, xs) => push (...xs.reduce (([ r, a ], x) => [ f (r, x), push (r, a) ], [ init, [] ])). I think JavaScript compilers could easily optimize [a, ...b] to [a].concat(b) but the underlying creation of a new array still happens. For significantly large inputs, concat will always perform slower than push in this program. Thanks for the comment :D
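The compound-accumulator scan2 from this comment, expanded into runnable form (push copied from the answer above; like the answer's scan, this version includes the seed in its output):

```javascript
// push appends and returns the array, enabling expression-style chaining
const push = (x, xs) => (xs.push(x), xs);

// reduce carries a pair [running value, output array]; the final
// running value is pushed on at the end
const scan2 = (f, init, xs) =>
  push(...xs.reduce(
    ([r, a], x) => [f(r, x), push(r, a)],
    [init, []]
  ));

const add = (x, y) => x + y;

console.log(scan2(add, 0, [0, 1, 2, 3, 4, 5])); // [ 0, 0, 1, 3, 6, 10, 15 ]
```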

#### @user633183 2019-06-10 17:23:37

@ScottSauyet I'm not an expert on all of the recursion schemes. I first saw this as an anamorphism but someone recently introduced me to apomorphism and it might be more applicable here. See comments on this Q&A. More research requires more time! -_-

#### @Ry- 2019-06-10 19:07:02

@ScottSauyet: [r, ...s] and [r].concat(s) are the same in that respect.

#### @user633183 2019-06-10 19:30:57

@Ian, I appreciate your efforts but the added syntax is seen as noisy and distracting. I understand many JavaScript programmers are numb to this and may find the style in my answers unfamiliar. This style has been adapted to improve semantic consistency and increase comprehension but it takes time to undo the old ways of thinking. I invite you to read some of my other answers to see the style applied in various ways. If you'd like to post an ES5 version of my answer using your syntax styles, you are more than welcome.

#### @Scott Sauyet 2019-06-10 20:13:28

@Ry-: You're right. In fact, in my small test (jsperf.com/another-spread-vs-concat), the spread syntax was 50% faster than concat. OTOH, with the speed of either of these, it seems likely that the O(n^2) is likely to be an actual impediment in many reasonable cases... in which case, I vote for clean code.

#### @Ry- 2019-06-10 20:57:51

@ScottSauyet: Is one of those supposed to be “unlikely”…?

#### @user633183 2019-06-10 21:08:37

@ScottSauyet, I admit that's surprising, especially because I over-simplified in my original comment. [a, ...b] is more like [a].concat(Array.from(b)) because spread argument accepts any iterable. We'd want to look at this with larger array inputs because big inputs are the reason we would optimise in the first place. Thanks everyone for discussion :)

#### @Ian 2019-06-11 14:15:18

@user633183 ES6 features are a great improvement to the language and should definitely be used, but this style is effectively implementing those features to such a manner that it obfuscates your code. Arrow functions, for example, have a specific intended purpose as callbacks, and they improve readability and syntax when used in that manner. Explicit function declarations still also have a purpose. Just because something can be shorter doesn't mean it should. Regardless it's your answer/your choice so I'll just respectfully disagree and sit aside.

#### @Ry- 2019-06-11 17:18:29

@bob: Discussing about performance is meaningless in a question that specifically asks for a better-performing solution? Interesting take. (I’m also sorry that my outright and specific disagreement is being interpreted as passive.)

#### @Scott Sauyet 2019-06-12 16:54:17

@bob: I don't see any particularly aggressive comments on any of the answers here. I found the discussions here quite interesting, and I like the variety of answers.

#### @bob 2019-06-12 18:39:49

@ScottSauyet I don't want to open this again, but when you look at his answer, it is the less general version of this one. So a justified criticism would've been "please don't use concat/spread for arrays containing lots of data, because it is incredibly inefficient." But downvoting this answer and waltzing in with the big-O hammer without any prior attempt to settle this isn't the right way. Please note that another user felt encouraged to change this answer extensively, because the discussion provided a breeding ground for it. Anyway, thank you for getting me to reconsider my actions, Scott.

#### @Scott Sauyet 2019-06-12 18:46:54

@bob: wow, I hadn't seen Ian's drastic and destructive edit to this answer. I imagine there might have been (now-deleted) comments regarding that. If there were, I didn't see them either, although I did wonder about the comment directed at Ian by user633183.

#### @user633183 2019-06-12 19:57:46

Ian never commented; I just @'d Ian to remark on the edit that was made. I am still learning with respect to computational complexity, and the note from Ry was helpful, to me at least. SO is an open forum and I welcome everyone's comments and criticisms.

#### @Orionis 2019-06-10 16:26:05

Here is a simple answer using a recursive function.

var array = [ 0, 1, 2, 3, 4, 5 ];

function sumArray(arrayToSum, index){
  if(index < arrayToSum.length - 1){
    arrayToSum[index + 1] = arrayToSum[index] + arrayToSum[index + 1];
    return sumArray(arrayToSum, index + 1);
  } else {
    return arrayToSum;
  }
}
sumArray(array, 0);

console.log(array);

#### @Peter Cordes 2019-06-10 18:19:45

This has zero advantages and requires the JIT-compiler to optimize away the recursion, otherwise it could easily overflow the call stack for large arrays. Also it stores the running total to the array and reads it back, instead of keeping it in a variable.

#### @Krzysztof Atłasik 2019-06-10 06:56:46

flatMap won't be useful in your case, because you're not trying to flatten partial results that come as lists, but we can probably solve your problem with a single reduce:

[0, 1, 2, 3, 4, 5]
  .reduce(
    ([arr, sum], el) => { // we pass along the array and the running sum
      const next = sum + el
      return [[...arr, next], next]
    },
    [[], 0] // seed the reduce with an empty array and a zero accumulator for the running sum
  )[0] // reduce returns [array, lastSum], so take only the first element

It also iterates the array only once, so it might be a little more performant than a solution that creates slices and then sums them.

Or a version with array.push, which reuses the same array:

[0, 1, 2, 3, 4, 5]
  .reduce(
    ([arr, sum], el) => { // we pass along the array and the running sum
      const next = sum + el
      arr.push(next)
      return [arr, next]
    },
    [[], 0] // seed the reduce with an empty array and a zero accumulator for the running sum
  )[0]

#### @Ry- 2019-06-10 15:15:08

This will be equally slow (O(n²)) since it copies the array each time it wants to add something to it.

#### @Krzysztof Atłasik 2019-06-10 15:48:35

@Ry- you can easily replace it with push, but it will be mutating the array, which would make it not purely functional. Have you seen the functional-programming tag on the question?

#### @Ry- 2019-06-10 15:51:26

Nothing is purely functional. You own the array. It’s completely fine.

#### @Krzysztof Atłasik 2019-06-10 15:54:35

@Ry- Not really, you can have purely functional methods. But as you mentioned, a version using push is referentially transparent, so it's also fine. Edited my answer to include the second version.

#### @Steve 2019-06-10 11:29:38

If you're asking whether there is a faster or more efficient way, then the other answers are sufficient.

However, I'd argue that something similar to your current solution is easier to read and more declarative if we phrase it as a mapping function.

Specifically something like "Map each value to itself plus all the previous values in the array".

You could use a filter, as you have done in your question, but I think a slice is clearer.

const x = [ 0, 1, 2, 3, 4, 5 ];

// A common generic helper function
const sum = (acc, val) => acc + val

const sums = x.map((val, i, self) => val + self.slice(0, i).reduce(sum, 0))

#### @Ry- 2019-06-10 15:46:46

This describes CertainPerformance’s answer, by the way.

#### @Steve 2019-06-10 16:20:40

The code uses the same approach but I don't think that answer explains the value of the approach. Answers should be more than just the code.

#### @Pablo Lozano 2019-06-10 10:04:07

At every step you just need to add the current value to the previous result, so you could use a simple reduce.

const array = [0, 1, 2, 3, 4, 5, 6];

const sums = array.reduce((acc, current, index) => {
  const prev = acc.length ? acc[index - 1] : 0;
  acc.push(prev + current);
  return acc;
}, []);

console.log(sums.toString());

#### @Code Maniac 2019-06-10 01:46:07

You can simply use a for loop with a variable to keep track of the last sum

let x = [ 0, 1, 2, 3, 4, 5 ]

let sum = (arr) => {
  let sum = 0
  let final = []
  for(let i = 0; i < arr.length; i++){
    sum += arr[i]
    final.push(sum)
  }
  return final
}

console.log(sum(x))

You can also use map:

let x = [0, 1, 2, 3, 4, 5]

let sum = (arr) => {
  let sum = 0
  return arr.map(current => sum += current)
}

console.log(sum(x))

#### @Ry- 2019-06-10 15:15:56

That second code snippet looks strikingly familiar, down to the fact that it’s hidden.

#### @Code Maniac 2019-06-10 16:13:26

@Ry- and the line above it states you can also use map. All I wanted was to show both the conventional and the functional approach. If people feel that's a valid reason to downvote, I am happy with it :)

#### @Ry- 2019-06-10 00:34:20

Much, by keeping a running total:

function* partialSums(iterable) {
  let s = 0;

  for (const x of iterable) {
    s += x;
    yield s;
  }
}

const x = [0, 1, 2, 3, 4, 5];
console.log(Array.from(partialSums(x)).join(', '));

Linear time, online. (You can also produce an array directly; expand below.)

const partialSums = arr => {
  let s = 0;
  return arr.map(x => s += x);
};

const x = [0, 1, 2, 3, 4, 5];
console.log(partialSums(x).join(', '));

#### @LogicalBranch 2019-06-10 15:13:20

Urgh... yield is such a weak word. Add each item to an array and return it like a man.

#### @Ry- 2019-06-10 15:14:03

@LogicalBranch: That wouldn’t be online, so you may as well use the second code snippet then.

#### @spender 2019-06-10 19:20:35

@LogicalBranch Why? Iterators FTW in 2019, no?

#### @HSchmale 2019-06-11 02:28:20

@LogicalBranch Iterators are the way to go when doing partial sums if you have a large array or you only needs up to a certain value. Why do you dislike yield? I know it's not as in your face as return but it is descriptive of what it's doing.
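A sketch of the early-termination case this comment describes, reusing the answer's partialSums generator (firstSumOver and its limit are illustrative names):

```javascript
function* partialSums(iterable) {
  let s = 0;
  for (const x of iterable) {
    s += x;
    yield s;
  }
}

// Stop consuming as soon as the running total exceeds a limit;
// elements after that point are never touched, which matters for huge inputs.
const firstSumOver = (iterable, limit) => {
  for (const s of partialSums(iterable)) {
    if (s > limit) return s;
  }
};

console.log(firstSumOver([0, 1, 2, 3, 4, 5], 5)); // 6
```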

#### @bob 2019-06-11 11:26:33

This is a nice imperative answer. However, it doesn't fit a question tagged with functional programming. This is merely a 1:1 allocation of a specific helper function to a very specific problem. Zero generalization, nothing learned.

#### @Ry- 2019-06-11 15:16:42

@bob: It generalizes well to scan and it’s FP because it’s a function. What’s learned: cargo cult FP by shoehorning any task into the magic array functions and avoiding all mutation no matter how local in JavaScript is harmful.
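The substitution described here, sketched as a generic generator scan (the names are illustrative):

```javascript
// Replacing `s += x` with `acc = fn(acc, x)` turns the running-sum
// generator into a general scan over any combining function.
function* scan(fn, acc, iterable) {
  for (const x of iterable) {
    acc = fn(acc, x);
    yield acc;
  }
}

const add = (a, b) => a + b;

console.log([...scan(add, 0, [0, 1, 2, 3, 4, 5])]);     // [ 0, 1, 3, 6, 10, 15 ]
console.log([...scan(Math.max, -Infinity, [2, 1, 3])]); // [ 2, 2, 3 ]
```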

#### @Patrick Francis Omogbeme 2019-06-11 17:04:45

@Ry- I just knew there would be a concise approach that I was missing and you hit it. Thanks aplenty.

#### @bob 2019-06-11 19:14:12

@Ry- Umm, so in the end you wind up with scan, which is implemented in another answer to this very question, and which you happened to downvote, I guess, as you downvoted at least one other answer.

#### @Ry- 2019-06-11 19:22:16

@bob: I removed my downvote after the non-quadratic-time version was added (even though it isn’t functional by your definition). But yes, the nice way to end up with scan is by replacing s += x with acc = fn(acc, x) in this generator function.

#### @bob 2019-06-11 19:25:48

@Ry- Cargo cult, what a nice academic term. However, no one seriously says local mutations are harmful. No one seriously says there is purity in referential-identity JavaScript. I would even claim that Array.prototype.concat is one of the most harmful methods in JS, since it pretends that arrays are persistent data structures. There is a lot of justified criticism of FP in JS. So please chime in; justified criticism is always appreciated.

#### @Ry- 2019-06-11 19:30:26

@bob: So if you agree that repeated concat (and equivalently, spread) are harmful ways to pretend one’s being more functional while making complicated solutions to problems that can be solved by creating flatMap or scan, and that local mutations are good ways to implement necessary functions in practical JavaScript… what is it that you take issue with?

#### @CertainPerformance 2019-06-10 00:14:05

One option is to use a single .map which uses .reduce inside to sum up the sliced partial array:

const x = [0, 1, 2, 3, 4, 5];

const sum = (x, y) => x + y;
const partialSums = x.map((_, i, arr) => arr.slice(0, i + 1).reduce(sum));
console.log(partialSums);
