
JavaScript's Array Reduce: Mutable vs. Immutable Accumulators

July 11, 2023

6 min read

Introduction


JavaScript's reduce array method can be incredibly useful for transforming data and building data structures while iterating through an existing one. It is great for building arrays, objects, or strings, or for keeping a running count. What separates .reduce() from other array methods is that each iteration has access to the accumulator returned by the previous iteration, which allows you to keep building towards the final output.
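For example, here is a running total built up through the accumulator:

TypeScript

// Each callback invocation receives the accumulator returned by the
// previous iteration, here a running total of the array's values.
const total = [1, 2, 3, 4].reduce((sum, n) => sum + n, 0);
console.log(total); // 10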

JavaScript is agnostic about the programming paradigms you choose to implement. It allows the mutation of objects and arrays, even when they are declared with the const keyword. This can be unsettling for functional programmers, but convenient for others.
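For example, this is perfectly legal JavaScript:

TypeScript

const list = [1, 2, 3];
list.push(4); // allowed: const only prevents reassignment, not mutation
// list = []; // TypeError: Assignment to constant variable.
console.log(list); // [1, 2, 3, 4]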

Test methods


For this experiment, I wanted to explore four different ways of interacting with the accumulator. The first is to mutate the accumulator directly and return it for the next iteration. The second is to create a shallow clone and return that new copy for the next iteration. The third is to use a third-party library called immer to "mutate" a draft of the accumulator, and the fourth is to use Lodash's cloneDeep to deep-clone the accumulator on each iteration. I mainly want to test convenience, reliability, and performance.

Benchmark function


To set up the performance testing, I have written a custom benchmark function:

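A sketch of such a helper might look like this, where benchmark runs a callback a given number of times and reports the elapsed milliseconds via performance.now() (the exact name, signature, and logging format are assumptions):

TypeScript

// Runs `fn` `iterations` times and returns the elapsed time in
// milliseconds, logging it along the way.
const benchmark = (label: string, iterations: number, fn: () => void): number => {
  const start = performance.now();
  for (let i = 0; i < iterations; i++) {
    fn();
  }
  const elapsed = performance.now() - start;
  console.log(`${label}: ${elapsed.toFixed(2)}ms`);
  return elapsed;
};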

Here is some more setup code to emulate an API call:

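A sketch of what that setup might look like, assuming a flat ApiResponse shape with three properties and an apiCalls array standing in for a batch of responses (both names are hypothetical):

TypeScript

// A mock API response: a fairly flat object with three properties.
interface ApiResponse {
  id: number;
  success: boolean;
  data: string;
}

// Emulate a batch of API calls, roughly half of which succeed.
const apiCalls: ApiResponse[] = Array.from({ length: 10 }, (_, i) => ({
  id: i,
  success: i % 2 === 0,
  data: `payload-${i}`,
}));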

Experiment 1: Mutate directly


This implementation pushes successful API calls directly onto the accumulator array and returns it. All existing arrays and objects are subject to direct mutation. This is the most convenient implementation.

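A sketch of this reducer, building on the setup above (mutateDirectly is a hypothetical name):

TypeScript

const mutateDirectly = (calls: ApiResponse[]): ApiResponse[] =>
  calls.reduce<ApiResponse[]>((acc, call) => {
    if (call.success) {
      // Mutate the accumulator in place...
      acc.push(call);
    }
    // ...and hand the same reference to the next iteration.
    return acc;
  }, []);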

Experiment 2: Shallow clone


This implementation returns a shallow copy of the list. Nested objects and arrays are not cloned and are still subject to mutation; the shallow clone only creates a new reference for the root array.

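A sketch of the same reducer returning a fresh top-level array on each iteration via the spread operator:

TypeScript

const shallowClone = (calls: ApiResponse[]): ApiResponse[] =>
  calls.reduce<ApiResponse[]>((acc, call) => {
    if (call.success) {
      // A brand-new array, but the elements inside it are still
      // the same references as before.
      return [...acc, call];
    }
    return acc;
  }, []);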

Experiment 3: Immer


Immer's produce function creates a draft that can be mutated directly. When the callback finishes, the final draft is returned to the reducer as a new immutable copy; under the hood, Immer uses JavaScript proxies to make this happen. There is a bit more code involved here, as an additional produce() call wraps each iteration. However, everything that happens inside the callback (the second argument of produce) is completely safe from mutating the input list.

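A sketch using immer's produce on each iteration (withImmer is a hypothetical name):

TypeScript

import { produce } from "immer";

const withImmer = (calls: ApiResponse[]): ApiResponse[] =>
  calls.reduce<ApiResponse[]>(
    (acc, call) =>
      produce(acc, (draft) => {
        // The draft can be "mutated" freely; produce returns a new
        // immutable copy and leaves `acc` untouched.
        if (call.success) {
          draft.push(call);
        }
      }),
    []
  );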

Experiment 4: Lodash cloneDeep


This implementation recursively clones every nested property of the array. The deep clone can then be mutated and returned to the reducer. It is a very safe approach, but it comes with the inconvenience of introducing another library.

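A sketch using Lodash's cloneDeep to copy the accumulator before mutating it (withCloneDeep is a hypothetical name):

TypeScript

import { cloneDeep } from "lodash";

const withCloneDeep = (calls: ApiResponse[]): ApiResponse[] =>
  calls.reduce<ApiResponse[]>((acc, call) => {
    // Recursively clone the accumulator, nested properties included.
    const copy = cloneDeep(acc);
    if (call.success) {
      copy.push(call);
    }
    return copy;
  }, []);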

Running the experiment


Each reduce function is run one million times. The benchmark function measures the start and end times and returns the milliseconds between them. This is repeated 5 times for consistency and to make sure there aren't any outliers or edge cases.
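A sketch of the harness, wiring the reducers above into the benchmark helper (labels and constant names are assumptions):

TypeScript

const ITERATIONS = 1_000_000;

// One million iterations per implementation, per run.
benchmark("mutate directly", ITERATIONS, () => mutateDirectly(apiCalls));
benchmark("shallow clone", ITERATIONS, () => shallowClone(apiCalls));
benchmark("immer", ITERATIONS, () => withImmer(apiCalls));
benchmark("lodash cloneDeep", ITERATIONS, () => withCloneDeep(apiCalls));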

Run #1


Run #2


Run #3


Run #4


Run #5


Results


Each run exhibited similar, consistent results. With so little variance, we can safely take run #5 for analysis.

The direct mutation implementation was by far the quickest, getting through one million iterations in just 29ms, compared to the immer implementation, which took 14.3 seconds. The shallow clone implementation is about 4x slower than direct mutation and about 4x faster than the lodash cloneDeep implementation.

I can imagine the immer and deep clone implementations falling even further behind as the levels of nesting increase; in this case, we only had fairly flat objects with 3 properties each. The shallow clone does not do too badly, and I suspect its overhead comes from allocating a new array every time the spread operator is used. Don't forget, on every iteration a new array is created in memory and almost immediately thrown away, whereas the mutable implementation reuses the same array reference on each iteration.

Conclusion


So which is the preferred way to reduce an array?

If you are a strict functional programmer and want complete safety, then a deep clone implementation is for you. Using Lodash's cloneDeep method seems to be twice as quick as Immer's implementation.

If you care about performance and have hundreds of thousands of items to iterate over, I would recommend the shallow clone or direct mutation. Both can potentially be just as unsafe; you just need to make sure that the accumulator is only used within the scope of the reducer function. Keep in mind that JavaScript is single-threaded, so there is no risk of multiple threads contending for the input array.

If your arrays are small and the objects nested inside them are fairly flat, then it doesn't matter much which implementation you use. It comes down to what your team is most comfortable with, what is most convenient to reason about, and what gives you the most confidence in producing bug-free code.
