This is part of a series called “Refactoring JavaScript: Collection Pipelines.” You can read the introduction here.

In a previous article, I showed how simple collection pipelines can improve the readability of your code, using an example of IP validation.

const validateIP = (ip: string): boolean => {
  const numbers = ip.split('.');
  return numbers.length === 4
    && numbers
      .filter((x) => Number(x).toString() === x)
      .map((x) => parseInt(x, 10))
      .filter((x) => x >= 0 && x <= 255)
      .length === 4;
};

This code is nice because it breaks the validation down into discrete steps. Each step pipes a new collection into the next stage of the pipeline, which makes it very easy to reason about the flow of data and how it changes at each step.

This approach does have a downside, though. Each step of the pipeline needs to iterate over the entire collection passed through by the previous step.

Our first filter() has to examine all four items in the original collection; the resulting array is passed to map(), which again iterates over the whole collection before handing it to another filter(), which does the same. That's up to 12 operations (a maximum of four for each filter/map stage). What if we could combine these all into a single step?
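As a preview, here is one possible shape of such a single pass using reduce() (introduced below). This is a sketch, not the article's final version, and the names are illustrative:

```javascript
// Fold all three stages (numeric check, parse, range check) into one
// pass: count the octets that survive every check in a single reduce().
const validateIPSinglePass = (ip) => {
  const parts = ip.split('.');
  const validCount = parts.reduce(
    (count, x) =>
      Number(x).toString() === x && Number(x) >= 0 && Number(x) <= 255
        ? count + 1
        : count,
    0
  );
  return parts.length === 4 && validCount === 4;
};
```

Each element is inspected exactly once, so the work is at most four operations instead of up to twelve.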

The typical example of a reduce function is to take an array of numbers and sum them. It looks something like this:

const total = (arr) => arr.reduce((total, val) => total + val, 0);
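Used on a sample array (the values here are purely illustrative):

```javascript
// The reduce-based summing helper from above: the accumulator starts
// at 0 and each iteration adds the current value to it.
const total = (arr) => arr.reduce((total, val) => total + val, 0);

console.log(total([1, 2, 3, 4])); // 10
```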

This is easy to understand but not very useful.

reduce() is capable of much more than summing numbers. It iterates over a collection once, applying a callback at each step while giving you access to the accumulated result of all previous iterations, and returns a single value. If you think about it, this gives you a lot of power. For example, you can use reduce() instead of map().
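To make that concrete, here is a sketch of map() implemented with reduce(); the function name is illustrative:

```javascript
// map() via reduce(): start with an empty array as the accumulator and
// push the transformed value on each iteration.
const mapWithReduce = (arr, fn) =>
  arr.reduce((acc, val) => {
    acc.push(fn(val));
    return acc;
  }, []);

const doubled = mapWithReduce([1, 2, 3], (x) => x * 2);
// doubled is [2, 4, 6]
```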


The Power of Reduce()