Washing your code: avoid mutation

Artem Sapegin - Mar 25 '20 - - Dev Community

You’re reading an excerpt of my upcoming book on clean code, “Washing your code: write once, read seven times.” Preorder it on Leanpub or read a draft online.


Mutations happen when we change a JavaScript object or array without creating a new variable or reassigning an existing one:

const puppy = {
  name: 'Dessi',
  age: 9
};
puppy.age = 10;

Here we’re mutating the original puppy object by changing its age property.

Mutations are often problematic. Consider this function:

function printSortedArray(array) {
  array.sort();
  for (const item of array) {
    console.log(item);
  }
}

The problem here is that the .sort() array method mutates the array we’re passing into our function, likely not what we’d expect when calling a function named printSortedArray.
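
For example (a quick sketch with a made-up dogs array), the caller's array is sorted in place as a side effect:

const dogs = ['Rex', 'Bublik', 'Astra'];
printSortedArray(dogs);
// Prints Astra, Bublik, Rex
console.log(dogs);
// -> ['Astra', 'Bublik', 'Rex'], the caller's array was sorted in place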

Some of the problems with mutation:

  • Mutation may lead to unexpected and hard-to-debug issues, where data becomes incorrect somewhere, and you have no idea where it happens.
  • Mutation makes code harder to understand: at any time, an array or object may have a different value, so we need to be very careful when reading the code.
  • Mutation of function arguments makes the behavior of a function surprising.

Immutability, or immutable data structures, means that to change a value we have to create a new array or object instead of modifying the existing one, and it would solve this problem. Unfortunately, JavaScript doesn’t support immutability natively, and all available workarounds are more crutches than actual solutions. But even just avoiding mutations in our code makes it easier to understand.

Also, don’t forget that const in JavaScript only prevents reassignments — not mutations. We’ve discussed reassignments in the previous chapter, Avoid reassigning variables.
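
A minimal sketch of the difference:

const puppy = { name: 'Dessi', age: 9 };
puppy.age = 10; // Mutation: allowed, even though puppy is a const
// puppy = { name: 'Dessi', age: 10 };
// ^ Reassignment: TypeError: Assignment to constant variable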

Avoid mutating operations

One of the most common use cases for mutation is updating an object:

function parseExample(content, lang, modifiers) {
  const example = {
    content,
    lang
  };

  if (modifiers) {
    if (hasStringModifiers(modifiers)) {
      example.settings = modifiers
        .split(' ')
        .reduce((obj, modifier) => {
          obj[modifier] = true;
          return obj;
        }, {});
    } else {
      try {
        example.settings = JSON.parse(modifiers);
      } catch (err) {
        return {
          error: `Cannot parse modifiers`
        };
      }
    }
  }

  return example;
}

Here we’re creating an object with three fields, one of which, settings, is optional. We’re adding that optional field by mutating the initial example object.

I prefer to see the whole object shape in a single place instead of having to read the whole function to find all possible object shape variations. Usually it doesn’t matter whether a property has an undefined value or doesn’t exist at all. I haven’t seen many cases where it mattered for a good reason.

We also have a special error case here that returns an entirely different object with a lone error property. But it’s really a special case: none of the properties of the two objects overlap, and it doesn’t make sense to merge them.

I use ternaries for simple cases, and extract code to a function for more complex cases. Here we have a good case for the latter because of a nested condition and a try/catch block.

Let’s refactor it:

function getSettings(modifiers) {
  if (!modifiers) {
    return undefined;
  }

  if (hasStringModifiers(modifiers)) {
    return modifiers.split(' ').reduce((obj, modifier) => {
      obj[modifier] = true;
      return obj;
    }, {});
  }

  return JSON.parse(modifiers);
}

function parseExample(content, lang, modifiers) {
  try {
    return {
      content,
      lang,
      settings: getSettings(modifiers)
    };
  } catch (err) {
    return {
      error: `Cannot parse modifiers`
    };
  }
}

Now it’s easier to understand what the code does, and the possible shapes of the return object are clear. We’ve also removed all mutations and reduced nesting a little.
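
For example, assuming hasStringModifiers() detects space-separated word modifiers (its implementation isn’t shown in this excerpt, and the content, lang, and modifier values below are made up), the possible return shapes look like this:

parseExample('<Button />', 'jsx', 'noeditor padded');
// -> { content: '<Button />', lang: 'jsx', settings: { noeditor: true, padded: true } }
parseExample('<Button />', 'jsx', '{"padded": true}');
// -> { content: '<Button />', lang: 'jsx', settings: { padded: true } }
parseExample('<Button />', 'jsx', '{oops');
// -> { error: 'Cannot parse modifiers' }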

Beware of the mutating array methods

Not all methods in JavaScript return a new array or object. Some methods mutate the original value in place. For example, .push() is one of the most commonly used mutating methods.

Replacing imperative code, full of loops and conditions, with declarative code is one of my favorite refactorings. And one of the most common suggestions I give in code reviews.

Consider this code:

const generateOptionalRows = () => {
  const rows = [];

  if (product1.colors.length + product2.colors.length > 0) {
    rows.push({
      row: 'Colors',
      product1: <ProductOptions options={product1.colors} />,
      product2: <ProductOptions options={product2.colors} />
    });
  }

  if (product1.sizes.length + product2.sizes.length > 0) {
    rows.push({
      row: 'Sizes',
      product1: <ProductOptions options={product1.sizes} />,
      product2: <ProductOptions options={product2.sizes} />
    });
  }

  return rows;
};

const rows = [
  {
    row: 'Name',
    product1: <Text>{product1.name}</Text>,
    product2: <Text>{product2.name}</Text>
  },
  // More rows...
  ...generateOptionalRows()
];

Here we have two ways of defining table rows: a plain array with always visible rows, and a function that returns optional rows. The latter builds its result imperatively by mutating a local array with the .push() method.

Array mutation itself isn’t the most significant issue of this code. However, code with mutations likely hides other issues, and mutation is a good sign to look closer. Here the main problem is imperative array building combined with two different ways of handling required and optional rows. Replacing imperative code with declarative code and eliminating conditions often makes code more readable and maintainable.

Let’s merge all possible rows into a single declarative array:

const rows = [
  {
    row: 'Name',
    product1: <Text>{product1.name}</Text>,
    product2: <Text>{product2.name}</Text>
  },
  // More rows...
  {
    row: 'Colors',
    product1: <ProductOptions options={product1.colors} />,
    product2: <ProductOptions options={product2.colors} />,
    isVisible: (product1, product2) =>
      product1.colors.length > 0 || product2.colors.length > 0
  },
  {
    row: 'Sizes',
    product1: <ProductOptions options={product1.sizes} />,
    product2: <ProductOptions options={product2.sizes} />,
    isVisible: (product1, product2) =>
      product1.sizes.length > 0 || product2.sizes.length > 0
  }
];

const visibleRows = rows.filter(row => {
  if (typeof row.isVisible === 'function') {
    return row.isVisible(product1, product2);
  }
  return true;
});

Now we’re defining all rows in a single array. All rows are visible by default unless they have the isVisible function that returns false. We’ve improved code readability and maintainability:

  • there’s only one way of defining rows;
  • no need to check two places to see all available rows;
  • no need to decide which method to use to add a new row;
  • easier to make an existing row optional by adding an isVisible function to it.

Here’s another example:

const defaults = { ...options };
const prompts = [];
const parameters = Object.entries(task.parameters);

for (const [name, prompt] of parameters) {
  const hasInitial = typeof prompt.initial !== 'undefined';
  const hasDefault = typeof defaults[name] !== 'undefined';

  if (hasInitial && !hasDefault) {
    defaults[name] = prompt.initial;
  }

  prompts.push({ ...prompt, name, initial: defaults[name] });
}

At first sight, this code doesn’t look very bad: it converts an object into an array by pushing new items into the prompts array. But if we take a closer look, there’s another mutation inside a condition in the middle that mutates the defaults object. And this is a bigger problem because it’s easy to miss while reading the code.

The code is actually doing two loops: one to convert the task.parameters object to the prompts array, and another to update defaults with values from task.parameters. I’d split them to make it clear:

const parameters = Object.entries(task.parameters);

const defaults = parameters.reduce(
  (acc, [name, prompt]) => ({
    ...acc,
    [name]:
      options[name] !== undefined ? options[name] : prompt.initial
  }),
  {}
);

const prompts = parameters.map(([name, prompt]) => ({
  ...prompt,
  name,
  initial: defaults[name]
}));

Other mutating array methods to watch out for are .copyWithin(), .fill(), .pop(), .push(), .reverse(), .shift(), .sort(), .splice(), and .unshift().
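
Most of these methods have non-mutating counterparts; here’s a sketch of common replacements (all the copies are shallow):

const array = [3, 1, 2];

// Instead of array.push(4) and array.unshift(0):
const appended = [...array, 4];
const prepended = [0, ...array];

// Instead of array.sort() and array.reverse():
const sorted = [...array].sort();
const reversed = [...array].reverse();

// Instead of array.splice(1, 1):
const removed = [...array.slice(0, 1), ...array.slice(2)];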

Avoid mutation of function arguments

Objects or arrays that are passed to a function can be mutated inside that function, and this affects the original object:

const mutate = object => {
  object.secret = 'Loves pizza';
};

const person = { name: 'Chuck Norris' };
mutate(person);
// -> { name: 'Chuck Norris', secret: 'Loves pizza' }

Here the person object is mutated inside the mutate function.

Function argument mutation can be intentional or accidental, and both are problematic:

  • It’s harder to understand how a function works and how to use it because it doesn’t return a value but changes one of the incoming arguments.
  • Accidental argument mutation is even worse because function consumers don’t expect it. And it can lead to hard-to-find bugs when a value that is mutated inside a function is later used somewhere else.

Consider this example:

const addIfGreaterThanZero = (list, count, message) => {
  if (count > 0) {
    list.push({
      id: message,
      count
    });
  }
};

const getMessageProps = (
  adults,
  children,
  infants,
  youths,
  seniors
) => {
  const messageProps = [];
  addIfGreaterThanZero(messageProps, adults, 'ADULTS');
  addIfGreaterThanZero(messageProps, children, 'CHILDREN');
  addIfGreaterThanZero(messageProps, infants, 'INFANTS');
  addIfGreaterThanZero(messageProps, youths, 'YOUTHS');
  addIfGreaterThanZero(messageProps, seniors, 'SENIORS');
  return messageProps;
};

It converts a bunch of number variables to a messageProps array that groups people of different ages with their count:

[
  {
    id: 'ADULTS',
    count: 7
  },
  {
    id: 'SENIORS',
    count: 2
  }
];

The problem with this code is that the addIfGreaterThanZero function mutates the array we’re passing to it. This is an example of an intentional mutation: it’s required for this function to work. However, it’s not the best API for what this function does.

We can change this function to return a new array instead:

const addIfGreaterThanZero = (list, count, message) => {
  if (count > 0) {
    return [
      ...list,
      {
        id: message,
        count
      }
    ];
  }
  return list;
};

But I don’t think we need this function at all:

const MESSAGE_IDS = [
  'ADULTS',
  'CHILDREN',
  'INFANTS',
  'YOUTHS',
  'SENIORS'
];
const getMessageProps = (
  adults,
  children,
  infants,
  youths,
  seniors
) => {
  return [adults, children, infants, youths, seniors]
    .map((count, index) => ({
      id: MESSAGE_IDS[index],
      count
    }))
    .filter(({ count }) => count > 0);
};

Now it’s easier to understand what the code does. There’s no repetition, and the intent is clear: the getMessageProps function converts a list of values to an array of objects and removes “empty” items.
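
For example, calling it with the counts from the output above:

getMessageProps(7, 0, 0, 0, 2);
// -> [{ id: 'ADULTS', count: 7 }, { id: 'SENIORS', count: 2 }]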

We can simplify it further:

const MESSAGE_IDS = [
  'ADULTS',
  'CHILDREN',
  'INFANTS',
  'YOUTHS',
  'SENIORS'
];
const getMessageProps = (...counts) => {
  return counts
    .map((count, index) => ({
      id: MESSAGE_IDS[index],
      count
    }))
    .filter(({ count }) => count > 0);
};

But this makes the function API less discoverable and can make editor autocomplete less useful. It also gives the wrong impression that the function accepts any number of arguments and that the count order is unimportant — the number and order of arguments were clear in the previous iteration.

We can also use the .reduce() method instead of .map() / .filter() chaining:

const MESSAGE_IDS = [
  'ADULTS',
  'CHILDREN',
  'INFANTS',
  'YOUTHS',
  'SENIORS'
];
const getMessageProps = (...counts) => {
  return counts.reduce((acc, count, index) => {
    if (count > 0) {
      acc.push({
        id: MESSAGE_IDS[index],
        count
      });
    }
    return acc;
  }, []);
};

I’m not a huge fan of .reduce() because it often makes code harder to read and intent less clear. With .map() / .filter() chaining, it’s clear that we’re first converting an array to another array with the same number of items, and then removing array items we don’t need. With .reduce() it’s less obvious.

So I’d stop two steps ago with this refactoring.

Probably the only valid reason to mutate function arguments is performance optimization: when you work with a huge piece of data, and creating a new object or array would be too slow. But like with all performance optimizations: measure first to know whether you actually have a problem, and avoid premature optimization.
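
When performance really is an issue, a common compromise is to keep mutation local: a value that’s created and filled inside a function can be mutated freely, because the caller never observes an intermediate state. A sketch (the numbers are made up):

// Local mutation: invisible outside the function
function buildRange(length) {
  const result = [];
  for (let index = 0; index < length; index += 1) {
    result.push(index * 2);
  }
  return result;
}

// A completely mutation-free version, result = [...result, index * 2] on every
// iteration, copies the whole array each time and becomes quadratic in length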

Make mutations explicit if you have to use them

Sometimes we can’t avoid mutations, for example, because of an unfortunate language API that does mutation.

Array’s .sort() method is an infamous example of that:

const counts = [6, 3, 2];
const puppies = counts.sort().map(n => `${n} puppies`);

This example gives the impression that the counts array isn’t changing, and we’re just creating a new puppies array with the sorted array. But the .sort() method returns a sorted array and mutates the original array at the same time. This kind of code is hazardous and can lead to hard-to-find bugs. Many developers don’t realize that the .sort() method is mutating because the code seems to work fine.
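
Continuing the example, the original array is changed even though we never touch it again:

console.log(puppies);
// -> ['2 puppies', '3 puppies', '6 puppies']
console.log(counts);
// -> [2, 3, 6], no longer the original [6, 3, 2]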

It’s better to make the mutation explicit:

const counts = [6, 3, 2];
const sortedCounts = [...counts].sort();
const puppies = sortedCounts.map(n => `${n} puppies`);

Here we’re making a shallow copy of the counts array using the spread syntax and then sorting it, so the original array stays the same.
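
counts.slice() or Array.from(counts) make the same kind of shallow copy if we can’t use the spread syntax:

const counts = [6, 3, 2];
const sortedCounts = counts.slice().sort(); // same as [...counts].sort()
const puppies = sortedCounts.map(n => `${n} puppies`);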

Another option is to wrap a mutating API into a new API that doesn’t mutate original values:

function sort(array) {
  return [...array].sort();
}

const counts = [6, 3, 2];
const puppies = sort(counts).map(n => `${n} puppies`);

Or use a third-party library, like Lodash and its sortBy function:

const counts = [6, 3, 2];
const puppies = _.sortBy(counts).map(n => `${n} puppies`);

Updating objects

Modern JavaScript makes it easier to do immutable data updates thanks to the spread syntax. Before the spread syntax we had to write something like:

const prev = { coffee: 1 };
const next = Object.assign({}, prev, { pizza: 42 });
// -> { coffee: 1, pizza: 42 }

Note the empty object as the first argument: it was necessary, because Object.assign treats the first argument as a target and would otherwise mutate the initial object. It mutates the first argument and also returns it, which is a very unfortunate API.

Now we can write:

const prev = { coffee: 1 };
const next = { ...prev, pizza: 42 };

This does the same thing but is less verbose, and there’s no need to remember Object.assign’s quirks.

And before Object.assign appeared in ECMAScript 2015, we didn’t even try to avoid mutations: it was too painful.

Redux has a great page on immutable update patterns: it describes patterns for updating arrays and objects without mutations, and it’s useful even if you don’t use Redux.
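
The gist of those patterns is to always produce a new array or object; a sketch of a few common ones (the items array is made up):

const items = ['coffee', 'pizza', 'beer'];

// Update an item by index
const updated = items.map((item, index) => (index === 1 ? 'salad' : item));

// Remove an item by index
const removed = items.filter((item, index) => index !== 1);

// Insert an item at an index
const inserted = [...items.slice(0, 1), 'salad', ...items.slice(1)];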

And still, spread syntax quickly gets incredibly verbose:

function addDrink(meals, drink) {
  return {
    ...meals,
    lunch: {
      ...meals.lunch,
      drinks: [...meals.lunch.drinks, drink]
    }
  };
}

We need to spread each level of the object to change a nested value; otherwise, we’ll overwrite the initial object with a new one:

function addDrink(meals, drink) {
  return {
    ...meals,
    lunch: {
      drinks: [drink]
    }
  };
}

Here we’re keeping only the first level of properties of the initial object: lunch and drinks will have only the new properties.

Also, the spread syntax and Object.assign only do shallow cloning: only the first-level properties are copied, while all nested objects are shared by reference, so mutating a nested property also mutates the original object.
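
A minimal sketch of the trap:

const meals = { lunch: { drinks: ['coffee'] } };
const copy = { ...meals };

copy.lunch.drinks.push('vodka');
console.log(meals.lunch.drinks);
// -> ['coffee', 'vodka'], the original is mutated because lunch is shared by reference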

Keeping your objects as shallow as possible might be a good idea if you update them often.

While we’re waiting for JavaScript to get native immutability, there are two non-exclusive ways we can make our lives easier today:

  • prevent mutations;
  • simplify object updates.

Preventing mutations is good because it’s so easy to miss them during code reviews, and then spend many hours debugging weird issues.

One way to prevent mutations is to use a linter. ESLint has several plugins that try to do just that, and we’ll discuss them in the Tooling chapter.

eslint-plugin-better-mutation disallows any mutations, except for local variables in functions. This is a great idea because it prevents bugs caused by the mutation of shared objects but allows you to use mutations locally. Unfortunately, it breaks even in simple cases, such as a mutation occurring inside .forEach().

Another way to prevent mutations is to mark all objects and arrays as read-only in TypeScript or Flow.

For example, using the readonly modifier in TypeScript:

interface Point {
  readonly x: number;
  readonly y: number;
}

Or using the Readonly utility type:

type Point = Readonly<{
  x: number;
  y: number;
}>;

And similar for arrays:

function sort(array: readonly any[]) {
  return [...array].sort();
}

Note that both readonly modifier and Readonly utility type are shallow, so we need to add them to all nested objects too.
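
For example, every level needs its own annotation to make nested values read-only too (a sketch with a made-up Settings type):

type Settings = Readonly<{
  theme: Readonly<{
    color: string;
    fontSize: number;
  }>;
  tags: readonly string[];
}>;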

eslint-plugin-functional has a rule to require read-only types everywhere, which may be more convenient than remembering to do that yourself. Unfortunately, it only supports the readonly modifier but not the Readonly utility type.

I think it’s a good idea, because there’s no runtime cost, though it makes type definitions more verbose.

I’d prefer an option in TypeScript to make all types read-only by default with a way to opt out.

Similar to making objects read-only on the type level, we can make them read-only at runtime with Object.freeze. Object.freeze is also shallow, so we’d have to use a library like deep-freeze to ensure that nested objects are also frozen, and we might want to have freezing only in development since it can otherwise slow our app down.
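
A quick sketch of how shallow freezing behaves (the failed write is silently ignored in non-strict mode and throws in strict mode):

const point = Object.freeze({ x: 1, nested: { y: 2 } });

point.x = 10; // Ignored (or a TypeError in strict mode): x is frozen
point.nested.y = 20; // Works: Object.freeze is shallow
console.log(point);
// -> { x: 1, nested: { y: 20 } }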

I don’t think freezing is worth it on its own unless it is part of another library.

Simplifying object updates is another option that we can combine with mutation prevention.

The most popular way to simplify object updates is to use the Immutable.js library:

import { Map } from 'immutable';
const map1 = Map({ food: 'pizza', drink: 'coffee' });
const map2 = map1.set('drink', 'vodka');
// -> Map({ food: 'pizza', drink: 'vodka' })

I’m not a big fan of it because it has a completely custom API that one has to learn. Also, converting arrays and objects from plain JavaScript to Immutable.js and back every time we need to work with any native JavaScript API, or almost any third-party API, is annoying. It feels like Immutable.js creates more problems than it solves.

Another option is Immer, which allows you to use any mutating operations on a draft version of an object without affecting the original object in any way. Immer intercepts each operation and creates a new object:

import produce from 'immer';
const map1 = { food: 'pizza', drink: 'coffee' };
const map2 = produce(map1, draftState => {
  draftState.drink = 'vodka';
});
// -> { food: 'pizza', drink: 'vodka' }

And Immer will freeze the resulting object in development.

Even mutation is not so bad sometimes

In rare cases, imperative code with mutations isn’t so bad, and rewriting it in a declarative way without mutations doesn’t make it better.

Consider this example:

const getDateRange = (startDate, endDate) => {
  const dateArray = [];
  let currentDate = startDate;
  while (currentDate <= endDate) {
    dateArray.push(currentDate);
    currentDate = addDays(currentDate, 1);
  }
  return dateArray;
};

Here we’re making an array of dates to fill a given date range.

I don’t have good ideas on how to rewrite this code without an imperative loop, reassignment, and mutation. And here we can live with this:

  • all “bad” things are isolated in a small function;
  • the function has a meaningful name;
  • the code is clear enough;
  • the function is pure: it doesn’t have any internal state and avoids mutating its arguments.

It’s better to have simple and clear code with mutations than complex and messy code without them. But if you do use mutations, it’s wise to isolate them to a small function with a meaningful name and clear API.


Start thinking about:

  • Rewriting imperative code with mutations in a pure declarative way to improve its readability.
  • Keeping the complete object shape in a single place; when you create a new object, make its shape as clear as possible.
  • Deduplicating logic and separating “what” from “how.”
  • Avoiding mutation of function arguments to prevent hard-to-find bugs.
  • Using .map() / .filter() chaining instead of .reduce().
  • Making mutations explicit if you have to use them.
  • Preventing mutations in your code using a linter or read-only types.

If you have any feedback, tweet me, open an issue on GitHub, or email me at artem@sapegin.ru. Preorder the book on Leanpub or read a draft online.
