Since I started working as a software engineer, I have heard and read many theories about the perfect percentage of code coverage to achieve with your team.
This number varies greatly depending on the language, technology, and whether you're doing front-end or back-end. Some would argue that testing React components is not necessary, while others would put more effort into the business logic and the server capabilities.
For many years, I have advocated for what makes sense to me: it's not about the number; it's about testing the critical features. The goal is to be confident that your changes will not introduce breaking changes.
But, in a recent engagement, a line in the contract stated otherwise: everything has to be unit tested to 100%.
After going through the stages of grief one by one, I finally accepted it: I needed to reach 100% coverage, and everything excluded from coverage would have to be thoroughly justified.
The goal here is not to discuss whether it is the right number or approach, but to share the tips that helped the team achieve it and the lessons learned along the way.
After a few months, this is how we made it.
Start immediately
First things first: start on day one. The last thing you want to do is deliver some features before getting started. When setting up the project, repository, and everything else, take the time to set up the testing framework and configure it. Only then can you start delivering. Starting with 100% coverage and maintaining it through the project is easier than starting from zero after writing thousands of lines of code.
Make it easy and appealing
From a developer experience point of view, there are plenty of tools out there that can help on your journey. The goal is to get rid of those moments where you are just waiting for the tests to finish running before you can commit or merge your code.
I recently discovered and used Vitest UI, which adds a pleasing interface to testing and makes it visually appealing rather than a wall of shell output.
Locally, I strongly encourage everyone to set up and use Husky to run tests, at least before pushing your code. But because it can be skipped (I see you nasty developers đź‘€), I instead recommend enforcing coverage in your pipeline.
For continuous integration, I used Vitest Coverage Report for GitHub Actions. Every time a pull request is opened, coverage runs and the results are displayed. If it does not meet the threshold (i.e. 100%), the pipeline fails, showing a nice ❌ and disabling the merge button.
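To make the run itself fail below the threshold, the limits can be declared directly in the Vitest configuration. The following is a minimal sketch, not the project's actual file; the provider choice and reporter list are assumptions to adjust to your own setup:

```typescript
// vitest.config.ts — hypothetical sketch of a 100% threshold setup.
import { defineConfig } from "vitest/config";

export default defineConfig({
  test: {
    coverage: {
      provider: "v8", // assumption; "istanbul" also works
      reporter: ["text", "json-summary", "json"],
      // Any metric dropping below 100 makes the run (and the CI job) fail.
      thresholds: {
        lines: 100,
        functions: 100,
        branches: 100,
        statements: 100,
      },
    },
  },
});
```

With this in place, the GitHub Action does not need its own threshold logic: a red pipeline simply mirrors what developers already see locally.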
Don't repeat yourself
Writing a test is no different from writing feature code: do not repeat yourself. A lot of the setup, mocks, and helper functions not only can but should be made reusable. Communicate with your team and share the knowledge so they don’t reinvent the wheel every time.
For example, Vitest mocks can be reused automatically when they are placed in a __mocks__ folder. It is a mechanism I use very often because it saves a lot of time for modules that are used everywhere.
Consider the following structure:
src/
  myModule.ts
  __mocks__/
    myModule.ts
  __tests__/
    myModule.test.ts
And the following example:
// src/myModule.ts
export const fetchData = () => "Real Data";

// src/__mocks__/myModule.ts
export const fetchData = () => "Mocked Data";

// src/__tests__/myModule.test.ts
import { expect, test, vi } from "vitest";
import { fetchData } from "../myModule";

vi.mock("../myModule");

test("uses mocked fetchData", () => {
  expect(fetchData()).toBe("Mocked Data");
});
When Vitest encounters vi.mock("../myModule"), it automatically uses the __mocks__/myModule.ts file instead. This mock can then be reused across different test files.
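Beyond mocks, the same reuse applies to test data. A shared builder function keeps fixtures consistent across the suite while each test spells out only the fields it cares about. The User shape below is hypothetical, invented for illustration:

```typescript
// Hypothetical test-data builder; the User type is not from the
// original codebase. Defaults keep tests short, overrides keep
// each test explicit about what actually matters.
type User = { id: number; name: string; isAdmin: boolean };

export const buildUser = (overrides: Partial<User> = {}): User => ({
  id: 1,
  name: "Jane Doe",
  isAdmin: false,
  ...overrides,
});

// In a test, only the relevant field is spelled out:
const admin = buildUser({ isAdmin: true });
```

Put such builders next to the shared mocks so the whole team discovers and reuses them instead of hand-writing fixtures in every file.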
Use an AI friend
AI tools have quickly taken an important place in the lives of software engineers. And they can be very powerful when it comes to testing.
While I do not recommend making the AI write the tests for you, I do believe that it can still save you a lot of time and effort:
- Ask for a list of positive and negative tests based on a feature or a function.
- Make it generate fake objects for mocks to support your tests.
That would give you a solid backbone to get started and enough confidence that your tests are making the app more robust.
Don't waste your time
As a rule of thumb, always remember that if a test takes too long to write, you are probably trying to cover a function that does too much. Writing tests is like writing code: SOLID principles and best practices still apply. If you write clear and small pieces of code, your tests will also be small and intelligible.
It is also easy to end up with a test that is itself hard to maintain or hard to understand. The cause is probably the same: the code is too complicated and intricate. In both scenarios, you need to roll up your sleeves and start refactoring!
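Conversely, when a function does exactly one thing, exhaustive coverage costs almost nothing: one check per branch. The applyDiscount function below is a hypothetical sketch, not code from the project:

```typescript
// Hypothetical example: a small, single-purpose function is
// trivial to cover exhaustively — one assertion per branch.
export const applyDiscount = (price: number, percent: number): number => {
  if (percent < 0 || percent > 100) {
    throw new Error("percent must be between 0 and 100");
  }
  return price - (price * percent) / 100;
};
```

Three tests (a normal discount, a zero discount, and an invalid percentage) cover every line and every branch, and each fits on a single line.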
Learnings
As I mentioned earlier, I was quite sceptical about the need to cover everything. But now I can proudly say it: I was wrong. Not only would I do it again, but I will also advocate for it.
Over time, I became faster and faster at writing tests. Every day, I had to read the documentation a bit more, discovering easier and faster ways to use the testing framework. The use cases became more intricate, forcing me to use all the features of Vitest, from mocks to spies and more, as well as those of React Testing Library. Over time, testing more advanced components with clicks, API calls, asynchronous methods, loading effects, etc., became easier and more natural.
With the codebase growing larger, I had to refactor the tests multiple times. Originally, tests sat beside their source files, but over time I moved them into a dedicated tests folder. I also created feature-specific files where the original test file had grown too long, and considered moving all tests to a tests/ folder next to src/, a structure closer to what you would find in a Java codebase.
The mindset rapidly changed, going from “How can we implement this?” to “How can we make sure it works according to the specifications? What do we need to check?”. This had a massive impact on velocity. The entire team started making smaller changes and features. Pull requests went from an average of 15 files or more down to at most 7-10.
Reviewing the features was also easier: looking at the scenarios tested was often enough to understand if the feature was doing its job.
After a while, adding tests just became automatic. We would implement a new feature and test it at the same time. And while approaching the deadline, we even started to have a Test-Driven Development mindset.
In conclusion, I realised that what makes the difference is how well you know your testing framework. Testing is not "the cherry on the cake"; it is part of the job. Hence, you need to master your testing tools as much as you master your development framework. And the fastest way to do that is by using them every day, learning every day, and writing a lot of tests. So, next time your team asks how much coverage is expected, tell them to cover everything. I believe juniors especially can learn a lot from this experience, while mid-level engineers can provide help and support. And when it becomes a reflex rather than extra work, lower the threshold. This experience has also been a good way to introduce the team to Test-Driven Development.