Ask HN: How much emphasis to put on unit testing and when?

Posted by theturtlemoves 2 days ago

I'm wondering if a shift has occurred. When I started as a junior software engineer, over a decade ago, I learned about unit testing, integration testing, system testing. The whole codebase we worked on was thoroughly unit tested, and had layers of integration tests and system tests as well. I've worked for other employers since and in some cases any kind of automated testing was completely absent. Still, the message I got when reading and keeping up with best practices was: unit test ALL the things!

I've personally found that when the architecture of the system is not mature yet, unit tests can get in the way. Terribly so. Integration tests or system tests that assert behavior seem like the better starting point in this and other scenarios, including when there are no tests at all yet.

I've recently read a statement about letting go of a strict "unit test everything" mindset and going for integration tests instead. I'm thinking it probably depends, as with everything, on the type of system you're working on, the maturity of the system, the engineers' experience with automated testing, etc.

I'd be interested to learn when each type of testing helps you and when it gets in the way (and what it gets in the way of).

Comments

Comment by jackfranklyn 2 days ago

My rule of thumb: unit test logic, integration test plumbing.

If a function does actual computation or has branching logic that could go wrong in subtle ways, unit test it. If it's mostly wiring things together (fetch data, transform, save), an integration test catches more real bugs.
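
Rough sketch of that split, in Python with pytest (all names made up): the discount rule is logic and gets a unit test; the sync function is plumbing and is better covered by an integration test against a real or in-memory database.

    # Pure logic: worth unit testing because of the subtle branching.
    def apply_discount(price: float, tier: str) -> float:
        if tier == "gold":
            return round(price * 0.8, 2)
        if tier == "silver":
            return round(price * 0.9, 2)
        return price

    def test_apply_discount():
        assert apply_discount(100.0, "gold") == 80.0
        assert apply_discount(100.0, "silver") == 90.0
        assert apply_discount(100.0, "unknown") == 100.0
        assert apply_discount(0.0, "gold") == 0.0

    # Mostly plumbing: fetch, transform, save. Mocking api and db here just
    # restates the implementation; an integration test catches the real bugs.
    def sync_orders(api, db):
        for order in api.fetch_orders():
            db.save_price(order["id"], apply_discount(order["price"], order["tier"]))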

The "unit test everything" era came from a time when spinning up test databases was painful. Now with Docker and in-memory DBs, integration tests are often faster to write AND more useful.

Where unit tests still win: algorithmic code, parsing, anything with lots of edge cases. Writing unit tests forces you to think about boundaries. Integration tests just tell you "it worked this time."

The worst outcome is having tests that make refactoring painful but don't catch real bugs. I've seen codebases where every internal method signature change breaks 50 unit tests that were essentially testing implementation details.
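
Hypothetical example of that failure mode (Python, pytest): the first test below is pinned to an internal helper and breaks on any rename or restructuring; the second asserts observable behavior at the public interface and survives the refactor.

    from unittest.mock import patch

    class FakeOrderRepo:
        def __init__(self, rows):
            self._rows = rows

        def rows_for_year(self, year):
            return self._rows

    class ReportService:
        def __init__(self, repo):
            self.repo = repo

        def monthly_totals(self, year):
            return self._sum_by_month(self.repo.rows_for_year(year))

        def _sum_by_month(self, rows):  # internal detail, free to change
            totals = {}
            for month, amount in rows:
                totals[month] = totals.get(month, 0) + amount
            return totals

    # Brittle: couples the test to the helper's name and call shape.
    def test_sums_via_internal_helper():
        repo = FakeOrderRepo([("2024-01", 100)])
        with patch.object(ReportService, "_sum_by_month", return_value={}) as helper:
            ReportService(repo).monthly_totals(2024)
            helper.assert_called_once_with([("2024-01", 100)])

    # Durable: only the public result matters, not how it was computed.
    def test_monthly_totals():
        repo = FakeOrderRepo([("2024-01", 100), ("2024-01", 50), ("2024-02", 25)])
        assert ReportService(repo).monthly_totals(2024) == {"2024-01": 150, "2024-02": 25}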

Comment by rvz 2 days ago

The moment the software is in production, making a lot of money, and is stable, I add lots of tests (both unit and integration) to prevent a $1,000 issue from turning into a $100,000 problem later down the line.

That beats testing everything to perfection, chasing 100% coverage, never releasing, and having the company question why the project's deadlines were missed because of testing dogma.

> I've personally found that when the architecture of the system is not mature yet, unit tests can get in the way. Terribly so. Integration tests or system tests that assert behavior seem like the better starting point in this and other scenarios, including when there are no tests at all yet.

Totally agree.

Comment by DriftRegion 2 days ago

Writing unit tests is a futile exercise without a specification.

The software under test is always modeling something -- business logic, a communications protocol, a control algorithm, a standard, etc. Behind each of those things is a specification. If a specification doesn't exist, then the software is called a prototype. For sustained, long-term incremental development, a specification must exist.

The purpose of unit tests is to assert specification-defined invariants at the module interface level.

Unit tests are durable iff the specification they uphold is explicit and accessible to developers, and the scope of the test is small. It's futile to write good tests for a module whose utility is ambiguous.
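
To make that concrete, a toy sketch (hypothetical framing module, Python/pytest): the imagined spec says every frame round-trips and a corrupted frame must be rejected, so the tests assert exactly those invariants at the module interface and say nothing about the internals.

    import struct
    import zlib

    import pytest

    # Toy framing module: length-prefixed payload followed by a CRC32 trailer.
    def encode(payload: bytes) -> bytes:
        return struct.pack(">I", len(payload)) + payload + struct.pack(">I", zlib.crc32(payload))

    def decode(frame: bytes) -> bytes:
        (length,) = struct.unpack(">I", frame[:4])
        payload = frame[4:4 + length]
        (crc,) = struct.unpack(">I", frame[4 + length:])
        if zlib.crc32(payload) != crc:
            raise ValueError("bad checksum")
        return payload

    # Invariant from the spec: decode(encode(p)) == p for any payload.
    def test_round_trip():
        for payload in (b"", b"x", b"hello world" * 100):
            assert decode(encode(payload)) == payload

    # Invariant from the spec: a corrupted frame must not decode silently.
    def test_corruption_is_rejected():
        frame = bytearray(encode(b"hello"))
        frame[5] ^= 0xFF  # flip bits in one payload byte
        with pytest.raises(ValueError):
            decode(bytes(frame))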

priors: I worked in embedded SW and am now a PhD student.

Comment by 9rx 2 days ago

Tests are the specification.

That they also happen to be executable serves only to automatically ensure that the program actually conforms to the specification.

Comment by apothegm 2 days ago

Unit tests are super useful when first writing a function or class to confirm it does what you think it does.

Then throw the unit tests away and write integration or E2E tests instead, so you can refactor under the hood while ensuring overall system behavior stays as expected.

There are some exceptions where you might want to hang on to a small subset of unit tests. They can be useful for demonstrating how to use an interface or class. They can help support particularly complex bits of logic. If a certain part of your codebase is fragile and regression prone, unit test coverage can help.

Otherwise, they just calcify the code.

Comment by what 1 day ago

Your unit tests shouldn’t break from a refactor, unless you’re testing the implementation instead of the behavior, which would be dumb.

Comment by apothegm 1 day ago

A refactor often changes boundaries, which breaks unit tests. It's integration tests that are truly agnostic to implementation.

Comment by 9rx 2 days ago

Testing, which is better known as documentation, is the contract for users: it defines what your program is for and how the user can depend on it forever into the future. The test runner is not the focus; the focus is the effort to validate that what the contract says is actually true.

Test what the user will use. That could mean an end-user interface, or it could mean a public API. Again, the goal is for the user to understand what you are trying to accomplish for them and to have guarantees about what you won't change in the future.

Comment by muzani 1 day ago

Read Test-Driven Development: By Example by Kent Beck. It's the original. It's very sensible. There's no requirement to get 100% coverage. It's meant to make things easier.

Comment by hahahahhaah 2 days ago

It is a matter of horses for courses. Be driven by what the tests cost and what they accomplish.

Costs may be:

* Developer time to make and maintain.

* CI time: more tests mean slower CI runs and higher costs there.

* Ossification of source code (especially with unit tests, less so integration), meaning it's harder to refactor because you also need to rewrite tests.

Benefits:

* Finds bugs

* Can be a handy local dev loop and local debugging loop

* Documents code and proves that documentation is correct

* Helps with AI assistance

* Integration tests should make refactoring easier and give you more confidence while doing it.

I would err on the side of good coverage (80%, excluding stupid stuff) unless I have a specific reason not to.

Comment by karmakaze 2 days ago

There's another aspect of unit testing. It makes the units testable. The greatest benefit of this is that the units tend to be more coherent. A large blob of code that isn't unit tested may not have clear boundaries or a functional raison d'etre. Tests also serve as documentation or demos of the units which is great for onboarding devs later on.

Maybe AI analysis/synthesis will change the math on this, but beyond early prototypes and PoC's, tests pay for themselves.

Comment by commandersaki 2 days ago

My opinion: use unit tests where you can easily craft inputs and check outputs.

If that becomes hard or you find yourself mocking a lot, then stop, and instead write integration and e2e tests.
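
A sketch of what I mean (made-up parser, pytest): inputs and expected outputs are trivial to write down, so a unit test is a natural fit. The moment a function needs a database handle and three mocks just to be called, I'd cover it with an integration test instead.

    import pytest

    def parse_duration(text: str) -> int:
        """Parse strings like '1h30m' or '45s' into seconds."""
        units = {"h": 3600, "m": 60, "s": 1}
        total, digits = 0, ""
        for ch in text:
            if ch.isdigit():
                digits += ch
            elif ch in units and digits:
                total += int(digits) * units[ch]
                digits = ""
            else:
                raise ValueError(f"bad duration: {text!r}")
        if digits:  # trailing number with no unit
            raise ValueError(f"bad duration: {text!r}")
        return total

    @pytest.mark.parametrize("text,expected", [
        ("1h30m", 5400),
        ("45s", 45),
        ("2h", 7200),
    ])
    def test_parse_duration(text, expected):
        assert parse_duration(text) == expected

    def test_rejects_garbage():
        with pytest.raises(ValueError):
            parse_duration("soon")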

Remember unit tests get tightly coupled to your code base, so use them wisely.

Comment by bjourne 2 days ago

The most important tests are those that catch bugs. If your system crashes, you write a test that fails until you fix the bug. Don't punt on it because "it's hard to test" (code smell). Tests that never fail are of minuscule value.
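
Something like this (made-up example): the test reproduces the crash, fails until the fix lands, and then stays in the suite as a regression guard.

    # Production crashed on an order with zero items (ZeroDivisionError).
    def average_item_price(order: dict) -> float:
        items = order.get("items", [])
        if not items:  # <- the fix; the test below failed before this line existed
            return 0.0
        return sum(item["price"] for item in items) / len(items)

    def test_empty_order_does_not_crash():
        assert average_item_price({"items": []}) == 0.0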

Comment by aurareturn 2 days ago

I don't write any unit tests. Instead, I only do integration/system tests.

At the end of the day, I need to know that the system works and does what it is supposed to do. Unit tests add too much complexity in my opinion and aren't worth it.

Comment by 9rx 2 days ago

Kent Beck, credited with coining the term "unit test", defines unit tests as tests that run without affecting other tests. In other words, unit tests are tests that do not introduce side effects. I suggest that if you are writing anything other than unit tests, as originally defined, in this day and age, you are doing something horribly wrong. Side effects are how you end up with tests that randomly break, which is a nightmare for those running them. A good software steward would not deem that acceptable.
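
A toy illustration of the difference (made-up tests, pytest): the first pair shares module-level state, so which one fails depends on execution order; the isolated version behaves the same in any order, or in parallel.

    # Order-dependent: both tests mutate shared module-level state.
    registry = {}

    def test_register_alice_shared():
        registry["alice"] = 1
        assert len(registry) == 1  # fails if the other test already ran

    def test_register_bob_shared():
        registry["bob"] = 1
        assert len(registry) == 1  # same problem in the other order

    # Isolated, "unit" in Beck's sense: each test owns its own state,
    # so the suite gives the same result in any order.
    def test_register_alice_isolated():
        local_registry = {}
        local_registry["alice"] = 1
        assert len(local_registry) == 1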

I've seen some other definitions out there, but they didn't make any sense. Obviously someone was trolling in those cases. Presumably you were thinking of one of them? Absolutely you wouldn't write tests that don't make any sense. I am not sure why anyone would.

Comment by RaftPeople 1 day ago

> Kent Beck, credited with coining the term "unit test"

The term "unit test" has been used since the 1960's (if not earlier).

Comment by 9rx 1 day ago

Definitely not. The Art of Software Testing wrote about "module testing" in 1979; that was revised to "unit testing" in a later edition, after "unit test" had become part of the popular lexicon. Perhaps that is what you are thinking of?

Beck is clear that he "rediscovered" the idea. People most certainly understood the importance of tests being isolated from each other in the 1960s. Is that, maybe, what you mean?

Comment by brudgers 1 day ago

Unit tests and YAGNI came out of Extreme Programming. Of the two, YAGNI is usually the more important because Extreme Programming is about doing the simplest thing possible. Good luck.