Referring more to smaller places like my own - few hundred employees with ~20 person IT team (~10 developers).

I’ve read enough about testing that it seems like industry standard. But whenever I talk to coworkers and my EM, it’s generally, “That would be nice, but it’s not practical for our size, and the business wouldn’t allow us to slow down for that.” We have ~5 manual testers, so things aren’t considered “untested”, but issues still frequently slip through. It’s insurance software, so at least bugs aren’t killing people, but our quality still freaks me out a bit.

I try to write automated tests for my own code, since it seems valuable, but I avoid it whenever it’s not straightforward. I’ve read books on testing, but they generally feel like either toy examples or far more effort than my company would be willing to spend. Over time I’m wondering if I’m just overly idealistic, and automated testing is more of a FAANG / bigger company thing.

  • Ephera
    7 months ago

    Our standard practice is to introduce a thin layer in front of any I/O code, so that we can mock/simulate that part in tests.

    So, if your database-library has an insert()-function, you’d introduce an interface/trait with an insert()-function, whose default implementation just calls that database-library and nothing else. And then in the test, you stick your assertions behind that trait.
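    A minimal sketch of that pattern in Rust (all names here — `Database`, `MockDb`, `save_user` — are made up for illustration; the real trait would wrap whatever database library you actually use):

    ```rust
    use std::cell::RefCell;

    // The thin layer: business logic only ever sees this trait,
    // never the database library directly.
    trait Database {
        fn insert(&self, table: &str, row: &str);
    }

    // The production implementation would contain nothing but the one
    // call into the real database library, e.g.:
    //
    //     impl Database for RealDb {
    //         fn insert(&self, table: &str, row: &str) {
    //             db_lib::insert(&self.conn, table, row);
    //         }
    //     }

    // Test double: records calls so the test can assert on them.
    struct MockDb {
        inserts: RefCell<Vec<(String, String)>>,
    }

    impl Database for MockDb {
        fn insert(&self, table: &str, row: &str) {
            self.inserts
                .borrow_mut()
                .push((table.to_string(), row.to_string()));
        }
    }

    // Example function under test — it takes the trait, not the library.
    fn save_user(db: &dyn Database, name: &str) {
        db.insert("users", name);
    }

    fn main() {
        let mock = MockDb { inserts: RefCell::new(Vec::new()) };
        save_user(&mock, "alice");
        // The "assertions behind the trait": check what the code tried to persist.
        assert_eq!(
            *mock.inserts.borrow(),
            vec![("users".to_string(), "alice".to_string())]
        );
        println!("ok");
    }
    ```

    The test never touches a real database; it only checks that the code under test asked the thin layer to do the right thing.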

    So, we don’t actually test the interaction with outside systems most of the time, because well:

    • that database-library is tested,
    • the compiler ensures we’re calling that library correctly (assuming no use of a scripting language), and
    • it’s often easier to simulate the behavior of the outside system correctly than to set it up for each test case.

    We do usually aim to get integration tests with all outside systems going, too, to ensure that we’re not completely off the mark with the behavior that we’re simulating, but those are then often reduced to just the happy flow.