These options are flawed. I'm somewhere in the middle of the first two: mostly integration tests, with the critical domain logic unit tested. Certainly not 100% of the app's functionality; closer to 80%.
I agree. This poll forces me to choose between a test suite that covers "all functionality" and one that covers "a few critical things". I think a lot of people who value high test coverage still fall somewhat short of all functionality, but are way above "a few critical things".
I'm using Rails these days, and I have 100% test coverage on models and controllers (though that really just means all the model and controller code is executed when I run my tests; these tools can't tell whether you've tested the code intelligently, though I hope I have).
I don't have a full suite of integration tests that validates all of the view logic, though there are some checks. I also have integration tests that validate external dependencies (file storage, database connectivity, etc.), though again, there may be some holes.
I picked "all", since that's closest to where I am. But my best choice would have been "we maintain a high (95%+) level of test coverage". I don't think I'm splitting hairs here, because there may be a practical tradeoff between high levels and complete levels of test coverage.
NOTE: "high" levels of testing can mean different things to different people. It doesn't have to be 95%, which I'd consider higher than absolutely necessary. It depends so much on what you're actually testing: anyone who has used a coverage tool knows you can often "trick" the tool into awarding the 100% bar just by making sure the tests run the code, which is useful in its way but can let all kinds of errors slip through.
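A minimal sketch of how that trick works, using a made-up `total_price` method (not from any real app): a test that merely calls the method executes every line, so a coverage tool reports 100%, yet it asserts nothing and the bug survives.

```ruby
# Hypothetical method with a subtle bug: the stray "+ qty" inflates the total.
def total_price(unit_price, qty)
  unit_price * qty + qty
end

# "Coverage-only" test: this executes every line of total_price, so the
# coverage tool awards 100%, but it checks nothing about the result.
total_price(10, 3)

# A behavioral test compares against an expected value and catches the bug.
bug_caught = total_price(10, 3) != 30
puts bug_caught ? "behavioral test caught the bug" : "no bug detected"
```

Both "tests" produce identical coverage numbers; only the second one can ever fail.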
I had the same thought, Obie. I find high-level integration tests provide most of the value for me, with unit testing when I need help designing code. Having a decent suite of high-level tests saves me from having to smoke test the entire app every time I make sweeping changes. If the suite is passing, I know the features are working, at least in the basic cases I was testing for. I still have to do some manual testing, but it's nowhere near as much as I did before I became more obsessed with testing.
Interesting... I think I'm with you on this one. There have been a few occasions where I got my test coverage through integration tests rather than unit or functional tests as well.
My real goal is to have tests that sound the alarm if I've done something that breaks the application. I think this is similar to the "smoke test" you're talking about. I don't want to have to fire up the server and walk through all the use cases; it's very useful to have integration tests that do this instead.
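The spirit of that alarm-bell suite can be sketched without Rails at all. This is a toy in-process "app" (routes and handlers are invented for illustration, not any real API): one happy-path check per major use case, so a sweeping change that breaks a flow turns the suite red instead of requiring a manual click-through.

```ruby
# Hypothetical in-process "app": each route maps to a handler
# returning [status, body], standing in for real controller actions.
APP = {
  "/"        => ->(_params) { [200, "home"] },
  "/login"   => ->(params)  { params[:user] ? [200, "welcome"] : [401, "denied"] },
  "/reports" => ->(_params) { [200, "reports"] }
}.freeze

def get(path, params = {})
  handler = APP.fetch(path) { ->(_p) { [404, "not found"] } }
  handler.call(params)
end

# Smoke suite: one basic check per use case. It won't catch edge cases,
# but if it passes, the main flows are at least wired up and responding.
failures = []
failures << "/"        unless get("/").first == 200
failures << "/login"   unless get("/login", user: "obie").first == 200
failures << "/reports" unless get("/reports").first == 200

puts failures.empty? ? "smoke suite green" : "broken: #{failures.join(', ')}"
```

In a real Rails app the same role is usually played by request or system tests hitting the top few user journeys.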
Agreed. It would have been great if manual testing had been included. We have full-time QA people who write very detailed test plans based on project specs/requirements, and all our projects include time for testing and bug fixing at the end.