Testing, Everybody’s An Expert in Hindsight
Posted by Eric Jacobson on Wednesday, September 05, 2012

I just came from an Escape Review Meeting. Or, as some like to call it, a “Blame Review Meeting”. I can’t help but feel empathy for one of the testers, who felt a bit…blamed.
With each production bug, we ask, “Could we do something to catch bugs of this nature?” The System 1 response is “No, way too difficult to expect a test to have caught it.” But after five minutes of discussion, the System 2 response emerges: “Yes, I can imagine a suite of tests thorough enough to have caught it. We should have tests for all that.” Ouch. This can really start to weigh on the poor tester.
So what’s a tester to do?
- First, consider meekness. As counterintuitive as it seems, I believe defending your test approach is not going to win respect. IMO, there is always room for improvement. People respect those who are open to criticism and new ideas.
- Second, entertain the advice but don’t promise the world. Tell them about the Orange Juice Test (see below).
The Orange Juice Test is from Jerry Weinberg’s book, The Secrets of Consulting. I’ll paraphrase it:
A client asked three different hotels to supply 700 glasses of fresh-squeezed orange juice tomorrow morning, all served at the same time. Hotel #1 said, “There’s no way.” Hotel #2 said, “No problem.” Hotel #3 said, “We can do that, but here’s what it’s going to cost you.” The client didn’t really want orange juice. They picked Hotel #3.
If the team wants you to take on new test responsibilities or coverage areas, there is probably a cost. What are you going to give up? Speed? Other test coverage? Your kids? Make the costs clear, let the team decide, and there should be no additional pain on your part.
Remember, you’re a tester, relax.
I am having the same sort of difficulty at my work. In my performance review I was told to change the font of my e-mail (I don’t know how much my email font impacted my performance). I’ve mentioned that we can improve our testing strategies and test methodologies so that hot fixes would come down (we are doing at least 2 hot fixes every release, and we do releases every month). They argued our test methodologies and test practices are perfectly fine, and that something else is causing the hot fixes. So it’s hard to get our ideas to the table with test leaders and test managers. Please advise.