When I first began getting paid to test software, I was a little confused. I knew it was invigorating to catch scary bugs before they reached production, but I wasn't really sure how valuable my job was to my dev team or the rest of the world! In fact, I didn't really know if testing software was anything to make a career out of in the first place.
A few years ago I came across Harry Robinson's Bumper Stickers for Testers post on StickyMinds.com. It was at that point that I decided my job as a software tester was valuable (and even a little cool). Harry and all the other software testers who contributed the excellent material on that post inspired me to take pride in my job, and now I even sport a couple of bumper stickers to show it (see below). If you test software, I encourage you to do the same.
Allen, one of the developers on my team, recently told me about some poorly written bugs he had received from a tester. Allen said,
"Basically these bugs describe the application behavior as a result of user actions working with an application, without any explicit identification of the buggy behavior. This tester wrote many, many bugs with one or two sentences. His bug description pattern was like this:
When I do A, it does B.
Examples:
- When I try to move the video object, the background image stays still. (OK. Where is the bug?)
- When I set the video file to a path located in another course, the file is copied to the current course when rendering the page in IE browser. (To some, this may be the correct behavior. So where is the bug?)
This is an ambiguous way to describe a bug. It frustrates me!"
About a year ago, Allen offered similar candor about several of my bugs. Since then, I have made it a point to force the following two statements into every bug I log.
Expected Results:
Actual Results:
No, it’s not an original idea. But it is an excellent idea that makes it nearly impossible to ever log another ambiguous bug. If the tester had used it for the first of Allen’s examples above, we might see the following.
Expected Results: The background image moves with the video object.
Actual Results: The background image does not move with the video object. Instead, the background image stays still.
Expected Results/Actual Results. Don't log a bug without these!
I have a recurring dream when I get sick. In the dream I’m tasked with riding an elevator up and down to various floors. On each floor I encounter a bunch of numbers bouncing around, acting out of control. Part of my task is to make sense of these numbers on each floor. Like, maybe put them in order…I’m never sure. However, I usually give up and ride the elevator to another random floor and try again. The dream just loops through this sequence. It sounds silly, but during the dream it is quite scary and, for some reason, unstoppable.
What does this have to do with testing? The state of our project feels so chaotic that I keep getting flashes of this dream. My days are filled with emails and people stopping by my cube bearing error messages accompanied by vague repro steps, or none at all. Each email is critical, and the next seems to preempt the previous. The number of tests I’ve actually executed myself over the past month has been minimal. Instead, I’ve been attempting to determine what other people are doing. The email threads typically get hijacked by people tossing in unrelated problems, and I can often spot people misunderstanding each other because each thread becomes more ambiguous than the last. These threads contain bugs that get lost because nobody can figure out enough info to log them. Ahhhhhhh! So I get back on the elevator and see if I can make sense out of the next floor.
IMHO, much of this chaos could be avoided if people would log the bug, no matter how little information is known. In an extreme case, I still believe it would be valuable to log the bug even if all you have is a crummy screen capture of an error. Something like:
Bug#20074 - “No repro steps but someone got this error once…”
The next time this error is encountered, we have something to compare it with. “Hey, this is the same error as Bug#20074. Did we notice any clues this time? No? Well, we can at least update the bug to indicate we saw the error again in the next build and that someone else got it.” The emails referring to this problem can say “This may be Bug#20074.” And so on. Once we have a bug, no matter how hollow it is, the problem becomes more than someone’s sighting of Bigfoot. It becomes a problem we can actually collect information against in an organized manner. And hopefully, I can stop riding the elevator.