They can always count on me to do the right thing.
That statement was made to me by another tester while we were sitting over coffee a while ago. It came up as he described problems in his shop with test planning and design - and, of course, with the testers trying to follow the detailed steps, struggling with them, getting confused, and getting dragged farther and farther down... OK - whew.
Let me see if I can sort it out a little.
The testing load is getting heavier - much heavier, by the sound of it. The number of testers is constant. The number of projects is going up. The demand for documenting what the tests do is increasing, and more people are looking for answers.
Massively frustrating for him. His approach was to aggressively test the entire application. No matter how little or how much impact the project had, the application needed to be tested. Even when replacing one version of a third-party app with a newer version of the same app, both would need to be aggressively tested.
The thing is, I can relate to the problem.
When you are looking at what software testing "is," you may want to look it up on Wikipedia or use your favorite search engine to see what you can learn. You will likely come up with a whole pile of contradictory statements and definitions.
Then there is what these things mean. Testing validates requirements are met. Testing validates that the functional aspects of the software are correct. Testing ensures the correctness of the software.
The Catch is that doing these things boils down to "Test Everything."
We can't Test Everything. We can sort out some things that need to be tested. To do that, we must consider something else.
We must consider the context of the project. We must consider the mission the project team needs. We must consider what it is that we can do to provide meaningful information. We must consider how we can support the decision making process around the needs of the project.
To do this, we must not take the same approach to every project. We must not take the same rules into every project.
We must, instead, consider what we can do to provide meaningful information to the project. We must consider the scope of the project. What is it that is needed? What is it that people want to learn?
Then he said it again, with more emphasis: They can always count on me to do the right thing.
I asked if the right thing ever varied. Is it possible that the right thing can be different from one project to the other?
He glared at me and informed me that I clearly did not understand what testing is. He left.
Clearly, by the model he is using, I do not.
I also had to pay the bill for the coffee.
Tuesday, May 28, 2013
Monday, May 13, 2013
To This Testing
I had an interesting conversation recently. A fellow tester type made a comment to the effect that "Without knowing what your testing is going to do, what all the steps that need to be done are and what to expect in each step, you're just doing ad hoc testing and what's the point of that?"
I sipped my coffee and said "I suggest all testing should be ad hoc. Otherwise things are rather pointless, no?"
That probably was not the right thing to say in that situation. Judging by the reaction, it caused more angst than was needed. Having said that, my dear, long-departed Latin teacher would have approved of the response. She had this thing about people using language incorrectly, particularly Latin.
My concern, at least at that point in the conversation, is that people use words based on their understanding of the meaning, typically acquired through common usage. Unfortunately, they don't know what the words really mean.
I know. I am arguing against custom. I am arguing against the conventional wisdom - the bit that "everyone knows" what something means.
That is precisely the problem.
Everyone thinks they know what something is or what it means and yet no one takes the time to actually find out. The sad thing is that I find this not unusual.
I find it not unusual among testers, nor among other software folks in general. I find it not unusual among other professions either. Alas, it would seem there is a vast conspiracy to pretend people are communicating: repackaging phrases they have heard and, without understanding their meaning, attaching new meanings that suit their own purpose and intent.
Meanings get watered down. Things go pear-shaped. And people are astounded when they do, because surely the communication was always clear.
Sorry. Rubbish.
Passing buzzwords off as meaningful communication only works if everyone knows what those terms mean. It is when the terms begin to shift in how they are used, or when they begin to be misused, that their value declines.
And that is how ad hoc stopped having meaning.
The point of ad hoc anything, in much of the world, is along the lines of an ad hoc committee - a committee formed to deal with a specific purpose or end. It can also be an impromptu occasion. In that case, impromptu refers to the organization and planning of the committee itself - not to how it functions - something totally lost on people unfamiliar with Robert's Rules of Order and other parliamentary procedure guides.
Good Testing Is Always To This Testing
Good testing serves a specific purpose. It is directed to a specific end.
Lousy testing is the testing that warranted the "Test is Dead" stuff a year or two ago. The thing that is so wrong is that this is precisely the "testing" so many people hold up as superior work.
It is carefully considered. It is carefully measured. It is carefully scripted. And then the scripts are very carefully followed. It has stuff that is easily counted. Sometimes it is executed by the people who developed these carefully counted scripts. Sometimes it is executed by other people. Sometimes those people get in trouble if they do not follow the script precisely.
We want to believe that this is a dying trend. We want to believe that this is on the way out. We want to believe that we have moved beyond this. And we want to believe what we read and see in the cool-kid hang-outs, in the online forums, in the Agile and Lean discussion lists, and in the coolness that gets talked about at conferences.
And then someone says "But that is just ad hoc testing."
Rubbish.
When testing does what testing is supposed to do, it fulfills a specific purpose. That purpose will change based on the mission the testing is expected to fulfill.
Good testing addresses that mission.
When testing is done by people who are encouraged and expected to think, it is more likely to fulfill that mission, that purpose, than when "testing" consists of following detailed scripts as written, without variation.
If that is what you consider testing, you should probably have studied Latin.