Several years ago, I was sitting having a quiet drink waiting for a
couple of friends to arrive when a fellow walked over and sat down. Did
not ask if he could, just sat down. Rude bugger.
He was a senior boss type in his own organization. I was... not a boss type.
Still,
he sits down and asks me a question. "I don't know how you deal with
this. I would think you get this all the time. It seems like every few
months I get called into my manager's office to answer questions about
why we do what we do and why we don't do what these "experts" say we
should be doing. It can be anything: metrics that are supposed to
show us where all the problems are, some best practice that
someone is pushing, some tool or other for test management, or some
requirements tracking tool that lets you show what you're testing,
how much testing you're doing, and how good a job of testing you're doing.
I explain to him why that stuff isn't real and how those
things don't actually work. He seems OK with it, then another couple
months and I'm back having the same conversation with him. Don't you
find that frustrating?"
My response was something really non-committal. Something like, "Yup, that can be really frustrating and a huge energy draw."
He felt better after venting, or maybe after getting some confirmation
that it really IS frustrating, and went away - off to hang with the other
manager boss types.
Here's what was running through my mind as I finished my beer.
Maybe the reason why this keeps coming up is in what he said to me.
Your manager, or your manager's manager, or someone up the food chain is looking
for information. If they are not seeing anything they understand or can
report to THEIR manager on, they'll look for what is commonly discussed
among their peers, or in articles or books or webinars or conference
presentations.
They are looking for information they
can use that is presented in a way they and other managers can
understand. Let's be realistic. They don't have time to filter through
18 pages of buzzwords, technical jargon, mumbo-jumbo and falderal. They want a
single-page, bullet-pointed list that summarizes all that rubbish. They
would also probably like some graphic representation that presents
information clearly - and accurately. They don't have time or patience
to sift through a dozen footnotes explaining the graphic.
You may object strongly to some level of manager higher than you being
sucked in by snake-oil salesmen - or whatever other word you prefer for con artists.
Still,
if the con-artists and snake-oil salesmen are presenting them with a
tool or a "solution" or a method that gives them something resembling what
they want and need, that will seem like The Solution to them, no matter
how wrong you think The Solution is.
Then again, maybe the solution people are looking for, the one that looks
right based on their understanding, will work. Maybe, just maybe, people
will land on something that sounds like what "experts" are talking
about. They are looking at the results for "software testing tools" in
their favorite search engine and wondering why their company is not
using one or more of these tools.
Then they enter "best
software testing tools" into the search engine and see MORE results.
And these are for the BEST testing tools. Some are "Manual Testing
Tools," some are "Automated Testing Tools," and when you read the ad copy
on the webpage - they sound AWESOME.
Then they wonder why their company is not using one of these BEST tools.
They read articles online or in a magazine that talk about "testing
everything" because if you don't, bugs might get through. And they read about
how they can have software with ZERO BUGS. And then
there are the articles about choosing the RIGHT things to test, since
there isn't really time to test everything. And then they get confused,
because the stuff in their search results talks about
testing faster and better with these tools - and delivering results that
can be tracked.
So they think about how to track results and how to measure things like
"improvement" and "quality," and whether there is any way to tell if anything
is being done at all, let alone done right, and whether changes, when
implemented, make any difference.
This leads to things like
methodologies, processes, process models, and SCRUM and KANBAN, and how
"Agile" is better than other ways of working and how to be Agile and how
to measure how Agile you are and how to show that being Agile is better
and how to Scale Agile and Disciplined Agile... and... and...
Still, the bosses want to know
why we (the resident "experts" in testing) don't do things like they
read about or hear about in meetings or conferences or training sessions
or podcasts. We can explain how those things are not really helpful and
don't really work - and yet the software being made still sucks. If more
large customers don't like the software and cancel, it is likely
that someday we'll run out of new customers for the software we make.
And if we can't keep more of our customers happy, then we are all
completely screwed.
Why don't managers understand what we are doing and what good testing is?
Why
is it they keep coming back and asking fundamental questions about what
we do, how to evaluate progress, how to look for improvement in
quality, how to track customer satisfaction, how to know the software
works the way the sales people say it works...
Why is that?
When managers, bosses, whatever, are looking for help to make things happen -
to make things better or to find some sense of progress - how do we respond?
Do we scoff openly at them and say "that will never work"? Personally, I
find it unwise to scoff openly at managers and directors and VPs of
whatever, but your experience may be different than mine.
Maybe we say "This is not going to help because..." and explain why it
won't do what they are hoping it will. Perhaps our more sophisticated
thought-leader types might patiently explain what a "good" thing is
(a metric, a tracking tool, or some other tool).
Then maybe give some sage advice like
"You need to decide what you want to learn from this and what you
intend to do with what you learn."
They blink. Maybe
they realize they have no idea what that means. Let's face it - an awful
lot of people have no idea what that statement means.
So far, except for the scoffing part, there is not a lot to object to in my experience.
The problem, and the reason why the questions keep coming back, is we have not provided an example of a good alternative.
If
our method of teaching managers about testing and test management
extends only as far as what won't work and why things are a bad idea,
then we have greater issues.
We have only done half of what we need to do.
If a tool will not do what is needed, is there an alternative? Maybe there will need to be more than one tool working in parallel.
If
they are trying to discover something about quality of the software, do
we suggest paths to discover what is needed? Do we offer to help them
with this and work on finding the solution together?
In short, if all we do is tell them something won't work, we are not doing our job.
We have no grounds to complain if we have not worked hard to provide viable alternatives they can understand.
Maybe the great gulf and obstacle to understanding is more simply put.
People (like Managers, Developers, Product Owners, Business Analysts, Project Managers, Scrum Masters and Testers) have no shared concept of what testing itself is.
Can we blame them if they do not understand what Good Testing is?
Sunday, November 18, 2018