Monday, February 17, 2020

The Automated Testing Trap

A brief sidebar at the latest local Tester meetup (#GRTesters) caused a certain amount of thinking and remembering. This is the result.

Before venturing back into contract work, I was working for a large company where bosses had very definite ideas about "testing" and "automation" and "code." Some of them were quite reasonable. Some of them resulted in me, and others, pushing back really, really hard on their assertions.

I got responses like "Have you read the book {big boss} talked about? If you have, you would understand his message." Yeah, except I knew the authors of that book. We had presented at the same conferences and had taken each other's workshops. I also knew the people whose work theirs was built upon, whom they cited. So, I'm sorry, I still have no idea what that message means.

I got pretty well ignored after that. Then I read the "new" job descriptions for people doing what I did. Oh my.

I pushed back really hard on those - You SAY you want certain skills, and somehow you ignore the skills needed to make use of the skills you say are required. I think there is a disconnect.

The Problem of Automation

To begin, I am a fan of using automation for testing. I want to use the best possible tools to ensure the best possible outcome. I want to use tools that help me do good work.

I see more and more positions for "Automation Testers" that focus on the primary development language, characteristics of the stack, tool set, Git repository and on and on. Many of these read like the job descriptions I was speaking out about a few years ago. I have kind of been pushing hard against these types of characterizations for some time. (I think that is why some folks consider me to be "anti-automation.")

Please, don't misunderstand me. Those things can be important, in some contexts for some organizations.

To paraphrase the Wendy's commercial from 1984, "Where's the testing?"

By framing job descriptions, notices, and search terms in the easy-to-understand structure used for developer roles, you create a disconnect. By focusing on development skills at the cost of all other skills, for example actual skill and experience in testing, you limit the role you seek to fill.

More importantly, you put the product, and your organization, at risk - if not in mortal danger.

On Situational Awareness

After a bit of consideration, I had a thought. I realize it does not fit ALL tech-type managers. Still, many I have met clearly have a mindset something like THIS when it comes to testing. It is reflected throughout development teams.

I don't place the blame for this on the "code camps" many developers are coming from. I do not place this on the colleges and universities where many more have come from and continue to come from.

Instead, I think it is a failure on many levels to understand what Software Development as a whole is. I think it is a complete failure in "situational awareness." People are not aware of what the full picture is. They are focused on their little corner of it, if that.

The failure, as Sherlock Holmes would put it, is "You see, but you do not observe."

Part of this is failing to recognize there are many forms and types of "Automation." A typical CI environment, running low-level tests that serve as baseline checks for core functionality, is one aspect. Building meaningful regression tests that exercise software functions to cover known behaviors is another. Adding to, and removing from, both of these test suites takes time and careful consideration.
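
The distinction can be shown with a small sketch. Everything here is hypothetical - `service_health` and `apply_discount` are stand-ins for a real product's entry point and a real business rule, not any particular codebase:

```python
# A minimal sketch, using plain assertions, of the two kinds of checks
# described above. The functions are hypothetical stand-ins.

def service_health():
    """Stand-in for a real health-check call; a CI smoke test would hit
    an actual endpoint or entry point here."""
    return {"status": "up", "version": "1.4.2"}

def apply_discount(price, percent):
    """Stand-in for a core business rule covered by regression tests."""
    return round(price * (1 - percent / 100), 2)

def test_smoke_service_is_up():
    # Baseline check: does the product start and answer at all?
    assert service_health()["status"] == "up"

def test_regression_known_discount_behavior():
    # Regression check: a known, previously verified behavior still holds.
    assert apply_discount(100.00, 15) == 85.00
    assert apply_discount(19.99, 0) == 19.99

if __name__ == "__main__":
    test_smoke_service_is_up()
    test_regression_known_discount_behavior()
```

The first check only tells you the lights are on; the second encodes a behavior someone decided was worth protecting. Deciding what belongs in each suite, and when something should come out, is exactly the testing judgment the job descriptions skip over.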

Discounting the need for experience in testing over experience writing code is a disaster waiting to happen.

On Testing

Good Testing is not a box to be checked.
Good Testing takes careful, considered work.
Good Testing is crucial to understanding how your software product behaves before your customer gets their hands on it.

The issue I see in many, many organizations is most "leaders" of the software organization have no idea what Good Testing is, nor what is involved in achieving that. I wrote a series of blog posts on this some time ago.

Start here: You Call THAT Testing?
Then go here: The Failure of Testers
Then go here: The Failure of Management
And then go here: Testers Rising to the Challenge

Finally, there is this: Why Don't Managers Understand What Good Testing Is?

I have spent the last many, many years trying to help people understand that good, thinking testing cannot be separated from good software development. Apparently it is easier to accept the glossy-paper snake oil people sell while sipping craft beer and artisanal cocktails.

"Automation" will not help your company, team or project. Automation driven by informed planning, driven by people trained in testing can likely help a great deal.

Sunday, February 16, 2020

Encouraging, Motivating & Cajoling: Getting people to do the job...

The local testing meetup (#GRTesters) I'm part of had an interesting discussion this past Thursday (13 Feb, 2020). We do round-table discussions a fair amount of the time, which allows a reasonably free exchange of ideas - sometimes helped by locally made wine and beer, and sometimes a nice Toscana not made locally. This is based on my notes from the conversation.

The official topic of conversation was the title of this post. It came about from a conversation a few months ago around a rather vague, but troubling prospect.

How do you get people to do a job they were "voluntold" to do, which they really don't want to do at all? People walk into a meeting with an uncertain subject and find out that for the next 6 months, or more, they will be "helping out" on a "special project" that is "really important" to the company. It will take a lot of work, probably some extra hours, to get this stuff done.

Oh, by the way, you also need to get your other work done on time, too. OK?

Perfect organizations don't have this issue. It seems most of us don't work in perfect organizations. Threats and intimidation tend to be counterproductive. When people are put into a position they don't really want, how can we get good work done and keep some form of harmony in the working group? Is that even possible?

THAT was the idea behind the discussion.

I was a tad concerned. There were folks from what are considered "high performing" local companies. I had visions of people looking at me as if I had three heads and had pasta stuck in my beard. (We were in a corner of a local Sicilian restaurant.)

Instead, people jumped in. Here is a summary of the discussion, because I found the results to be really interesting and potentially important.

To Start, "Short Term"

A couple of people jumped in with "Maybe not for 6 months, but shorter term things, a week or two, I've seen this work..." ideas.

One idea was something like the company bringing in lunch three days a week for the duration of the effort, which in the cases they had seen ran a week or two. Another, which I've used in the past, was to relax the "dress code" and other typical office goofiness. Jeans whenever, t-shirts instead of "collared shirts," snacks in the project room - ALL THE TIME.

For many companies, these are nothing new and a fair number do this all the time, anyway. For other companies, these little things can help ease the burden a bit.

For a longer effort, one participant (who works for an "all remote" company) said he had seen money help as a short term thing. Something like, the project finishes, quality is OK (by some definition of OK) and the people doing the work, giving up time away from friends and family to make this project happen and get their usual work done as well - THEY get a "piece of the action." They share in the bonus or incentive pay which managers or directors might normally expect.

This sent us down a tangent around "leadership" and what "leaders" actually do. The energy seemed to be around the idea that if people have pretty much any sense of professional reliability, they'll jump in and do the best work they can, without any special effort from bosses. Therefore, instead of the boss getting the money, why not the people who actually made it happen?

Longer Term, ergo, Harder

One idea or term that kept getting bandied about, was "team." We took a bit of time to distinguish between "people who report to the same boss" and "people working together on a common effort."

If a group is really working as a team, at least part of this might be addressed. People pulled from their "real jobs" might get the support of others they normally work with to cover at least part of their non-project work.

The group working on the "special project" needs to actually work as a team, toward a common purpose. If there is a small group "telling people what to do" without explaining the purpose, or without allowing the individuals to learn and understand the purpose behind the work, it is unlikely there will be a meaningful level of success.

The challenge is to encourage, coach and teach people through the learning process and keep them engaged and actively participating. Keeping people engaged takes a couple other things not noted yet...

Making it work

In the end, there are some things everyone agreed on. Among them: for this to work, people need to be (or at least act like) professional, adult workers doing their best possible work for their employer while they work there.

Teams need to be teams and actually work together - as a team - supporting each other, holding each other up, and holding each other accountable to contribute to the best of their abilities to team success.

Without these things, not much else matters. The cool gimmicks and "motivational techniques" won't get you where you need to be. People need to do it.

That includes managers, product owners, product managers, scrum masters, project managers, and on and on, supporting the team in tangible ways, then getting out of their way to allow them to do their best work.

Finally -

Thanks to those who actively participated in the conversation: Sarah, Jace, Keith & Greg. They jumped right in and had no qualms explaining their views.

Saturday, February 15, 2020

But What Does a QA, Tester Guy Know About...

... Product?
... BA work?
... Development?

My, what an interesting few days. I was asked those questions Thursday afternoon and Friday morning. It was an interesting set of discussions. My answer was similar for each of them.

Starting with the easiest one to explain.

Development

There's a secret that people with a specific agenda don't want others to know, I think.

The better someone doing testing or general quality work understands what developers do, the language they work in and the database that holds the data, the better questions they can ask around the software and the better testing they can do.

It seems obvious to me, but I think it isn't so obvious. Let me see if I can explain what I mean.

If a tester has a basic understanding of, say, SQL, and can read and write basic queries, then she can build more targeted test structures and more vigorously explore the application's behavior.
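
As a small illustration, here is the kind of targeted query a tester with basic SQL can write. The schema and data are entirely made up for the example; the point is the question being asked: are there orders pointing at customers that do not exist, something no amount of clicking through the UI would reveal?

```python
# Hypothetical schema, built in an in-memory SQLite database, to show a
# targeted data-integrity query a tester might write.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL);
    INSERT INTO customers VALUES (1, 'Alice'), (2, 'Bob');
    INSERT INTO orders VALUES (10, 1, 25.00), (11, 2, 40.00), (12, 99, 15.00);
""")

# Orders whose customer_id has no matching customer row ("orphans").
orphans = conn.execute("""
    SELECT o.id, o.customer_id
    FROM orders o
    LEFT JOIN customers c ON c.id = o.customer_id
    WHERE c.id IS NULL
""").fetchall()

print(orphans)  # order 12 points at a customer that does not exist
```

A query like this turns a vague worry ("is the data consistent?") into a concrete, repeatable check, which is exactly what "more targeted test structures" means in practice.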

If a tester can at least read through and understand what is happening in the code, she can use her "tester mindset" and check her understanding against the requirements - either in a traditional "Requirements Document" or in notes on the User Story/Story Card.

She can look for discrepancies between her understanding of the requirements, specifications, whatever, and what she sees in the code. This can lead to conversation with the people writing the code and the BA and Product Owner/Product Manager to clarify and make sure everyone has a shared vision of the project.

This is similar to how a good tester can understand what a BA does.

Business Analysis

Testers look at the requirements produced by Business Analysts. Ideally, they participate in discussions where the requirements are discovered and defined. For many organizations this is not the case. (I suspect that goes a long way toward explaining the disbelief around a tester/QA person understanding what a BA does.)

I remember past projects and companies and clients and places where things did not work well. Each time I asked about reviewing the requirements or specifications or user stories, then discussing them with the people who created them, I was looked at with confusion. Like, "Why would you want to do that? Everything is written down here." This was often followed by "No testers have gone back to this stuff before. Why do you want to?"

I learned to control myself.  My automatic reaction typically was to smile to myself and think "And that explains the state of testing for these projects." 

What I actually said was "It is good to know what the documented requirements are.  It is also good to know what existing tests have been set up or run already.  It is essential to know what problem the project is intended to address.  It is crucial to know what people are doing now and what they need the software to do for them."

When looking at how to test something, or how to approach a piece of software, I've learned that the most effective way for me is to gain just that level of understanding. If I can better understand the problem the BA was describing, I can do a better job of testing to make sure the customer's needs, problems and desires are being addressed.

I sometimes get told that a BA will explain everything in the documentation and I should simply leave them alone. Except everyone translates information to fit their view. Everyone filters out the unimportant bits - some of which turn out to be really important.

For that reason, I find conversation is the most powerful tool a tester has to build good, meaningful tests. They need to go beyond the handful of words written down and make sure they have the same perspective the BA was trying to represent.


Product

The purpose of every software organization is to deliver a viable, working product to customers. Every development model, quality program, and testing approach, whether waterfall, Agile, Scrum, Kanban, or XP, aims to deliver the best product possible.

These goals often fail, in my experience, with what I think of as the Edsel.

Anyone drive an Edsel lately? Wait. What? You haven't? I bet your neighbors have one though, right? Well, maybe not.

The Edsel was an amazing piece of engineering, design and marketing. It had absolutely bleeding edge features for the time it was designed and produced. The goal was a "perfect product." The market rejected it anyway.

When working with software projects, I have a really fundamental opposition to trying to advance by "hitting home-runs." Most baseball players strike out, most of the time. 

Incremental improvement, based on a firm understanding of the needs and problems we are trying to address, seems a more solid way of developing ideas into software.

If we don't understand the needs and problems our customers have, can we really make a product people want to use, let alone pay for?

I get the idea of "revolutionary." I get the value of being the "first to market." I also get that ideas are never perfect, and building on good ideas can lead to better ones. Take a minute, gather some data, then load it into VisiCalc, the original first-to-market breakthrough, and see the odds of a breakthrough alone leading to lasting corporate success.

Good ideas grow on each other. If you set a fixed direction and channel every available resource into that, will you be able to respond to changes in the market, or world? Will you be able to respond to problems found in creating the breakthrough?

I find it helps to start with a general idea, a broad, fuzzy view of what the final product will look like. That view needs to become sharper and clearer as it is worked on. The tasks for next week need to be well understood - at least well enough to make progress toward the final goal.

What Does a Tester Know?

As a "quality guy" I have seen enough examples of well meaning, well intended direction carved in stone that results in chaos, if not disaster. The key to success, as I see it, is to remember that ideas are transitory. They shape and shift and change over time.

By keeping the conversation around the purpose of the product alive, and keeping every person involved "in the loop" as equals and as contributors to success, amazing things are possible.

The higher and more formidable the barriers, the less likely that conversation and communication will happen.