I have 1 (one, singular) conference speaking appearance for this year, and I am there now. I took the Amtrak Pere Marquette down from Grand Rapids to Chicago Union Station this morning.
I arrived at the conference hotel, the Palmer House, now a Hilton property and one of the grand, old hotels of Chicago. The venue is absolutely lovely - one of those grand, ornate buildings that now seem like too much effort for so many people. Alas for these times. (I'll be adding pictures later, maybe, I promise, sort of.)
For many years at conferences, I would spend massive amounts of time typing frantically, trying to live blog the conference as I experienced it. I started doing that mostly as a service to myself - I wanted to take notes, in depth, and found I could do so as a blog post and publish my thoughts/perceptions in an attempt to share them.
A few speakers over the years have been a tad annoyed at my comments. A smaller number have been downright upset. My reaction? If the harsh words in the blog seemed unfair, oh well.
This time, this conference, I'm going to try a different approach. I'll look for trends in conversations and sessions and see if I can make sense of them, and possibly find some lessons to be considered.
The first few days of the week, there were classes going on. Today is tutorial day; the conference itself starts tomorrow. I'm speaking in the first speaker slot after the opening keynote, then scurrying over to the "Mentoring Corner" to talk with new or potential speakers and help them with whatever I can.
That will consume a large portion of Wednesday, day 1 of the conference. Thursday, I need to leave early to deal with "day job" stuff, so I will miss the closing keynote. These things contributed to my "don't live blog" decision. They also helped me decide to focus on sessions with ideas I could bring back to colleagues, and ideas that might be applied more broadly than usually considered.
And so, we begin.
==
Today was a low-key day, thus far. Nice relaxing train ride in, connected to some meetings with my then employer (MI-GSO|Pcubed) and the client. Did a little paperwork, hung out and just rather rested. Fish & chips and a glass of local beer for lunch from one of the establishments in the hotel. Pretty good fish - haddock, nicely fried. Beer was a local whitsun ale. Not that you need to know that. Really, boss, I got the non-alcohol version! (I wonder if she'll believe me...)
Ran through my presentation a couple times. Each time I do it, I feel more comfortable - and I like how it has shaped up.
Took a wee nap, followed by a shower (amazing how those wake you up when tired.)
Then hung out and met friends and colleagues as they rolled in, either arriving or coming from the tutorial they were presenting or attending. Now, I'm sitting waiting for the "Pre-Conference Keynote" to wrap up the tutorial day and get people ready for the social events this evening.
Jose Diaz is on the stage, looking pretty much ready to dive in.
This session is "Real Life is Not an Edge Case," presented by Rachel Kibler (@racheljoi) who is one of the SpeakEasy speakers! SpeakEasy helps new or inexperienced speakers find their voice.
Cool.
Here we go.
==
The message from Rachel last night was a good one for many people to consider, and remember. In short, when we look only at the simple, straightforward path, we miss loads of opportunities to examine paths that will happen in real life, for the people using the software.
Our biases, and the biases of the designers and people building the application and systems, will impact how the software works - and the result may not be relevant to a large portion of the people using it. Those people can be variations on the bias-driven "core" user we imagined. They can also be under levels of stress we have not anticipated.
Perhaps, that is the single point - Stress testing of the application is one thing. Testing the application as a person who is UNDER stress is another. We need to be able to recognize the limitations we are putting on people having really bad days (not just a collection of things that get us down) - ambulances, sirens, police officers, hospitals, all count as "really bad days." How easy is our software to use? What messages are we sending to someone, based on the biased algorithm generating the UI/interactions?
==
Speaker dinner was amazing. I got to sit and visit with people I won't be able to hear speak - Bonnie Aumann (@bonniea) and, later, Jenny Bramble (@jennydoesthings) - and had a great conversation with Ray Arell (@elmoray).
Lean coffee this morning, and now an interesting keynote by Janet Gregory and Susan Bligh.
More later...
==
Interesting mix of experiences this morning. The energy between the keynote this morning and my talk was very similar. I spent the balance of the morning, and early afternoon, at the Mentoring Corner, aimed at helping would-be speakers at tech conferences hone their presentations - or, in one exciting case, hearing "I decided I could speak at a conference after coming here..."
Lunch was wonderful. Fajitas and conversation with Susan Bligh and others. Which made me late for Ash Coleman's keynote. Oops.
Context is Everything - Except "context" shifts in ways that most folks, including "testers," don't really expect, anticipate, or consider - like the intent of the customer/user, or possible underlying complexities that are based on the life experiences of the developers, designers, testers and customers.
We must "go beyond the context of the expected tester, and lean more into the context of oneself."
--
updated
--
OK, folks caught the main points of Ash's talk on Twitter pretty well. (Here, I made a link thing for you: https://twitter.com/search?f=tweets&vertical=default&q=%20%40AshColeman30%20%23AgileTDUSA&src=typd )
What struck me as significant was how context, perspective, life story, and experience all shift by person. What seems obvious to one is an earth-shattering revelation to another, and that is OK. Being aware that this MIGHT be happening can help us communicate.
==
An interesting and busy afternoon. I spent a pile of time, like, most of the afternoon, talking with people and looking at what they were doing and what they want to do.
And mayhem is about to commence in the closing keynote for the day - "Bad-Gile: The Game Show"
==
Mayhem was loads of fun, and provided valuable lessons for those who paid attention. In short, admitting and "owning" errors, mistakes and failures is hard. Recognize that Agile practice will rarely (never say never) be identical from one place to the next - no two projects, efforts or sets of work will look the same. Do not try and MAKE them look the same.
The evening reception and party were a great deal of fun. People in various forms of 1920s costumes, from flappers to competing Babe Ruths. The ball player, not the candy bar named for the ball player.
Loads of good conversations, "networking" and generally hanging with friends, some new, some we've known for some time.
===
Thursday morning, for me, was a quick breakfast followed by Lean Coffee. Lean Coffee was interesting - great conversations around important ideas - books, learning, continuing learning after a conference.
Now, this morning's keynote - Recognizing Cultural Bias in AI, presented by Camille Eddy (@nikkymill). Thoughts later!
==
That was an immensely intense presentation. Difficult to deliver, difficult for some in the room (I suspect) to hear, yet very needed. There are trends here in the messages from several speakers I need to think on. Expect a further blog post with some of these concepts.
==
Sitting in Ray Arell's session (@elmoray) on Scrum Must Die.
Thoughts on this later as well...
--
updated
--
It is later:
Ray's message is similar to some ideas I've been having of late. People see the learning frameworks as the ONLY way to do things, because that is something related to what their instructor said - or what they think they heard, whether or not they were paying close attention. Sadly, many training efforts ignore the principles of Agile and jump immediately to the ceremonies and ritual.
The result is similar to keeping training wheels on bicycles long after they are needed, because that is what "you are supposed to do," according to Ray. Sadly, I've seen many instances where that is the case.
The FIRST way people "experience Agile" becomes the only RIGHT way they know. IF they were not properly trained, or they half paid attention because they were too busy reading emails or doing "important real work," then, yeah. This is what you tend to get.
And this is a huge problem, contributing to commercialization and commoditization of some really powerful ideas. Because we can take anything that seems like a good idea and make snake oil out of it...
==
Next session, after some lovely conversations in the open space/mentoring space, Dan Billings (@thetestdoctor) is presenting on security roles in testing. Interesting opening (gotta love anything featuring James Doohan... well, William Shatner is there too, I guess...)
--
updated
--
Dan's talk was well received (given the number and range of people in the room, pretty much anything with Star Trek references would be well received!)
People caught a great deal of the highlights on twitter, so, here's a link - https://twitter.com/search?f=tweets&vertical=default&q=%40TheTestDoctor%20%23AgileTDUSA&src=typd
My general takeaway: we're not as secure or safe as we like to think we are, nor as we are often told we are. Be aware, build strong diverse teams, explore and follow threads, challenge the comfortable status quo (it likely isn't good enough).
To mix metaphors (not unlike the question "Where's Chewbacca?" asked at the end of the presentation): "Constant Vigilance" - Alastor Moody.
==
Dan's was actually the last session I attended. After that, I spent some more time sitting in conversation with people, gathering thoughts and preparing to head back home.
I gathered ideas into a collection of notes I will be writing up in another blog post - more on the ideas I encountered than anything else.
The venue for this conference was absolutely beautiful. Having organized conferences, however, I can see where it would be a challenge to coordinate smooth traffic flow and keep the "feeling" conferences tend to generate.
In spite of this, the organizers and staff did an excellent job of engaging people, making the learning fun and helping people build connections to further their understanding.
Saturday, May 11, 2019
On Being There
I remember a few years ago, a fellow wandered into a standup after being rather MIA for two weeks. As in, he was physically in the building, just ignored all the normal “rituals” around Agile, and Scrum in particular. Now, this was interesting, because people would see him in the hallways, and his calendar was always full, but the items were blocked so no one could see what they were, including the manager and team lead. He would not answer phone calls or return emails, and was mysteriously never at his desk.
Then came the day he showed up in a standup toward the end of the sprint.
“Hey, I don’t have any tasks on the board so I was doing other stuff.”
“Like what?”
“Oh, stuff that Bill asked me to do. It has been keeping me busy. What are you guys all doing?”
This was brought back to me very, very vividly a few weeks ago.
Since the first of the year, I’ve been working across Michigan, some 180 miles from where I live. So I drive to the office crazy early Monday morning, stay a few miles from the office location during the week, then drive home Friday night. This presents a bunch of challenges, as one might imagine. The family/home/pets/friends/relationship stuff is hard when you are home almost 72 hours per week and sleeping a fair portion of that time (as I am reminded often, I’m “not 40 anymore.”)
Then add to this trying to fit in with a bunch of pipe band drummers you are supposed to be working with, but that has turned into “giving guidance and offering suggestions” based on recordings, simply because practice happens evenings during the week, 185 miles from where I am. So, at one of the very, very rare weekend practices, I walked in and found… I was not fitting in. At all. That is really bad when you are supposed to all play things together, the same way.
Why? How’d this happen?
Simple. Minor little things were agreed on when the four of them were there, and worked on, and internalized – and then when I showed up, it was “Oh, yeah. Um, we changed that a little so now it is THIS. OK?” One or two of those are not a big deal. A bunch can be added over 3 months of work.
This is in addition to the stuff I WAS told about. “Oh, we changed the ending to this tune, so, here’s what it looks like. OK?”
The point of regular, weekly rehearsals is to keep everyone aligned and moving in the same direction. The goal is to make sure there are no major, and only a very few minor, differences in style, interpretation and technique. Most importantly, to make sure the shared vision and purpose are present and driving the approach to music and presentation.
This, by the way, is the purpose behind the “rituals” of Scrum – the standups, the refinement meetings, the retrospectives and the demos – moving toward a more perfect alignment in purpose and reason for the team and the product.
All it takes is one person missing then magically showing up to totally disrupt the results.
Tuesday, May 7, 2019
A ScrumMaster Has No Name
Let me start very directly for those who really don’t like reading my normal format blog posts:
If you are hoping that becoming a ScrumMaster will win you praise and broad recognition and honors and get you recognized as THE PERSON who saved these troubled projects or made the teams awesome, this is not the role, or the job, for you.
If you are hoping that becoming a ScrumMaster will get you the needed authority to compel people to obey your command and deliver awesomeness every 2 weeks, this is not the role, or the job, for you.
If you are hoping that becoming a ScrumMaster will result in teams improving performance, increasing quality and shortening time to delivery, and will get you the satisfaction of knowing you helped people make things happen, deliver items of value to their organization and its customers, and have the team not even notice you in the corner getting things to happen, then this might possibly be the role, or job, for you.
There.
That’s the gist.
Now to explain.
At one time, many large cities, usually European although some American, had such things as “Gentleman’s Clubs.” Now, these were not the ilk of certain business establishments today that advertise themselves as such. Instead, they were rather formal places intended to provide people of a certain order, shall we say, with a place where they could step in, find a welcome from the staff, have a beverage and maybe some refreshment, smoke a pipe (perhaps) or play a hand of cards or simply have conversation with polite, like-minded individuals.
These would be arranged and ordered much as society itself was – there were those that were for the “better sorts” (meaning wealthy, particularly coming from older, more established wealthy families) and others aimed at other levels of society – men of business, or whatever. They would discuss politics; indeed, rather famously there were a pair of such clubs in London, aimed at the very upper levels of Society, with very similar expectations (qualifications) to become a member. The great differentiator was what political party you supported: Tory or Whig. Yes, yes, yes. I am well aware there were radical followers of Wilkes who were reformist Whigs, of a sort, but still, those were the main two parties.
Anyway, you see my point, I think.
There have been “clubs” for some time.
Frankly, becoming a ScrumMaster does not get you into one of the “better clubs.” Walking into a room does not get you hushed awe, deferential bows, looks or even glances. Members of the “better classes” will not come over quickly to greet you or take your hand.
We are not the ones for whom such things are ordered.
We might very well be the ones making sure they are ordered and the members of the Club coming in are greeted appropriately and have their needs attended to. The members of the Club might know our name, typically the surname or family name, for that is how such things work.
“Ah, Walen. Very good to see you again. How are things this evening?”
“Oh, thank you sir. It is good to see you back as well. Mister/Doctor/Sir/Lord {name} is in the {some} room and asked I let you know should you arrive this evening. Can I help you with your coat, hat and walking stick? Can I have Barlett bring you your usual?”
Well, maybe we aren’t in a Georgian era Gentleman’s Club in London. (I’m fairly certain we are not, based on the fact that it has been a while since I’ve heard conversation of that manner that was not part of a stage production or a tongue-in-cheek pantomime.)
But the analogy holds, I think.
Our purpose as ScrumMasters is to help facilitate the work people do. Help them answer questions they are not sure how to frame, let alone ask, and help them discover more apt lessons than “Try vertical slicing of this story…” Because, much of the time, without an understanding of the work at hand, such things are rejected as buzz-word nonsense, and rightly so.
Our purpose as ScrumMasters is to help facilitate the communication that must happen on projects, large or small. We can ask “would a meeting with {person} or their manager help get these answers? Would you like me to set something up since they are not responding to emails or phone calls?”
Our purpose as ScrumMasters is to help remove roadblocks or impediments. The Scrum Guide (OK, I know LOADS of people have never actually read it, maybe it might be a good idea if you’re going to call yourself a ScrumMaster or Scrum Master or some variant of that) makes that really, really clear. We don’t need to be the one fixing the problem or removing the roadblock. Sometimes we might be.
My standing “humorous/joke advice” for newly minted Project Managers holds true for newly minted ScrumMasters/Scrum Masters/Whatever: “Make sure the coffee and tea are fresh and that people have their beverage of choice available. Bring it to them if need be.” (Yeah, I’ve offended people with that line.)
Back when I was writing and fixing customer-facing production code, sometimes in the middle of dealing with a serious problem, the thing that really would have helped, and the thing I could really use except my brain was too closely engaged in the problem at hand to stop and go get, was a cup of fresh coffee. Someone walking in and handing me a coffee (or tea or some carbonated & caffeinated beverage) when I was neck deep in broken code was a life-saver.
When team morale is an impediment, DO SOMETHING. Bring them coffee, tea, bagels, donuts, muffins, cookies/biscuits, sweets, nuts, fruit – SOMETHING. Let them know you are aware of what is happening, that you recognize you cannot help with the technical problems (unless you can) and you can contribute THIS to the effort.
When they are done and walking toward the door, don’t forget to give them the correct coat, hat and walking stick. When everyone has headed out, then make sure the room is tidy, then get your muffler and hat from the hook in the staff/servant’s room, and quietly head out the back door.
Because it is about the team. It is not about you.
A ScrumMaster has no name.
Friday, November 30, 2018
The Man's the Gowd For A' That
The title here is the last line of the first verse of the Robert Burns poem and song commonly referred to as "A Man's a Man." For the late 1790s, it reflected a huge portion of the Enlightenment's understanding of humankind.
Is there for honest Poverty
That hings his head, an' a' that;
The coward slave - we pass him by,
We dare be poor for a' that!
For a' that, an' a' that!
Our toils obscure an' a' that,
The rank is but the guinea's stamp,
The Man's the gowd for a' that.
What makes me think of this today?
Simple, a short phone call that was followed by a short conversation with my lady-wife.
I've been looking for a new software adventure for some time. Yes, I've had several opportunities come across the desk; many have not felt right to me. Some I applied for, and things have been slow in progressing. A week or so ago, a placement specialist/recruiter/head-hunter called me. He had seen the resume I submitted for a different position, and wondered if I would be interested in one that had come into their office that morning.
He then described the job I was looking for.
We talked about the generalities and then dug down into greater specifics, as these conversations tend to go. He said he'd run the information past his manager and get back to me. An hour or so later he called again. We chatted some more.
I made a couple minor tweaks to the cover letter and resume to tailor it better for this position (not making things up - it bugs me when people do that - but emphasizing work I took for granted that others not doing stuff with Agile or Scrum or Testing would be looking for).
We agreed on a billable rate and off we went.
We had a couple emails back and forth since then, just checking in.
Last night, as we were watching the fish in the fish tank (really, that is what we were doing) waiting for the "dinner's greatest hits" to warm up in the oven, the phone rang - it was him again.
"Hello, is this Pete?"
"Yes it is."
"Hi Pete, this is {him} we talked last week about submitting you for a position at {company}. Do you remember?"
"Of course, {him} I remember. How are you doing today?"
A simple polite nothing - small talk in some ways, but a bridge that is so important.
The change in tone and energy was immediate. From being rather mechanical, almost awkward, everything became much more human.
"I am good today, thank you for asking."
The manner of the conversation changed with that simple question. Recognizing him as a person, recognizing he was trying to do good work to support his family and, incidentally, help a client company connect with a candidate with specific skills.
We finished the business, I wished him a good evening at the end and the conversation ended.
My lady-wife was watching with great interest.
"His entire energy changed when you asked how he was doing, didn't it."
Yup. It did. At the end, you could almost hear him smiling.
Sometimes, such a small thing as asking how someone is, asked in a sincere manner, does more for that person than any other thing you could do right then. Such "polite nothings" are similar to the honorifics that once were part of everyday society.
"Good morning, Mr Jones."
"Good afternoon, Miss Radzikowska."
"Good evening, Ms Neal."
Giving people such a greeting sometimes feels awkward today, when many people cast off such artifice and defer to first names as being more "real" or "honest."
I'm not so sure.
I prefer to not abandon them out of hand and presume a familiarity that is not honestly present. Such things help keep the wheels and cogs of society moving as smoothly as possible, when they tend to be clunky at best.
Reach out with open handed kindness to another human person. Recognize them as worthy of respect and kindness. We don't know what they are struggling with themselves and sometimes small things might help them get through the day.
Be kind, even when it is hard for you to feel kind.
As Burns wrote over 200 years ago -
Then let us pray that come it may,
(As come it will for a' that,)
That Sense and Worth, o'er a' the earth,
Shall bear the gree, an' a' that.
For a' that, an a' that,
It's comin' yet for a' that,
That Man to Man, the world o'er,
Shall brothers be for a' that.
Monday, November 26, 2018
Testing, Limiting Failure and Improving Better
In this post, I wrote about demands and practices that lead to myriad problems in software development - even though every story (really, Backlog Item, but, whatever) is marked as "Done."
In this followup post, I wrote about things that can be done to mitigate the damage, a bit.
This is looking at the problems in the first post (above) and how this might be tied to answering the implied question in this post.
I suspect these are all tied together. I also suspect that people have been told by experts, or read a book, or talked with a peer at a large company who heard that a cool, Silicon Valley company is now doing this - and so they are jumping in so they can attract the best talent. Or something.
Let's talk about a couple of subtle points.
Testing
That is something all of us do, every day, whether we want to admit it or not. It may not be testing software, but it may be something else, like "I wonder what happens if I do this." At one time, most people writing production facing, customer impacting code were expected to test it - thoroughly. Then we'd get another person on the development team to test it as well. It was a matter of professional pride to have no bugs found by that other person, and a matter of pride to find bugs in other people's work.
I remember one Senior guy telling me "Before you hand this off to someone to test for you, make sure you have tested everything you possibly can. Document what you did and how, so that if something IS found in later testing, we can see what the difference is. Then the next time, you can test that condition when you test the others. Make them work hard to find something wrong."
That stuck with me and helps guide my thinking around testing - even 30+ years later.
Things have shifted a bit. Much of what we did then can be done fairly quickly using one or more tools to assist us - Automation. Still, we need some level of certainty that what we are using to help us is actually helping us. The scripts for the "automated tests" are software, and need diligent testing just as much as the product itself - the product we need to test so the company can sell it, customers are happy, and we don't get sued - or worse, brought up on criminal charges.
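To make that concrete: if the suite leans on a little retry helper (and most suites grow one eventually), that helper deserves tests of its own. A minimal sketch in Python with pytest, using a made-up "retry" function rather than any particular framework's:

import pytest

# A made-up helper of the sort automated suites accumulate: retry a flaky call.
def retry(action, attempts: int = 3):
    last_error = None
    for _ in range(attempts):
        try:
            return action()
        except Exception as error:  # broad on purpose - this is a sketch
            last_error = error
    raise last_error

def test_retry_eventually_gives_up():
    calls = []
    def always_fails():
        calls.append(1)
        raise RuntimeError("still broken")
    with pytest.raises(RuntimeError):
        retry(always_fails, attempts=3)
    assert len(calls) == 3  # it stopped when it should - not forever, not too soon

def test_retry_returns_first_success():
    results = iter([RuntimeError("flake"), "ok"])
    def flaky():
        item = next(results)
        if isinstance(item, Exception):
            raise item
        return item
    assert retry(flaky, attempts=3) == "ok"

If the helper silently swallowed the wrong exception or retried forever, every test that used it would be lying to us - which is exactly the "is the helper actually helping?" question above.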
Still, when done properly, automated test scripts can help us blow through mundane tasks and allow people to focus on the "interesting" areas that need to be examined.
OK, caveat #1 - check the logs generated by the tests - don't just check to make sure the indicator is green. There MAY be something else happening you have not accounted for. Just do a sanity check.
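Here is roughly what I mean by that sanity check - a small, standalone sketch, with a made-up log file name and made-up patterns, that refuses to trust a green indicator until the run log has been scanned for things the suite did not expect:

import re
from pathlib import Path

# Patterns that should never show up in a "passing" run.
# These are assumptions for the sketch - tune them to your own logs.
UNEXPECTED = [
    re.compile(r"\bERROR\b"),
    re.compile(r"\bWARN(ING)?\b"),
    re.compile(r"Traceback \(most recent call last\)"),
    re.compile(r"retry(ing)? request", re.IGNORECASE),
]

def sanity_check_log(log_path: str) -> list[str]:
    """Return any log lines that look suspicious, even if every test passed."""
    suspicious = []
    for line in Path(log_path).read_text(errors="replace").splitlines():
        if any(pattern.search(line) for pattern in UNEXPECTED):
            suspicious.append(line)
    return suspicious

if __name__ == "__main__":
    findings = sanity_check_log("test-run.log")  # hypothetical log file name
    if findings:
        print("Green build, but the log says otherwise:")
        for line in findings:
            print("  ", line)
        raise SystemExit(1)
    print("Log sanity check passed.")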
Then, while code is being worked on and unit tests are being prepared (presuming you are doing TDD or something similar), have someone NOT working on that piece of code look at the story (or backlog item, or whatever) and at the tests defined in TDD and ask "What would happen if this happened?"
Now, that could be something like an unexpected value for a variable being encountered. It could also be something more complex - for example, a related application changes the state of the data this application is working with.
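As a sketch of the kind of question that outside reader might turn into a test - the "parse_quantity" function here is invented purely for the example, and the interesting part is the list of values nobody planned for:

import pytest

# Hypothetical unit under test - stands in for whatever piece the team is building.
def parse_quantity(raw: str) -> int:
    """Parse a user-entered quantity; the happy path assumes a small positive integer."""
    value = int(raw.strip())
    if value <= 0 or value > 10_000:
        raise ValueError(f"quantity out of range: {value}")
    return value

# The outside reader's "what would happen if..." values, not the happy-path ones.
@pytest.mark.parametrize("raw", ["", "   ", "ten", "3.5", "-1", "0", "999999999999"])
def test_unexpected_values_fail_loudly(raw):
    with pytest.raises(ValueError):
        parse_quantity(raw)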
One approach I have used very often: look at a representation (mindmap/decision tree/state diagram) of what the software looks like before, and after, this piece is added. What types of transactions are being impacted by this change? Are there any transactions that should not be impacted? Does the test suite, as it is running, reflect these possible paths?
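Here is a tiny sketch of that before-and-after comparison, using a plain Python dictionary as the flattened state diagram; the states, the new "saved for later" path and the "covered" set are all invented for the example:

# Each model maps a state to the states it can reach - a flattened state diagram.
BEFORE = {
    "cart": {"checkout"},
    "checkout": {"payment"},
    "payment": {"confirmed", "declined"},
}

# The new piece adds a "saved for later" path out of the cart.
AFTER = {
    "cart": {"checkout", "saved"},
    "saved": {"cart"},
    "checkout": {"payment"},
    "payment": {"confirmed", "declined"},
}

def transitions(model):
    return {(src, dst) for src, dsts in model.items() for dst in dsts}

# Transitions the current automated suite claims to exercise (hypothetical).
COVERED = {("cart", "checkout"), ("checkout", "payment"), ("payment", "confirmed")}

new_paths = transitions(AFTER) - transitions(BEFORE)
untested = transitions(AFTER) - COVERED

print("Transitions introduced by this change:", sorted(new_paths))
print("Transitions the suite does not yet reflect:", sorted(untested))

The output is the conversation starter: here is what this change adds, and here is what the suite, as it runs today, does not look at.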
Has someone evaluated the paths through the code? Beyond simply line and branch coverage, how confident are you in understanding the potentially obscure relationship between the software and, say, the machine it is running on going to sleep?
Have you explored the behavior of the software, not simply if it "works?" Are there any areas that have not been considered? Are there any "ghosts in the machine"? How do you know?
Testing in Almost-Agile
I have been told by many "experts" that there is no testing in "Agile." They base this, partly, on the language or wording in the Agile Manifesto. They base it partly on the emphasis on everyone being responsible for quality in a "whole team" environment.
Some even point to the large Silicon Valley companies mentioned earlier who state very publicly they "don't have any testers."
Yet, when pressed, there are often clarifying statements like "We don't have testers in the Scrum teams, because we rely on automation for the Function testing." Here is where things kind of break down.
No "testers" in the "Scrum teams" because people write automation code to test the functions. When asked about "Integration Testing" or Load or Performance or Security or any of the other aspects of testing that testing specialists (sometimes referred to as "Testers") can help you with, do really well, and limit exposure to future problems, and possibly future front pages and lawsuits - the response often is "We have other teams do that."
Wait - What?
The "Scrum Team" declares a piece of work "Done" and then at least one or two other teams do their thing and demonstrate it is not really "Done"?
Mayhap this is the source of "Done-Done"? Is it possible to have Done-Done-Done? Maybe, depending on how many teams outside of the people developing the software there are.
That sounds pretty Not-Agile to me - maybe Almost-Agile - certainly not Scrum. It sounds much more like one of the Command-and-Control models that impose Agile terms (often Scrum) on top of some form of "traditional software development methodology" like "Waterfall." Then they sing the praises of how awesome "Agile" is and how much everything else stinks - except they are busy in stage gate meetings and getting their "Requirement Sprint" stuff done and working on their "Hardening Sprints."
Another Way - Be Flexible
Look at what the team can work on NOW for the greatest impact to the benefit of the project.
What do I mean? Let's start with one thing that the customer (in the person of the product owner or some other proxy) really, really wants or needs - more than anything else.
Figure out the pieces to make that happen - at least the big ones;
make sure people understand and agree on what the pieces mean and what they really are.
Then pick a piece - like, the one that looks like it will deliver the biggest bang NOW -
OR - one that will set you up to deliver the biggest bang in the next iteration;
Then, figure out the following (a rough sketch of how these checks might be organized appears after the list)...
- how to know if that piece works individually;
- how to know if that piece works with other pieces that are done;
- how to know if that piece will negatively impact the way the whole package is supposed to work;
- how to know if that piece might open a security vulnerability;
- if there is a way to find any unexpected behaviors in this piece or by adding this piece to what has been done.
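One way that list might be made visible in the automated suite - a sketch only, with an invented "apply_discount" function and pytest markers I made up - is to tag each check with the question it answers, so the team can see at a glance which questions a piece actually has answers for:

import pytest

# Hypothetical piece of work, invented for the sketch.
def apply_discount(price: float, percent: float) -> float:
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

# Markers are invented; register them (e.g. in pytest.ini under "markers =")
# so pytest does not warn about unknown marks.
@pytest.mark.piece          # "does this piece work individually?"
def test_discount_applies_to_single_item():
    assert apply_discount(100, percent=10) == 90

@pytest.mark.integration    # "does it work with other pieces that are done?"
def test_discounted_total_matches_cart_sum():
    prices = [100, 50]
    assert sum(apply_discount(p, percent=10) for p in prices) == 135

@pytest.mark.security       # "might it open a vulnerability?" - here, abuse of the input
def test_discount_cannot_exceed_100_percent():
    with pytest.raises(ValueError):
        apply_discount(100, percent=250)

The last question in the list - finding genuinely unexpected behavior - does not reduce to a marker; that is the exploratory work discussed below.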
I'm not advocating doing this all at once for every piece/ticket/story/whatever. I am suggesting these be defined and worked on and then actually executed or completed before any task is labelled "Done."
Some of these may be easily automated - most people automate the bit about each piece "works individually."
If your team does not have people on it with the skill sets needed to do these things, I'd suggest you are failing in a very fundamental way. Can I really say that? Consider that Scrum calls for cross-functional teams as being needed for success. Now, it might be that you also need specialists in given areas and you simply can't have 1 per team - but you can share. Through decent communication and cooperation, that can be worked out.
Still, the tasks listed above will tend to be pretty specific to the work each team is doing. The dynamics of each team will vary, as will the nature of some of the most fundamental concepts like - "does it work?"
Of these tasks, simple and complex alike, perhaps the most challenging is the last one in the list.
Is there a way to find unexpected behavior in this individual piece we are working on? Is there a way to find unexpected behavior when it gets added to the whole?
These are fundamentally different from what most people mean by "Regression Testing." They are tasks that are taken up in the hopes of illuminating the unknown. We are trying to shine a light into what is expected and show what actually IS.
But, who has those kinds of skills?
They need to be able to understand the need of the customer or business AND understand the requirements and expectations AND understand the risks around poor performance or weak security, and the costs or trade-offs around making these less vulnerable. They need to be able to understand these things and explain them to the rest of the team. They need to be able to look beyond what is written down and see what is not - to look beyond the edge of the maps and consider "what happens if..." Then, these people need to be able to share these concepts and techniques with other people so they understand what can be done, and how it can be done differently and better.
These things are common among a certain group of professionals. These professionals work very hard honing their craft, making things better a little at a time. These people work hard to share ideas and get people to try something different.
They are often scorned and looked down upon by people who do not understand what the real purpose is.
These people have many names and titles. Sometimes they are "Quality Advocates" other times they are "Customer Advocates." However, they are commonly called Testers.
Monday, November 19, 2018
Moving From Failure to Better
In this blog post I described scenarios I have seen play out many times. Official mandates based around some understanding of Scrum, some version of "Best Practices" and fairly shallow understanding of software development and testing.
If we stop there, it appears that there is no avoiding the traps that lead to failure of the sprint and the work the sprint is supporting. But, there are options to make things a wee bit better.
Common Option 1: Hardening Sprints
I know - the point of Scrum is to produce regular product increments that can be potentially released to a customer or the production environment or some other place. For many large organizations, the idea of incremental improvements, particularly when it comes to their flagship software, seems anathema.
The result is bundling the work of many development teams from many sprints into one grand release.
When each team looks up and outside their silo for the first time after a sprint, or four, the collected product increments (new version of the software) are pulled together. The next step is often something like a "hardening sprint" to exercise all the pieces that were worked on from all the teams and make sure everything works.
As much as this violates Scrum orthodoxy, I can see where this might seem a really good idea. After all, you have the opportunity to exercise all the changes en masse and try to work them with as close to "real world" activity as possible in a test environment.
The problem I see many, many times, is each team simply reruns the same automated scripts they ran when pushing to finish the sprint and get to "Done." The interesting thing to me is that sometimes bugs are still found, even when nothing has "changed."
This can come from any number of causes, from the mundane - data that was expected to have certain values has been changed - to the interesting - when team X is running part of their tests while team Y is running part of theirs, unexpected errors are encountered by one, or both, teams.
Another challenge I have seen often is whether people remember what was done early in the cycle - possibly months before the "Hardening Sprint" started. Some changes are small and stand alone. Some are built on by later sprints. Do people really remember which was which? When they built their automated acceptance tests, did they update the tests for work done earlier in the iteration?
In the end, someone, maybe a Release Manager, declares "Done" for the "Hardening Sprint" and the release is ready to be moved to production, or the customer, or, wherever it is supposed to go.
And more bugs are found, even when no known bugs existed.
Less Common Option 2: Integrating Testing
In a growing number of organizations, the responsibility for exercising how applications work together, how well they integrate, is not under the purview of the people making the software. The reasons for this are many, and most of them I reject out of hand as being essentially Tayloristic "Scientific Management" applied to software development.
The result is people run a series of tests against various applications in a different environment than they were developed in, and sending the bugs back to the development teams. This generally happens after the development team has declared "Done" and moved on.
Now the bugs found by the next group testing the software come back and get pulled into the backlog, where presumably they get selected for the next sprint. Now it is two weeks at least since they were introduced, probably four and likely six - depending on how long it takes them to get to exercising new versions.
What if, we collaborated?
What if we recognize that having a group doing testing outside of the group that did the development work is not what the Scrum Guide means when referring to Cross-functional teams? (Really, here's the current/2017 version of The Scrum Guide)
What if we ignore the mandates and structure and cooperate to make each other's lives easier?
What if we call someone from that other team, meet for a coffee, maybe a donut as well, possibly lunch, and say something like "Look. It sucks for us that your tests find all these bugs in our stuff. It sucks for you that you get the same stuff to test over and over again. Maybe there's something to help both of us..."
"Can we get some of the scripts you run against our stuff so we can try running them and catching this stuff earlier? I know it means we'll need to configure or build some different test data, but maybe that's part of the problem? If we can get this stuff running, I think it might just save us both a lot of needless hassle. What do you think?"
Then, when you get the new tests and test data ready and you fire them off - check the results carefully. Check the logs, check the subtle stuff. THEN, take the results to the team and talk about what you found. Share the information so you can all get better.
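For what that might look like in practice - every path and script name here is invented - a small runner that builds the test data the borrowed suite expects, runs the suite, and keeps the log around for that careful read:

import subprocess
import sys
from pathlib import Path

# All paths and commands are invented for the sketch - substitute the
# scripts and data setup the other team actually shares with you.
BORROWED_SUITE = Path("integration-team/smoke_suite.py")
DATA_SETUP = Path("integration-team/build_test_data.py")
LOG_FILE = Path("borrowed-suite.log")

def run(step: Path) -> int:
    """Run one borrowed script, appending its output to the shared log."""
    with LOG_FILE.open("a") as log:
        result = subprocess.run(
            [sys.executable, str(step)], stdout=log, stderr=subprocess.STDOUT
        )
    return result.returncode

if __name__ == "__main__":
    LOG_FILE.write_text("")  # start each run with a clean log
    if run(DATA_SETUP) != 0:
        sys.exit("test data setup failed - fix this before blaming the tests")
    status = run(BORROWED_SUITE)
    print(f"borrowed suite finished with status {status}; now go read {LOG_FILE}")
    sys.exit(status)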
Not everyone is likely to go for the idea. Still, if you are willing to try, you might just make life a little better for both teams - and your customers.
Your software still won't be perfect, but it will likely be closer to better.
I've seen it.
I've done exactly that.
It can work for you, too.
Try it.
If we stop there, it appears that there is no avoiding the traps that lead to failure of the sprint and the work the sprint is supporting. But, there are options to make things a wee bit better.
Common Option 1: Hardening Sprints
I know - the point of Scrum us to produce regular product increments that can be potentially released to a customer or the production environment or some other place. For many large organizations, the idea of incremental improvements, particularly when it comes to their flagship software, seems anathema.
The result is bundling the work of many development teams from many sprints into one grand release.
When each team looks up and outside their silo for the first time after a sprint, or four, the collected product increments (new version of the software) are pulled together. The next step is often something like a "hardening sprint" to exercise all the pieces that were worked on from all the teams and make sure everything works.
As much as this violates Scrum orthodoxy, I can see where this might seem a really good idea. After all, you have the opportunity to exercise all the changes en masse and try and work it with as close to "real world activity" as possible in a test environment.
The problem I see many, many times, is each team simply reruns the same automated scripts they ran when pushing to finish the sprint and get to "Done." The interesting thing to me is that sometimes bugs are still found, even when nothing has "changed."
This can be from any number of causes from the mundane, finding data that was expected to have certain values has been changed, to interesting, when team X is running part of their tests when team Y is running part of their tests, unexpected errors are encountered by one, or both teams.
Another challenge I have seen often is whether people remember what was done early in the cycle - possibly months before the "Hardening Sprint" started. Some changes are small and stand alone. Some are built on by later sprints. Do people really remember which was which? When they built their automated acceptance tests, did they update the tests for work done earlier in the iteration?
In the end, someone, maybe a Release Manager, declares "Done" for the "Hardening Sprint" and the release is ready to be moved to production, or to the customer, or wherever it is supposed to go.
And more bugs are found, even when no known bugs existed.
Less Common Option 2: Integrating Testing
In a growing number of organizations, the responsibility for exercising how applications work together, how well they integrate, is not under the purview of the people making the software. The reasons for this are many, and most of them I reject out of hand as being essentially Tayloristic "Scientific Management" applied to software development.
The result is that people run a series of tests against various applications, in a different environment than the one they were developed in, and send the bugs back to the development teams. This generally happens after the development team has declared "Done" and moved on.
Now the bugs found by the next group testing the software come back, get pulled into the backlog, and presumably get selected for the next sprint. By then it is at least two weeks since they were introduced, probably four, and likely six - depending on how long it takes that group to get around to exercising the new versions.
What if we collaborated?
What if we recognize that having a group doing testing outside of the group that did the development work is not what the Scrum Guide means when referring to Cross-functional teams? (Really, here's the current/2017 version of The Scrum Guide)
What if we ignore the mandates and structure and cooperate to make each other's lives easier?
What if we call someone from that other team, meet for a coffee, maybe a donut as well, possibly lunch, and say something like "Look. It sucks for us that your tests find all these bugs in our stuff. It sucks for you that you get the same stuff to test over and over again. Maybe there's something to help both of us..."
"Can we get some of the scripts you run against our stuff so we can try running them and catching this stuff earlier? I know it means we'll need to configure or build some different test data, but maybe that's part of the problem? If we can get this stuff running, I think it might just save us both a lot of needless hassle. What do you think?"
Then, when you get the new tests and test data ready and you fire them off - check the results carefully. Check the logs, check the subtle stuff. THEN, take the results to the team and talk about what you found. Share the information so you can all get better.
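If it helps to picture that "check carefully" step, here is a rough Python sketch - the command, log file name, and patterns are placeholders I made up, not anyone's actual tooling. It runs the borrowed suite, keeps the whole log, and pulls out lines worth discussing even when every test passes.

# Rough sketch: run a borrowed test suite and look past pass/fail.
# The command, log path and patterns below are placeholders.
import re
import subprocess
from pathlib import Path

SUITE_CMD = ["pytest", "borrowed_suite/", "-q"]   # placeholder command
LOG_FILE = Path("borrowed_suite_run.log")

# Things worth raising with the team even when every test "passes."
SUSPICIOUS = [
    re.compile(r"Traceback \(most recent call last\)"),
    re.compile(r"\bWARNING\b"),
    re.compile(r"deprecat", re.IGNORECASE),
    re.compile(r"\bretry", re.IGNORECASE),
]


def run_and_review() -> None:
    # Run the suite, capturing everything it prints.
    result = subprocess.run(SUITE_CMD, capture_output=True, text=True)
    LOG_FILE.write_text(result.stdout + result.stderr)

    # Scan the log for the subtle stuff, not just the exit code.
    findings = [
        line.strip()
        for line in LOG_FILE.read_text().splitlines()
        if any(pattern.search(line) for pattern in SUSPICIOUS)
    ]

    print(f"exit code: {result.returncode}")
    print(f"lines worth a second look: {len(findings)}")
    for line in findings[:20]:   # enough to start a conversation
        print("  ", line)


if __name__ == "__main__":
    run_and_review()

The script matters far less than what you do with the output: it gives you something concrete to bring back to both teams and talk about.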
Not everyone is likely to go for the idea. Still, if you are willing to try, you might just make life a little better for both teams - and your customers.
Your software still won't be perfect, but it will likely be closer to better.
I've seen it.
I've done exactly that.
It can work for you, too.
Try it.
Sunday, November 18, 2018
Grand Pronouncements, Best Practices and the Certainty of Failure
Many times, those "in charge" will issue mandates or directives that seem perfectly reasonable given specific ideas, conditions and presumptions. We've seen this many, many times on things related to software development and testing in particular.
1 July, 1916.
British infantry launched a massive assault in France along the Somme River. It was a huge effort - days of artillery bombardment intended to destroy German trenches and defensive positions, as well as destroy the barbed wire obstacles in front of the German positions.
The best practices mandated by High Command included forming ranks after scrambling "over the top" of the trenches, then marching across no-man's-land, overcoming what would be left of the German defenses, and capturing the German positions - breaching the lines and opening a hole miles long through which reinforcements could pour, sending the Germans reeling backward in defeat. Troops in the first two waves were promised that field kitchens would follow behind them with a hot dinner, and that supplies of ammunition and more field rations would follow.
Brilliant plan. Conformed to all the official Best Practices of the day. In a training setting, the planners would have gotten very high marks indeed.
One very minor issue: it was based completely on unrealistic presumptions.
It did not work. Thousands were killed on the first day. Entire battalions simply ceased to exist as viable combat units. Some, like the Newfoundland Regiment, were destroyed trying to get to their launch point.
With luck, the best practices and directives you are getting are not in the same scale of life and death.
Being Done
What I have seen time and again are mandates for a variety of things:
- All sprints must be 2 weeks long;
- Each team's Definition of Done MUST have provisions that ALL stories have automated tests;
- Automated tests must be present and run successfully before a story can be considered "done;"
- There is a demand for "increased code coverage" in tests - which means automated tests;
- Any tests executed manually are to be automated;
- All tests are to be included in the CI environment and into the full regression suite;
- Any bugs in the software means the Story is not "Done;"
- Everyone on the team is to write production/user-facing code because we are embracing the idea of the "whole team is responsible for quality."
Let me say that again.
- All "user stories" must have tests associated with them before they can be considered "Done;"
- Manual tests don't count as tests unless they are "automated" by the end of the sprint;
- All automated tests must be included in the CI tests;
- All automated tests must be included in the Regression Suite;
- All automated tests must increase code coverage;
- No bugs are allowed;
- Sprints must be two-weeks;
- Everyone must write code that goes into production;
- No one {predominantly/exclusively} tests, because the "whole team" is responsible for quality.
It seems to me organizations with controls like these tend to have no real idea how software is actually made.
There is another possibility - the "leaders" know these will be generally ignored.
Unfortunately, when people's performance is measured against things like "automated tests for every story" and "increased code coverage in automated tests," people tend to react precisely as anyone who has considered human behavior would expect - their behavior and work change to reflect the letter of the rules whilst ignoring the intent.
What will happen?
Automated tests will be created to demonstrate the code "works" per the expectation. They will be absolutely minimalist - "Happy Path" tests that confirm the software "works."
Rarely will you find deep, well-considered tests in these instances, because they take too long to develop, exercise, evaluate (as in, see whether they are worth further effort) and then implement.
With each sprint being two weeks, and a mandate that no bugs are allowed, the team will simply not look very hard FOR the bugs.
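As an illustration only - a made-up function and test, not anyone's real code - this is what a letter-of-the-rules test tends to look like. It runs in CI, it counts toward coverage, it passes every sprint, and it never goes near the inputs real users will send.

# Made-up example of a "Happy Path" test that satisfies the mandates.
def apply_discount(price: float, discount_percent: float) -> float:
    # No validation: a discount over 100 yields a negative price, and a
    # negative discount silently becomes a markup.
    return price - price * discount_percent / 100


def test_apply_discount_happy_path():
    # Ticks every box: the story has an automated test, it runs in CI
    # and the regression suite, and coverage goes up - this one call
    # covers 100% of the function.
    assert apply_discount(100.0, 10.0) == 90.0


# The cases nobody writes under a two-week, no-bugs-allowed crunch:
#   apply_discount(100.0, 0.0)     boundary - is zero a discount?
#   apply_discount(100.0, 100.0)   free - was that the intent?
#   apply_discount(100.0, -25.0)   a markup wearing a discount's clothes
#   apply_discount(-5.0, 10.0)     refunds, anyone?

Tests like this one meet every condition on the list below; the questions in the trailing comment are the ones no one is given time to ask.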
When these things come together, all the conditions will be met:
- All the tests will be automated;
- All the tests will be included in the CI environment;
- All the tests will be included in the (automated) Regression suite;
- Code coverage will increase with each automated test (even if ever so slightly);
- Any bugs found will be fixed and no new ones will be discovered;
- Everything will be done within the 2 week sprint.
If this is delivered to some other group, it is probable that group will howl about the product. They will likely hound your support people and hammer on them. Expect them (or, more likely, their manager/director/big-boss) to hammer on your boss.
But, the fact remains, all the conditions for "Done" were met.
And together, they ensured failure.