
Tuesday, May 15, 2012

On Too Much Process or Too Much, Meh, Whatever

Process - noun - 
   1. a series of actions or steps taken in order to achieve
       a particular end

    (Oxford English Dictionary)

So, a fair number of people have heard me discuss excitedly (a euphemism for rant) how too much process stuff gets in the way of actually getting things done.  I am pretty well set in the idea that process should not be a controlling thing or a limiting thing, but a guiding thing.

There is a significant difference between the two of them.  Some people don't want to accept that.

Kind of like some people define "Agile Software Development" as 1) no documentation and 2) no testers.

When people allow the trappings of process to overtake its entire point - typically the facilitation of whatever the process is supposed to, well, facilitate - and the process becomes more important than anything else, you don't have one problem, you have a collection of them.

The opposite end, well, makes bedlam look reasonable.

Having no controls and no processes in place can, and I suspect will, lead to its own problems.  The point of codifying processes, the "how we do stuff," is to make sure nothing gets missed: no steps left out, nothing critical that will impact the customer experience (like .JAR files not being where they are supposed to be), and other small things like that.
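To make that a bit more concrete: the codified version of "make sure the .JAR files are where they are supposed to be" can be as small as a check that runs before anything ships.  Here is a minimal sketch in Python - the directory and file names are made-up examples for illustration, not anything from a real project:

# Minimal sketch of codifying one small process step: verifying that
# required release artifacts (like the .JAR files mentioned above) are
# actually where they are supposed to be before a release goes out.
# The paths below are hypothetical examples.
from pathlib import Path
import sys

REQUIRED_ARTIFACTS = [
    Path("build/libs/app-core.jar"),      # hypothetical core library
    Path("build/libs/app-services.jar"),  # hypothetical services library
]

def missing_artifacts(artifacts):
    """Return the artifacts that do not exist or are empty."""
    return [p for p in artifacts if not p.is_file() or p.stat().st_size == 0]

if __name__ == "__main__":
    missing = missing_artifacts(REQUIRED_ARTIFACTS)
    if missing:
        for p in missing:
            print(f"Missing or empty artifact: {p}")
        sys.exit(1)  # fail the release step so nothing gets missed
    print("All required artifacts are in place.")

The point is not the script itself - it is that the "how we do stuff" step is written down somewhere it cannot quietly be skipped.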

A process can also help us be certain that everyone who needs to know about something actually knows before that thing happens.  People, myself included, often talk about process as something that is ungood.  I've tempered this over time: process can help us, and it can also hinder us - and the specifics of the situation will no doubt have a direct impact on which one it is.

What often gets lost in the shuffle when talking about process, or some development methodology or approach, is that the point of most software processes is to get things done that need to be done, and to make sure people know - I kind of said that a bit before, I know.  But if you look carefully at the second portion of that, the bit about process helping to "make sure people know," then that sounds a lot, to me at least, like communication.

People forget that communication is less about what is said and more about what is understood - what is heard on the receiving end.

If you have good communication and people succeed in hearing A when A is said, or written in a report or email or... yeah, you get the idea, then process forms a framework to operate within.  When people don't communicate, process may help - but I suspect it just adds to the noise.  More emails and reports to ignore, more meetings to sit in and do something else whilst they are going on, more of the same Dilbert-esque pointy-haired-boss stuff.

Even companies that quite publicly talk about their lack of formal process have processes - they have rules that people work within, frameworks that guide them and their activities.

I suspect where I draw the line for processes that are useful and those that get in the way is the willingness of the staff - the people who are directly impacted - to follow and adhere to the given processes.

I prefer processes that are organic - that grow and develop based on the experience and relationships among the people doing the work.

I object to processes which are imposed by someone, or a group of someones, who have never actually done the work (except maybe as an exercise in school or a single project in the "real world") but have read about, or attended a workshop on, or talked with someone at "the club" about some best practice that involved some stuff.  Whatever that is.

If people want to have a thoughtful discussion around what can work for their organization, team, company, whatever, I'd be extremely happy to participate.  If you tell me things must be done this way because of some study or best practice or whatever, don't be surprised if I ask what was studied and what practices were compared to determine which one was best. 

Sunday, April 22, 2012

On Passion, or Be Careful What You Wish For

Recently I was reminded of something that was said several years ago.

The Several Years Ago part: I was in the middle of a project that was simply not going well - in fact, it was a bit of a train-wreck.  Nah, not a bit.  It was a complete and total train-wreck.  Pick something that could go wrong and it did.  In spades.

Yours truly was QA Lead and was overwhelmed.  A "target rich environment" does not begin to describe what was going on.  Massive effort, huge amounts of overtime to try and control the damage, stop the flooding, stop the bleeding, well, pick a metaphor.

Fact was, the testers were putting in a lot of effort and, frankly, not many others were.

So, sitting having an adult beverage, or several, with one of the development managers on the project, he looked at me and said, "Pete, you have a real passion for what you do.  You're better at testing and understand software better than an awful lot of people I've worked with.  You are really passionate about what you do.  That is great.  Be careful though.  If you're too passionate you can burn out."

That struck me as odd, at the time anyway.  How can one be "too passionate"?  Is it possible that one can be too involved? Too close to the work? Too passionate?

After all, we have a lot to do and scads of work and... whoa.  Why is it that some folks are diving in and going full bore and others are, well, sliding by and doing what seems to be the minimum?  Why is it that some people are just, well, not as deeply into making the project work as others?

The Reminder part:  So, talking with another tester I know, she was muttering about a project where the developers just did not seem to care about deadlines, the quality of the project, the impact on, well, performance reviews, raises, bonuses, and the like.  She looked at me and said, "It's like they just don't care!"

So, why is it that some people just are not as deeply into making the project work as others?  I don't know.  Maybe it depends on what is expected, or what the normal approach is for the shop or company or, whatever.  Maybe it depends on the nature of the project leadership.  Are people being managed, or are they being controlled and compelled?

While what is often called craftsmanship seems hard to find these days, in some places (maybe many places, I don't know), I remember hearing many people speak passionately about being, well, passionate - as a tester, as a developer, or as whatever it is that each one of us is.

I got to thinking some more Friday night and generally over the weekend about this.

When looking for places where everyone is passionate about their work, what does that look like?  How do you know when you find it?  I used to think I knew.  I've worked at places where the majority of people were very passionate about what they did.  They wrapped much of their view of their self-worth into their work - so if the project was a success, their efforts were "worth it."

Then, I started wondering what a successful project looked like.  I suspect it rather depends on the software development group's target audience.  Are the people who will be using the results of your work all working for the same company you are?  If so, "market" is a hard concept - unless the results of their work, with the new system, improve so much that the company as a whole performs better because of the many long hours and weekends in the office and ... yeah, you get it.

If the company makes software that will be bought by other companies for use in their business, the combination of sales, licenses, recurring/renewal of contracts around the software and the like will be one measure of how your efforts contributed to a successful project.  Likewise, the customer-companies being able to conduct their business better, more efficiently, is another measure of the success of the project.

And so, what about the other signs?  What about the places where people are not passionate about their work?  What do they look like?

Examples of those are easier to find...

People use "process" as an excuse to not do something.  "I'd love to do this, but I can't do X until D, F and L are in place.  That is what the process is."  (Whether its true or not does not seem to matter.)

People lock into rituals and stay there.  Arrive 5 minutes after the "start time"; start the laptop/desktop computer; get coffee; drink coffee; eat breakfast; sign on to the network; get more coffee; sign on to email (personal)... etc.; leave 10 minutes before the official "stop time" to "avoid the rush."  Use the "well, I work a lot of extra hours from home and over the weekend" reasoning.  (Oh, and the laptop is still in its dock on the desk as they head home.)


The appearance of work counts more than actually doing work.  Lots of reports being filed, status reports, progress reports, papers being shuffled up to leads and supervisors and managers and, of course, process management.  This is different than using process as an excuse to not do something.  This is taking the literal process and ignoring the intent.

Heroic Behavior is rewarded more than steady, solid work.  Now, I'm not down on heroes.  I've been in that role, and was recently called a hero as well.  I mean the false heroes, the ones who dawdle and obfuscate and put things off and delay, and miss interim deadlines and miss delivery deadlines - partly by using the first three behaviors - and then work massive hours the last week of a project to pull things together and deliver something - and let everyone know how hard they worked to "make this happen."

I bet you can come up with a bunch of other examples.  I stopped there simply because, well, I did.


Now, What to Do?  If you find yourself working at a shop or department or company like the ones described above - where it seems you are the only one who cares - what do you do about it?  Ask yourself, "Has it always been this way?"  Maybe something changed recently, or not so recently.  Maybe the change has been gradual.

Sometimes, it takes you being the one to be burned by this behavior to notice it.  Sometimes it has been going on with some people and not others and it is your turn to work with these people and - what a mess.

You can say "Maybe they learned their lesson from this and the next time will be better."

Don't bet on it.  There is likely some other reward system in play that they value more than the rewards that workmanship, craftsmanship, and a passion for doing good-quality work can provide.  Ironically, they may get rewarded by their supervisors for being heroes (even though they created the situation that needed heroes) or for "preserving the process" or, whatever.

So, back to what to do.

Your choices are limited.


You can try to "change the culture."  This is easier in small companies than in large Borg-like companies that grow by assimilating small companies into the Collective.  I know people who have tried to do this.  Some were successful; those dealing with the Borg Collective were less so.

You can try to "change the environment."  Here I include "process" as well as the nature and flow of the work and communication.  You can ask questions and field inquiries and take part in improvement task forces and, and, and... don't let the project slip.  I know people who have tried this - myself included.  It may work, you may feel more engaged and more aligned with improving the company.  At some point you may look back ans wonder what has been accomplished.

You can stop resisting - Accept it for what it is.  Turn off independent thought and go with the flow.  Collect the paycheck, take the "motivational development" courses and continue to collect the paycheck.

Your nuclear option - Leave.  Go somewhere else.  That is what I did with the company in the first part of this post.  I packed it in.  I do not regret it.  My other options seemed so improbable.  I tried them - the engage thing, the culture change thing.  I could not bring myself to stop resisting.

Please, never choose to stop resisting.  Never conform that much.  We are testers.  We cannot be good testers if we stop questioning, and that is what that option requires.

Sunday, March 4, 2012

Process and Ritual and Testing, Oh My.

I've been having some interesting conversations lately.  Well, I have a lot of interesting conversations, so that is not so unusual.  The interesting thing about these is that they have been, well, interesting. 

Not interesting in the way that some conversations on some online testing forums are interesting.  Not interesting in the way that some conversations in LinkedIn groups are interesting (you know the ones - where someone posts a question to get folks to start answering, then the person who posted the question shows how smart they are and gives the "right" answer...).

These conversations were around "Process" and "Best Practices" and things of that ilk.  Now, most of you who know me will realize that I take a dim view of 99.99999% of the "practices" that are labeled as "best."  I concede that in some situations, there may be something I am not aware of that can be considered a "best practice" - in the literal definition, not the buzz-wordy definition.

Where was I?  Ah, yes.

These conversations were debating/arguing/asserting/rejecting the need for control and repeatability and measurability in software testing.  What I apparently failed to comprehend was that sometimes these things must be done in order to make sure the product is of "good quality."  I kept asking, "How is it that this practice ensures the quality of the product?  Is there another practice that could give you the same results?"

The answer reminded me of a passage from a pulp-fantasy-fiction book series I read a long time ago.  You see, there was this particular race of dwarves who weren't terribly bright.  One of them found a secret passage.  At the time she found it (yes, there are female dwarves), she was carrying a dead rat (seemingly for supper) and triggered the locking mechanism by accident.  This opened the door that let her take this "secret short-cut."

Well, she was making the main characters in the book take an oath that they would never divulge the magic of the passage.  One of them mentioned the trigger, which he had noticed; she insisted it was magic.  She pulled out the dead rat, waved it in front of the door - then stepped on the trigger.  POOF!  The door opened!

In her mind, the ritual of waving the dead rat and then stepping just so on the floor was what opened the door.  The others (outside observers) noticed the real cause of the door opening.

Because it did them no harm to allow her to hold on to her certainty, they let it go.

Now, in software testing, we sometimes find ourselves in the situation of the not-too-bright female dwarf.  It worked this way the first time, therefore, this is the one true way to make it work every time.

Instead of a process, it becomes a ritual.

Are the processes we are working through aiding the testing effort, or are they gestures?  Are they helping us understand the application, or is this the ritual we must go through to get green lights on all the test cases?

If it's the latter, would an incantation help?  Maybe something Latin-ish sounding, like in Harry Potter?

Monday, August 30, 2010

Learning and Teaching and Leading

One thing I learned early on when teaching drumming students, particularly beginners, is that the person who learns the most is often the teacher.

It never seems to matter whether the lesson is an individual or group lesson, focused on one style or on general drumming - teaching beginners forces the instructor to reconsider the things he simply does, find interesting foibles or potential weaknesses, and then correct or change them as needed for working with the student.

The interesting thing is that this reflection sometimes leads to profound understanding of what the student is learning and what the instructor is conveying.  When preparing for the odd lunch-and-learn or training session at the office I never really had that kind of an experience - or when presenting such sessions. 

On Improvement...

These last couple of weeks something interesting happened.  I've been preparing a presentation on Test Process Improvement for TesTrek in October.  I wasn't scheduled to present, or lead a workshop, but a couple of presenters had to cancel and, voila!  I'm on the presenters list.  Then a couple of other things caught my attention.

There have been several conversations on email lists I'm a participant in, as well as forums, on the dreaded M word.  Yes - Metrics.

On top of this, I had a remarkably revealing email conversation with Markus Gartner - an amazingly bright guy.  This came about because the questions I submitted for "Ask the Tester" arrived after the magic number of 10 had been reached.  However, they were forwarded to Markus, and that presented me the opportunity to learn and be reminded of things I once knew and had forgotten (or channelled off into a safe place in my memory).

My question to Markus centered on his take on "Test Process Improvement" in an Agile environment.  The bulk of his response was reasonably close to what I expected - in fact, reassuringly close to what I had prepared for the presentation - so my confidence in what I was saying increased dramatically.  (Yes, a little reassurance is sometimes a good thing, particularly when one is a very little fish hanging out with very big fish.)

He had one idea that I did not have.  And it left me gob-smacked.  Tacked onto an already interesting sentence about the organization's management, Markus said "... or they don't trust testing anymore."

On Trust...

I was immediately thrown back many years to when Developers were called Programmers and I was working as a COBOL Programmer on a large IBM mainframe.  I had a Manager who did not trust his staff.  Not because they were inexperienced, but because he simply did not trust them.  To this day, I do not know why that was the case.  I can surmise why, but it has little to do with the point.  Suffice to say, it was an unhappy work environment.

Markus made an interesting observation.  His point was that in Agile, the very purpose is to engender trust amongst all participants.  Additionally, when management is invited to observe the meetings, they can gain an understanding of what is being done by their staff, and as their understanding increases, so too should their level of trust.

When a group or a team has lost the trust of its management, the task of regaining that trust is nigh-on insurmountable.  Likewise, if a manager or lead has lost the trust of the group they are to lead or manage, the results will almost certainly be dire.

On Process...

Thus, when the call comes down for "better metrics" or "process improvement" or any number of other topics, what is the underlying message?  What is it that someone is hoping to gain?  Do they know?  CAN they know?  Are they guessing?

Much is debated around QUANTifiable and QUALifiable considerations, measurement and understanding.  I am not nearly bright enough to join into that fray fully-fledged. 

What I have seen, however, is that when Managers, Directors, VPs, EVPs, and big-bosses of all varieties are looking for something - nearly anything will suffice.  A depressing number of times, I have seen management groups flail around about what is wanted - then issue an edict announcing the new policy or practice or whatever it is.  These tend to roll out like clockwork, every three to six months.

Each company where I have worked that followed that practice engendered a huge amount of cynicism, resentment and distrust.  The sad thing is that these rather stodgy companies - including some that were quite small and prided themselves on having no Dilbert-esque Pointy-Haired-Boss behaviors - were wasting an amazing opportunity.

The first step to fixing a "problem" is figuring out what the problem is.  If there is no understanding of why policies or procedures are changing and no feedback loop on the purposes behind the changes, will the average rank-and-file worker stand up and ask, "What do you hope to change/improve/learn from this?"  At some companies - maybe.  But I have seen relatively few times where the combination of policy-du-jour and staff willing to stick their necks out and ask questions both exist in the same organization.

On Leadership...

What I have learned, instead, is to look at all sources of information.  Explain what the problem or perceived problem is.  Ask for input - then consider it fairly.  To do so is not a sign of weakness - it is a sign of strength: that the leadership of the organization has enough trust in its workers to approach them with a problem and work together toward a solution.

This, in my mind, is the essence of building a team. 

Throwing a bunch of people together without a unifying factor and expecting great things is silly in the extreme.  In the military, "Basic Training" serves this purpose - laying the groundwork to trust your comrades and follow the direction of officers and non-commissioned officers.  In the end though, the object is teamwork: learning to work together, using each person's strengths to offset others' weaknesses.

Why is it that so many managers miss this rather elementary point?  For a team to "work" they must learn to work together.  If the Lead or Manager has not built the group into one capable of working together, like a team, what, other than professional pride, will get any results at all? 

Although I cannot prove this in a scientific manner, as it were, I suspect that it is the essence of the problem mentioned above.  The question I do not know the answer to, although I suspect it, is the question of leadership in this instance.

Is it that they, the leaders, have no idea how to build a team?  Is it possible that the step of instructing the fledgling team and shaping it into the needed form was too challenging?  Could it be that in the process of doing so, their own closely held beliefs, habits and foibles were more dear than the building of a successful team?

If this basic lack is present, does it contribute to the selection of what is easy over what is right?

These are the ideas that have been floating through my mind while preparing the presentation and workshop lessons for the session at TesTrek.  If the master knows that he is but a beginner in the craft, what of those who consider themselves experts in all aspects of our trade?

Can this be at the root of the behaviours I've seen first hand and read about?  Are they feeling so insecure in their own abilities that they mistrust their own staff, the team they are charged with leading?  Is it to make up for this lack, they flounder and grasp for tips or magic revelations that will show them the "path?"  Is that why there is a continuing and perpetual drive for Metrics and Process Improvement?

Tuesday, July 6, 2010

Defining and Redefining Requirements

It's been crazy-busy at work. In addition, it's summer, which means lots of stuff to do in the garden and around the house. Time for other activities has been pretty rare. Sometimes, "other things" kick in.

Last Friday I mentioned I was a bit "under the weather." The worst part was the enforced physical idleness - not "not doing anything" idle, but not able to do the things I otherwise would do or needed to do. So, I've been catching up on my reading.

One book I've been reading is by Rebecca Staton-Reinstein, called Conventional Wisdom: How Today's Leaders Plan, Perform, and Progress Like the Founding Fathers. It's been an interesting read.

Vignettes from the (United States) Constitutional Convention give a framework for the lessons provided in contemporary case studies. The book is laid out by "Articles" - mimicking the US Constitution. That got me thinking about previous teams I've been a part of, and how unlike the Framers some of them functioned.

My blog post from Friday contained, well, remembrances from a past position. One thing from that was the Forced Best Practices environment. There were a fair number of people who wanted to do good work. There were others who tried to "follow the process" come what may. This created a potential for people to derail projects they wanted to fail, or to assert their authority and dominate the process, in spite of what those tasked with running the project wished to do. In short, the project leaders/managers for a fair number of the most productive projects found a way to bypass the "official" process and focus on what needed to be done.

One of the tactics among those intending to derail the process was to bemoan "thrash" or "continually revisiting things we already decided."

On the surface, this makes a great deal of sense. After all...

Time is money, and once a decision is made we must move forward because otherwise we're never going to make any progress. Once we have a direction we must move immediately. We must act. If we find we've acted wrongly, act again to correct it. They may well cite Ulysses S. Grant (who is eminently quotable, by the way) on always moving toward objectives and "retreating forward."

The problem is, as the framers of the Constitution knew, one decision informs another. The deliberations around one topic may shed light on other topics. If the deliberations shed light on a topic that was "settled," the framers considered it entirely reasonable to reconsider that topic and any other previously settled decision that may be impacted.

What an amazingly time-consuming process. Is it reasonable, in today's software development world, to see this as a viable approach? Can we really consider, or reconsider, requirements that were "defined" two hours ago? What about two weeks? Is two months out of the question?

When a software project runs into an "issue" with requirements - why is that? Did the scope change? Did the requirements change? Did the "users" change what they wanted? Or did the understanding of the requirements change?

Are there presumptions in place that "everyone" knew, but did not communicate? Did the understanding match among all participants?

I'm not a world-famous testing guru. I'm not a sought-after speaker for software conferences and conventions. I'm not a famous historian or student of history. I do software testing for a living. I've seen some really good projects and some that were absolute train-wrecks. Some of those can be categorized, as U. S. Grant put it, as cases where "errors were of judgement, not intent." Where do we lowly software testers fall?

My assertion is that requirements will almost always be revisited. Sometimes it will be in the "requirements discovery process"; other times it will be while the design is being worked out. Other times, while program code is being written, mismatches or conflicts may be found. Occasionally, software testing will find inconsistencies in requirements when they "put all the pieces together."

Each of these instances is a far more expensive exercise than taking the time to revisit requirements and discuss them fully. What precisely does a phrase mean? What is the intent of this? Most importantly: Do the requirements work as a whole? Do they define a complete entity?

Do they summarize what the project team is to address? Do they describe the business needs to be fulfilled? Does everyone share the vision that is needed to fill those needs?

Defining your requirements does not mean that you need to know how the team will meet them. That is what the design process is for. If you can define the needs - you can define the way to fill those needs.

Friday, July 2, 2010

Rhythm and Reality

I was a bit "under the weather" recently. I was thinking about a lot of things. One of them was rhythm.

I teach drumming: private lessons, group lessons, and workshops. Rhythm is a big deal for drummers. It's kind of what we do. So I went looking for how rhythm is defined by "non-drummers," and I went to the web.

I went to freedictionary.com and found this:
1. Movement or variation characterized by the regular recurrence or alternation of different quantities or conditions.
2. The patterned, recurring alternations of contrasting elements of sound or speech.
3. Music
a. The pattern of musical movement through time.
b. A specific kind of such a pattern, formed by a series of notes differing in duration and stress: a waltz rhythm.
c. A group of instruments supplying the rhythm in a band.
4.
a. The pattern or flow of sound created by the arrangement of stressed and unstressed syllables in accentual verse or of long and short syllables in quantitative verse.
b. The similar but less formal sequence of sounds in prose.
c. A specific kind of metrical pattern or flow.

You get the idea, no?

Repeated patterns. Repeated actions. Repeated behaviors.

Have you noticed a repetition in projects or test efforts? Things keep coming back - almost like deja vu?

If things are working and "the process" is cracking-on, there are no problems. Right? No worries - you've got it down.

If things are, well, not so good - then what? Are your projects sounding a bit repetitious? Are they following the same model or are there slight variations in them? Is the "this could have been better" project from a year ago a role model for current projects or a warning for how future projects may go? Are things muddling along or are they getting worse?

If they are getting worse, why? Did the "lessons learned" sessions from the previous projects get acted on? Have your "process changes" been carried through and embraced by all participants or by only some? If you're like me, you've worked in shops where that pretty well sums up the situation.

I remember one shop where I worked some time ago. They had beautiful documentation. Lots and lots of process. This happens - then this happens - then this happens. The Certified Project Managers pushed the Best Practices and Industry Standards to a fare-thee-well. They kept tight control over their progress meetings ("This is an update meeting, not a working meeting. If you need to discuss it, do so in a meeting other than this one.")

Projects were regularly train-wrecks.

All the participants made a great show of "following the process." Except they only went through the motions. It was an elaborate charade - they dutifully attended every meeting and carefully reported what was done and what needed to be done.

When the whole team really does what they say they do, magical things can happen. One cynical... not nice person... can completely derail the project.

What I learned there was to ignore the rules. When projects focused on what needed to be done, and forced the "process" to that end, amazing things could occur. When the "process" drove the steps needed, the project failed.

There was rhythm in both cases. It's just that one looked good on paper; the other looked messy - but it had a good beat and you could dance to it.