Showing posts with label Best Practices. Show all posts

Thursday, July 5, 2012

On Best Practices, Or, What Marketing Taught Me About Buzzwords

An eon or two ago, whilst working toward my Bachelor's Degree, I had a way interesting professor.  He was perfectly deadpan and recognized people's faces and names - but never the two together.  Crazy smart though.

He had multiple PhDs.  Before deciding he would enter academia (PhD #2), he worked as a geologist searching out potential oil deposits (PhD #1) for a large multinational oil company.  It was when he got tired of "making a sinful amount of money" (his words) he decided he should try something else.

He had an interesting lecture series on words and terms and what they mean.  One subset came back to me recently.  I was cleaning out my home office - going through stuff that at one point I thought I might need.  In one corner, in a box, was a collection of stuff that included the textbook and lecture notes from that class.  I started flipping through them, smiling at the long-ago youthful optimism I had recorded there.

One thing jumped out at me as I perused the notes of a young man less than half my current age - a two-lecture discourse on the word "best" when used in advertising.

Some of this I remembered after all these years.  Some came roaring back to me.  Some made me think "What?" 

X is the Best money can buy.

Now, I've noticed a decline in TV advertising that makes claims like this: "Our product is the best product on the market.  Why pay more when you can have the best?"  Granted, I don't watch a whole lot of TV anymore.  Not that I did then either - no time for it.  (Now, I have no patience for most of it.)

Those ads could be running on shows I don't watch.  This is entirely possible.  Perhaps some of you have seen these types of ads. 

Anyway, part of one lecture was a discussion on how so many competing products in the same market segment could possibly all claim to be the best: toothpaste, fabric softener, laundry detergent, dish detergent, soft drink, coffee, tea... whatever.  All of them were "the best."

The way this professor worked his lectures was kind of fun.  He'd get people talking, debating ideas, throwing things out and ripping the ideas apart as to why that logic was flawed or something.  He'd start with a general statement on the topic, then put up a couple of straw-men to get things going.  (I try and use the same general approach, when I can, when presenting.  It informs everyone, including the presenter.) 

The debate would rage on until he reeled everyone in and gave a summary of what was expressed.  He'd comment on what he thought was good and not so good.  Then he'd present his view and let the debate rage again.

I smiled as I read through the notes I made - and the comments he gave on the points raised.

Here, in short, is what I learned about the word "Best" in advertising: Best is a statement of equivalence.  If a product performs the function it was intended to do, and all other products do the same, one and all can claim to be "the best."

However, if an ad claimed a product was "better" than the competition, the advertiser needed to be able to provide proof it was better or withdraw the ad.

So Best Practices?

Does the same apply to that blessed, sanctified and revered phrase "Best Practice?"   

Ah, there be dragons! 

The proponents of said practices will defend them with things like, "These practices, as collected, are the best available.  So, they are Best Practices."  Others might say things like, "There are a variety of best practices.  You must use the best practice that fits the context."

What?  What are you saying?  What does that mean?

I've thought about this off and on for some time.  Then, I came across the notes from that class.

Ah-HA!  Eureka!  Zounds!  

Zounds?  Really? Yes, Zounds!  Aside from being the second archaic word I've used in the post, it does rather fit.  (I'll wait while you look up the definition if you like.)

OK, so now that we're back, consider this: these words may look and sound like perfectly ordinary words in the English language, but they do not mean what one might think they mean.

Just as X toothpaste and Y toothpaste cannot both be the best - how can you have TWO best items?  Unless, of course, "best" is meant as a statement of equivalence, not superiority.

Then it makes sense.  Then I can understand what the meaning is.

The meaning is simple: Best Practices are Practices that may, or may not work in a given situation.

Therefore, they are merely practices.  Stuff that is done.

Fine.

Now find something better.





Sunday, March 4, 2012

Process and Ritual and Testing, Oh My.

I've been having some interesting conversations lately.  Well, I have a lot of interesting conversations, so that is not so unusual.  The interesting thing about these is that they have been, well, interesting. 

Not interesting in the way that some conversations on some online testing forums are interesting.  Not interesting the way that some conversations in groups on LinkedIn are interesting (you know the ones - where someone posts a question to get folks to start to answer, then the person posting the question shows how smart they are and gives the "right" answer...).

These conversations were around "Process" and "Best Practices" and things of that ilk.  Now, most of you who know me will realize that I take a dim view of 99.99999% of the "practices" that are labeled as "best."  I concede that in some situations, there may be something I am not aware of that can be considered a "best practice" - in the literal definition, not the buzz-wordy definition.

Where was I?  Ah, yes.

These conversations were debating/arguing/asserting/rejecting the need for control and repeatability and measurability in software testing.  What I apparently failed to comprehend was that sometimes these things must be done in order to make sure the product is of "good quality."  I kept asking "How is it that this practice ensures the quality of the product?  Is there another practice that could give you the same results?"

The answer reminded me of a passage from a pulp-fantasy-fiction book series I read a long time ago.  You see, there was this particular race of dwarves who weren't terribly bright.  One of them found a secret passage.  At the time she found it (yes, there are female dwarves), she was carrying a dead rat (seemingly for supper) and triggered the locking mechanism by accident.  This opened the door that let her take this "secret short-cut."

Well, she was making the main characters in the book take an oath that they would never divulge the magic of the passage.  One of them mentioned the trigger, which he had noticed; she insisted it was magic.  She pulled out the dead rat, waved it in front of the door - then stepped on the trigger.  POOF!  The door opened!

In her mind, the ritual of waving the dead rat then stepping just so on the floor was what opened the door.  The others (outside observers) noticed what the real cause for the door opening was. 

Because it did them no harm to allow her to hold on to her certainty, they let it go.

Now, in software testing, we sometimes find ourselves in the situation of the not-too-bright female dwarf.  It worked this way the first time, therefore, this is the one true way to make it work every time.

Instead of a process, it becomes a ritual.

Are the processes we are working through aiding the testing effort, or are they gestures?  Are they helping us understand the application, or is this the ritual we must go through to get green-lights on all the test cases?

If it's the latter, would an incantation help?  Maybe something Latin-ish sounding, like in Harry Potter?
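The dead-rat ritual translates to test code rather neatly.  Here is a minimal sketch in Python - the function and test names are entirely invented for this post, not from any real codebase - contrasting a test that merely re-enacts the ritual with one that probes for the actual cause:

```python
def open_secret_door(pressure_plate_pressed, rat_waved=False):
    # The door only cares about the pressure plate; the rat is theater.
    return pressure_plate_pressed


def test_ritual():
    # Ritual: wave the rat, step on the plate, watch the door open,
    # and credit the rat.  The assertion passes - a green light -
    # but it proves nothing about what actually opened the door.
    assert open_secret_door(pressure_plate_pressed=True, rat_waved=True)


def test_cause():
    # Process: vary one factor at a time to learn what the trigger is.
    assert open_secret_door(pressure_plate_pressed=True, rat_waved=False)
    assert not open_secret_door(pressure_plate_pressed=False, rat_waved=True)


if __name__ == "__main__":
    test_ritual()
    test_cause()
    print("all green")
```

Both tests go green, but only the second one would ever reveal that the rat is decorative.  A suite full of `test_ritual`-style checks is gesture, not investigation.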

Tuesday, October 12, 2010

Improving Test Processes, Part IV, or The TPI Secret of Secrets

So far, I've rambled about Improving Processes.  In Part I, I wrote about how we may recognize there's a problem, but may not be sure what the problem is.  In Part II, I wrote about the problem of introspection and how hard it can be to see outside ourselves and look at how we really are.  In Part III, I wrote about Don Quixote and the unattainable goal of Process when the Charter and Mission are in disarray.

The simple fact is, each of these things plays a part in what makes up Test Process Improvement.

Now for the secret: TPI is not the point.  TPI is not the goal.

In the end, TPI doesn't really matter except as a means to the REAL goal. 

The REAL goal is this:  Better Value from your Testing Effort.

The thing is, most humans don't think in a clear fashion.  I know I don't think in a way that can be described as linear in any way, shape or form.  That is particularly true when I'm working on a problem.  If I DID, I would long ago have stopped looking into something I was testing because it did not feel right, even though there was nothing on the surface to indicate there was a problem.  When I have one of those feelings, I sometimes will go over what I have for notes, look at the logs from the applications (not the nicely formatted versions, but the raw logs) or poke around in the database.  Sometimes it's nothing.  Sometimes, I sit back and think "Well, look at that.  Where did that come from?"  (Actually, I sometimes say that out loud.)

That is the pay-off for me as a tester.  I found something with a strong likelihood of causing grief for the users/customers which will in turn cause grief for my company. 

I don't know how to describe that in a linear fashion.  I wish I did - I'd probably be able to make a pile of money from it and live comfortably for the rest of my life from the earnings.  The fact is, it's too organic - organic in the sense that Jerry Weinberg used the term the first time I encountered it in this context (Becoming a Technical Leader), not in the Chemistry organic/carbon-based context.

The Test Script (and its companion, the formal Test Process Document) is not the Test.  The Test is the part that is done by the M1-A1 Human Brain.  Using that most powerful tool is the key to gaining value from testing - or improving the value you are currently getting. 

You can have the best Process in the World of Software.  You can have the best Charter and Mission statements.  You can have the best tools money can buy.

Without encouraging your people to think when they are working, and rewarding them when they do it creatively and do it well, none of those other things matter.

Sunday, August 29, 2010

Music, or Testerman's Ramble

At one point in my life I played in a band that performed a variety of Irish traditional and folk music.  We also played a fair amount of Scottish traditional and folk as well, however, it seems if you play or sing a single Irish song, you are labelled "Irish" and you'll be crazy-busy in March, and pretty slow the rest of the year.  Unless you work really hard and play reasonably well.

So a side-effect of playing in a band that performs this stuff is, when you get good enough for people to pay you money to go to their towns, cities, festivals, whatever, you will run into other folks who play the same type of music.  When schedules permit, this often devolves into a session / seisiún / wild-music-playing party.  There are certain protocols that most folks follow in these events - and the fantastic thing is that usually the level of play is quite good.  Tempos are snappy, so reels drive forward and hornpipes can be lilty (and tend to run around Warp 9) and jigs are of a nature where feet rarely touch the ground.

Now, these uber-sessions are not so different than more traditional ones held in houses or coffee-shops or bars or clubs.  The big difference is the recognition that there are no light-weight players and everyone has mastered their craft.  This is not always the case at other sessions. 

I have been out of performing trad/folk music for several years now, and in the last year began attending some of the local sessions, just to get my feet wet.  I was a bit rusty on bodhran, the Irish hand frame drum, which I had played for 20 years on stage and in sessions.  My limited ability on penny whistle was nigh-on vanished - I remembered tunes and could call phrases from my memory to my fingertips, but I'm effectively starting over.  With a crazy work and home schedule it has been hard to find time to practice, let alone become "street legal" on whistle.

So, I show up at the Sunday night sessions and play a couple of tunes on whistle when they come up.  I will also play the bodhran a bit, depending on the number of people there (it does not take many drums to become "too many" for the melody instruments - whistles, mandolins, fiddles, flutes, dulcimers and the like.) 

This last Sunday there were a fair number of players.  There were 8 or 9 "melody" players, a couple of guitars, a tenor-banjo player, who played melody when he knew the tune and vamped when he did not - and me on drum (with the occasional contribution of bones.)  Some of the players are quite experienced and I have seen them around for many years.  Some are between beginner and novice.  Some are "in between" levels of experience.

One tune in particular would have made me stop the band, if it was a "band" that was playing, and have them start again.  That typically isn't done in sessions - so I did the "drummer equivalent" and simply stopped playing.  One of the mandolin players, who knew me and has also been around the block, gave a little smile and stopped as well.  We were treated to a rare sight of 6 people who were absolutely certain what the "correct" tempo was for the tune that was being played - and none of them would give an inch - or a click on the metronome.  The guitar players seemed to play along with whichever melody instrument was close to them, and generally the best description was "trainwreck."

That reminded me of a project I had worked on some time ago.  I was not on the project originally, but was brought in as part of a desperation move to fix it.  As with the tune on Sunday, each of the participants knew what the right thing to do was.  The problem was none of them agreed on what that thing was.  "Blood on the Green" was an apt summation of that effort.  The programmers were berated for not following instructions - but how do you follow instructions when there are multiple, conflicting sets of instructions?

Because of the "political nature" of the project, no managers or directors were willing to step up and take on some form of leadership role for fear that there would be repercussions for doing so.  The PM, BA and Dev Lead floundered without some form of direction from their respective management teams.  Information was contradictory at best. 

In the end, a Director put his foot down, asserted control and forced the issue.  Me being added to the project was part of forcing the issue.  Until that point, the uncertainty of the leadership was sapping the ability of the project group to operate as an effective team.  Like the music session last week, no one had a clear picture as to what was "right" and where the center of gravity was. 

People can urge "Best Practices," "Standards," "Process" and "Metrics" all they want.  In some contexts, that may be the right thing.  However, without a clear understanding of the intent of the effort, nothing will save the project.  Ulysses S. Grant, that prescient Software Oracle (well, American General turned President), warned that indecision was worse than a wrong decision.  Wrong decisions could be countered by "right" decisions, but no decision from leadership leaves your group floundering, looking for a center.

Tuesday, July 6, 2010

Defining and Redefining Requirements

It's been crazy-busy at work. In addition, it's summer, which means lots of stuff to do in the garden and around the house. Time for other activities has been pretty rare. Sometimes, "other things" kick in.

Last Friday I mentioned I was a bit "under the weather." The worst part was the enforced physical idleness - not "not doing anything" idle, but not able to do the things I otherwise would do or needed to do. So, I've been catching up on my reading.

One book I've been reading is by Rebecca Staton-Reinstein, called Conventional Wisdom: How Today's Leaders Plan, Perform, and Progress Like the Founding Fathers. It's been an interesting read.

Vignettes from the (United States) Constitutional Convention give a framework for the lessons provided in contemporary case studies. The book is laid out by "Articles" - mimicking the US Constitution. That got me thinking about previous teams I've been a part of, and how un-like the Framers some of them functioned.

My blog post from Friday contained, well, remembrances from a past position. One thing from that was the Forced Best Practices environment. There were a fair number of people who wanted to do good work. There were others who tried to "follow the process" come-what-may. This created a potential for people to derail projects they wanted to fail, or to assert their authority and dominate the process, in spite of what those tasked with running the project wished to do. In short, the project leaders/managers for a fair number of the most productive projects found a way to bypass the "official" process and focus on what needed to be done.

One of the tactics among those intending to derail the process was to bemoan "thrash" or "continually revisiting things we already decided."

On the surface, this makes a great deal of sense. After all...

Time is money, and once a decision is made we must move forward, because otherwise we're never going to make any progress. Once we have a direction we must move immediately. We must act. If we find we've acted wrongly, act again to correct it. They may well cite Ulysses S. Grant (who is eminently quotable, by the way) on always moving toward objectives and "retreating forward."

The problem is, as the framers of the Constitution knew, one decision informs another. The deliberations around one topic may shed light on other topics. If the deliberations shed light on a topic that was "settled," the framers considered it entirely reasonable to reconsider that topic and any other previously settled decision that may be impacted.

What an amazingly time-consuming process. Is this a reasonable approach in today's software development process? Can we really consider, or reconsider, requirements that were "defined" two hours ago? What about two weeks? Is two months out of the question?

When a software project runs into an "issue" with requirements - why is that? Did the scope change? Did the requirements change? Did the "users" change what they wanted? Or did the understanding of the requirements change?

Are there presumptions in place that "everyone" knew, but did not communicate? Did the understanding match among all participants?

I'm not a world famous testing guru. I'm not a sought-after speaker for software conferences and conventions. I'm not a famous historian or student of history. I do software testing for a living. I've seen some really good projects and some that were absolute trainwrecks. Some of those can be categorized as U. S. Grant did: "errors were of judgement, not intent." Where do we lowly software testers fall?

My assertion is, requirements will almost always be revisited. Sometimes it will be in the "requirements discovery process"; other times it will be while the design is being worked on. Other times, while program code is being written, mismatches or conflicts may be found. Occasionally, software testing will find inconsistencies in requirements when they "put all the pieces together."

Each of these instances is a far more expensive exercise than taking the time to revisit requirements and discuss them fully. What precisely does a phrase mean? What is the intent of this? Most importantly: Do the requirements work as a whole? Do they define a complete entity?

Do they summarize what the project team is to address? Do they describe the business needs to be fulfilled? Does everyone share the vision that is needed to fill those needs?

Defining your requirements does not mean that you need to know how the team will meet them. That is what the design process is for. If you can define the needs - you can define the way to fill those needs.

Friday, July 2, 2010

Rhythm and Reality

I was a bit "under the weather" recently. I was thinking about a lot of things. One of them was rhythm.

I teach drumming: private lessons, group lessons and workshops. Rhythm is a big deal for drummers. It's kind of what we do. So, I went looking for how rhythm is defined by "non-drummers." So I went to the web.

I went to freedictionary.com and found this:
1. Movement or variation characterized by the regular recurrence or alternation of different quantities or conditions.
2. The patterned, recurring alternations of contrasting elements of sound or speech.
3. Music
a. The pattern of musical movement through time.
b. A specific kind of such a pattern, formed by a series of notes differing in duration and stress: a waltz rhythm.
c. A group of instruments supplying the rhythm in a band.
4.
a. The pattern or flow of sound created by the arrangement of stressed and unstressed syllables in accentual verse or of long and short syllables in quantitative verse.
b. The similar but less formal sequence of sounds in prose.
c. A specific kind of metrical pattern or flow.

You get the idea, no?

Repeated patterns. Repeated actions. Repeated behaviors.

Have you noticed a repetition in projects or test efforts? Things keep coming back - almost like deja vu?

If things are working and "the process" is cracking-on, there are no problems. Right? No worries - you've got it down.

If things are, not-so-good - then what? Are your projects sounding a bit repetitious? Are they following the same model or are there slight variations in them? Is the "this could have been better" project from a year ago a role model of current projects or a warning for how future projects may go? Are things muddling along or are they getting worse?

If they are getting worse, why? Did the "lessons learned" sessions from the previous projects get acted on? Have your "process changes" been carried through and embraced by all participants or by only some? If you're like me, you've worked in shops where that pretty well sums up the situation.

I remember one shop where I worked some time ago. They had beautiful documentation. Lots and lots of process. This happens - then this happens - then this happens. The Certified Project Managers pushed the Best Practices and Industry Standards to a fare-thee-well. They kept tight control over their progress meetings ("This is an update meeting, not a working meeting. If you need to discuss it, do so in a meeting other than this one.")

Projects were regularly train-wrecks.

All the participants made a great show of "following the process." Except they only went through the motions. It was an elaborate charade - they dutifully attended every meeting and carefully reported what was done and what needed to be done.

When the whole team really does what they say they do, magical things can happen. One cynical... not nice person... can completely derail the project.

What I learned there was to ignore the rules. When projects focused on what needed to be done, and forced the "process" to that end, amazing things could occur. When the "process" drove the steps needed, the project failed.

There was rhythm in both cases. It's just that one looked good on paper, and the other looked messy - but it had a good beat and you could dance to it.