Monday, May 23, 2011

Where No One Has Gone Before: Exploratory Testing Lessons From Jean-Luc Picard

Great.  You started out with a plan - maybe you had scripts to follow, maybe just a nice neat outline - and before you know it you're off in the weeds.

Or worse, before you know it you've stumbled into some area that is completely uncharted and unknown.  It's as if you were navigating a sailing ship 500 years ago and realized you were smack in the middle of the part of the map marked "Here Be Dragons" or some OTHER undesirable slogan.

None of us ever intend to get that far "out there" - well, I don't normally anyway.  Most of the folks I've worked with don't normally get that far "out there" either.  Usually.  Unless we feel like - well - seeing what's out there.

When that happens you have a couple of choices.  You can punt and start over, writing this off as a weird anomaly.  Sometimes when this happens folks shrug and say something like "I don't know how this happened.  It must have been something I did wrong."  Frankly, I've said that once in a while as well.  Sometimes, in the course of retracing what I did, I find where I zigged when I should have zagged.  I'll make a note of it and return to intentionally follow that path after finishing what I intended to do.

The odd thing is that sometimes I find myself out in the weeds again, just as unexpectedly as the first time.  So, the choice we all face: try to work out what caused the event this time, OR see where we can get to from where we are right now.

I sometimes think of this as "X-Treme Exploratory Testing."  Instead of blasting our way through whatever we just ran into, we sometimes need to carefully unravel the threads that we have around us.

Do you remember the Star Trek TNG episode ("Cause and Effect") where the Enterprise picked up its own distress call, saw a massive explosion, and found a debris field that was their own ship?  It was kind of like Groundhog Day - they were caught in a "time-space continuum anomaly" where they repeated the same incident without knowing it.

As luck would have it, several of the recurring characters had a sense of déjà vu - around the time they picked up the distress call, as I remember it.  Then Data began saying things that were, well, un-Data-like.  So, they decided to try to send messages to themselves each time they repeated the loop, letting themselves know what they had tried - and they'd know whether it worked by whether or not they blew up again.  Cool, no?

What if you don't have an android - humanoid artificial life form, not the phone - to tell you what did not work?  What if you find yourself trying to repeat the same process and landing back in the weeds every time?

For me, the simplest tool is a legal pad and a pen - I make notes of what I've done.  See?  Who needs Lieutenant Commander Data when I have Ensign Note Pad?  ;)  Now, since I began doing this, all kinds of other cool tools have become available - Rapid Reporter is one.  Try it - I've read rave reviews, but I have not had the opportunity to put it through its paces myself.  I look forward to doing so in the near future.
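For the curious, the heart of such a tool is tiny.  Below is a minimal sketch in Python of a timestamped session-note logger in the spirit of Rapid Reporter - the file name, prompt, and behavior are my own inventions for illustration, not taken from that tool.

```python
from datetime import datetime

# Ensign Note Pad, software edition: append timestamped notes to a file
# while a test session runs.  A sketch only; the file name is made up.
def take_session_notes(session_file="session_notes.txt"):
    print("Type a note and press Enter; type 'quit' to end the session.")
    with open(session_file, "a") as f:
        while True:
            note = input("> ")
            if note.strip().lower() == "quit":
                break
            f.write(f"{datetime.now():%Y-%m-%d %H:%M:%S}  {note}\n")

if __name__ == "__main__":
    take_session_notes()
```

The timestamps are the point: when you find yourself in the weeds, the trail of notes tells you which zig or zag got you there.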

My point, such as it is: don't be afraid of the unknown.  We're TESTERS - that is what we do!  We find the unknown and make it known!  We head off into the weeds and chart a course out and back again.

If you stick to the path laid down in the script, you will have a fairly safe round of exercises.  It won't be testing - but it will be safely predictable and you will be able to show nice charts and pictures showing what is done and what is left to be done, and you may even find some bugs. 

It is when you head out to see what you can see that you really learn the product - the application and how it works.

It is your job to boldly go where no one has gone before.

Monday, May 16, 2011

Agile, or: You Keep Using That Word; I Do Not Think It Means What You Think It Means

It's funny.  Many of the more recent blog posts have come from ideas or thoughts or reactions to comments and discussion at the local tester group meetings.  I think there's a blog post in that alone, but this one is triggered by an idea I've had for some time.  Of course, it came together clearly during a lightning talk at the most recent meeting.

Yes, yet again the local testing group had gathered to discuss testing and eat pizza.  I don't know which I find so interesting: the collection of bright people sitting around munching on pizza, just talking - no slides, no formal agenda, just folks talking about testing - or the collection of minds engaged in thought on the same topic.

The Trigger

One of the presentations discussed "The Fundamental Flaw in Agile" and was based on the presenter's experience with Agile environments in software development shops.  Her premise, with which I can find no fault, was that most shops "doing Agile" make the same mistake most shops made with "Waterfall" and experience very similar results.  That is, they hold the belief that there is a single inerrant oracle for "user information" on software development projects.

Mind you, she is no slouch; she is extremely talented.  In fact, one statement she made was the key that let my mind pull things together, and that, in turn, led to this blog post.  You see, sometimes (like at conferences or presentations) I use Twitter to take notes.  Other times, I outline ideas, then add ideas around that outline, and that turns into a blog post.  Sometimes that blog post then turns into the foundation for a presentation or longer paper.

You see, I've worked with some really bright people in agile (small "a") environments.  I've also worked with some really bright people in Agile (big "A") environments.  And I've had the pleasure of working with some really bright people in Waterfall environments.

Some of the people in the first group (agile) are also in the third group (Waterfall).

Nah, Pete - you're kidding, right?  Everyone knows that Waterfall is not agile.

Really?

I'd argue that the way most people functioned when they called it "Waterfall" was anything but "agile."  It certainly had little to do with the Agile Manifesto.  Now, I have some theories around that, but they will wait for another time.

I might suggest that the ideas expressed in the Agile Manifesto were the extreme antithesis of how many folks "did Waterfall."  I certainly would suggest that the idea of using "Agile" to fix the software development practices of some shops is equivalent to the silver-bullet solution that gave us project managers and business analysts and other folks getting involved in software development with limited experience in the field themselves.

Now, an aside.  I do believe that some very talented people can help move a project along nicely.  They can be Project Managers.  They can be Business Analysts.  They can be Programmers and Testers and DBAs and on and on.  The interesting thing, to me, is that when I got into software development, the common title for the people doing the bulk of that work was "Programmer."  Anyone else remember when programmers were expected to sit down with business users or their representatives and discuss, in a knowledgeable way, how the software could help them do their work better?  Now, avoiding images of people getting excited and yelling "I'm a people person!" - why is it that we figure people who are good at technology stuff should be un-good with people stuff?  I don't know either.  But for now, let's leave that and consider it in another blog post.  OK?

Right.  Where was I?  Oh, yes.  Silver bullets. 

Many shops where I've seen people "doing Agile" seem curious to me, and I get curious about them in turn.  I ask questions and get answers like "No.  We're Agile, so we don't need documentation."  A close second is "We're Agile, so we don't need to do regression testing."  The third most common is something like "We're Agile, so we don't track defects..." (up to this point, no worries; the worries normally come after) "...because we don't do documentation."

Thus the thought that pops into my mind...

"I do not think it means what you think it means." 

Now, I'm not the sharpest knife in the drawer.  I make a lot of mistakes and I have said some really un-smart things in my time.  Having said that, I sometimes hear folks selling "Agile" to people where neither the person selling nor the potential customer/client has a more clearly formed idea of what "Agile" means than I do.  I mean, come ON!

Listen to what you are saying!  "Oh, you have communication problems! That is because you use Waterfall!  Agile fixes that!  You have customers not getting what they need! That is because you use Waterfall!  Agile fixes that too!"  And on and on and on...

Sorry.  Got excited there for a moment.

Here's what I'm getting at.  There are some really smart people who firmly believe that Agile methodologies are fantastic.  I think there is a lot to recommend them.  Really, I do.  I can agree with everything listed in the Agile Manifesto - Really! 

I disagree with the way some people interpret Agile.  Why?  Because they are missing the point.  In my mind, the entire purpose - including dropping the stuff that is not needed, that does not move the project forward - boils down to one thing: Simplify Communication.

By that I mean exactly that - help people communicate better by breaking down the barriers that get put in the way by process or by culture or by evil piskies.

It seems to me that this is the greatest flaw in "Agile."

Without good communication, Agile projects will fail.  Full stop.  If you do not have good communication, nothing else matters. 

When you replace one set of burdensome processes with another and wrap it in the banner of "Agile" have you really made it better?  Really?  Is the process the key?  Really? 

Do me a favor and grab a dictionary and look up the word "agile."  Go ahead, I'll wait.

OK, you're back?  I bet you found something like this...

Adjective: Characterized by quickness, lightness, and ease of movement; nimble.

Wait.  Did you look up "Agile Development" or "agile"?  Yeah, consider what the word means - not the methodology, but the word.

Now.  Someone please explain to me how folks demanding that something be done because "that's what you do when you're Agile" is really agile.  If they are following form over function - doing something by rote, without explaining to the rest of the team why it is important (I understand that each Scrum Master, or whatever the "leader" is called, needs some leeway in approach) - then will the team see any more value in this than in the "evil" methods of "Waterfall"?

Then again, in my experience, what is the difference between teams that were successful and those that were unsuccessful in Waterfall?  Communication. 

Saturday, May 14, 2011

Incomplete Complete Testing

In March, the local testing group got together to eat pizza and talk about testing.  We tend to get together each month and discuss some aspect of testing while eating pizza.  This time we had a fun meeting where my boss and I gave a "preview" of a presentation on starting testing groups that we were slated to give at STPCon in Nashville later that month.  We had a decent-sized turnout and a lively discussion.

One portion stuck out to everyone.  There was an animated discussion around whether the efforts of a testing group could result in "complete" testing.  This discussion was the result of a seemingly simple question: "Can you really have complete testing of an application?"  It took almost no time for us to realize we had a topic for the April meeting.

The challenge was sent out: all who were interested were to bring the "proof" each had cited, with their arguments well considered, to the April meeting.  After indulging in yet more pizza and an introduction/ice-breaker exercise, we settled down to business.

The core question revolved around what is "complete" and what is "testing."  Could we agree on the terms?  It seems simple enough, no?  Have you ever tried to get a dozen or so people with different backgrounds, training, and experience - some testers, some designers, some programmers - to agree on something so simple?  This actually took longer than I expected.  Testing is more than "unit" testing.  Testing is more than making sure things work.  Yes?  Well, maybe.  With a bit of discussion, we succeeded in reaching an understanding we could work with: that testing involves more than what many of us individually thought before the discussion, and also involves aspects that others had not considered.

The interesting part of the conversation was around the idea of "proof" that complete testing was not only possible but could reasonably be done.  With some discussion around what constituted "proof," a realization dawned on most people: a conceptual "proof" (think of a theorem from math class in high school) leaves an awful lot of wiggle room.

You see, in certain limited circumstances it may be possible to test every possible combination of everything impacting the system; it may be possible to cover the full range of potential valid and invalid input data; it may be possible to exercise all possible paths within the code; and it may even be possible to exercise the full range of potential loops and paths for each possible combination of paths executed within the system.

And then there is the reality of it.  Can you really do all of that?  Can you really do any of that?  Really?

How small is the system you're testing? 

The probability of those things and the costs associated with them is the issue.  Really. 

You may be able to cover some things.  But all?  Really?

You see, an awful lot of systems have fairly complex input data structures.  Lots of potential valid input values, and lots more potential invalid values.  If you commit to "complete" testing, will you really test all of them?  Then there's Doug Hoffman's example of a routine calculating a square root.  Simple, eh?  Something about floating point and five significant digits and 32-bit unsigned integers - and if you need to be sure the routine is right, how do you do that?

I mean, it's four billion possible values, right?  (C'mon - say that like Doctor Evil with the little finger pointed out.  Four Bil-le-on...)  Can you test it?  It depends, right?  What kind of machine are you running on?  An XT clone?  A supercomputer?  Makes a difference, no?  Well, on one it might be completely impossible.  On another, it might take 10 minutes and show that the formula works for every possible input value - except for two.
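Here's a minimal sketch of the idea in Python - not Hoffman's actual harness, just an illustration under my own assumptions, using math.isqrt as the trusted oracle and a deliberately naive routine as the victim.  At the full 32 bits this loop is feasible on fast or parallel hardware but glacial in pure Python, which is rather the point:

```python
import math

def exhaustive_sqrt_check(sqrt_under_test, bits=32):
    """Compare an integer square-root routine against math.isqrt
    (the oracle) for EVERY unsigned value of the given width.
    At bits=32 that is ~4.3 billion iterations; drop it to 16
    to experiment on ordinary hardware."""
    failures = []
    for n in range(2 ** bits):
        if sqrt_under_test(n) != math.isqrt(n):
            failures.append(n)
    return failures

def naive_sqrt(n):
    # A plausible-but-wrong shortcut: round() reaches up to the
    # next root too eagerly (for n=3 it gives 2 instead of 1).
    return round(n ** 0.5)

print(exhaustive_sqrt_check(naive_sqrt, bits=16)[:6])
# -> [3, 7, 8, 13, 14, 15]: failures found only by checking them ALL
```

A sampled test that happened to skip those values would pass; only the exhaustive sweep - where it's affordable at all - catches every last one.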

Then again, there's the question of the environment itself.  If you're running in a Windows environment, what is all that stuff running in the background, anyway?  What happens if some of that stuff is not running - does it make a difference?  How do you know?  Are you certain?

Without knowing, how can you possibly say that you can test all the environmental configuration combinations?  Can you test everything?  If not, can you really say you can completely test your system? 
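To put a rough number on it, here is a back-of-the-envelope sketch in Python, where every count is a made-up assumption purely for illustration, showing how fast an environment matrix explodes:

```python
from math import prod

# Hypothetical counts for a Windows test-environment matrix.
# All of these numbers are invented for illustration.
dimensions = {
    "OS version": 4,
    "service pack / patch level": 3,
    "browser installed": 5,
    "antivirus product": 6,
    "optional background services (10, each on/off)": 2 ** 10,
}

total = prod(dimensions.values())
print(f"{total:,} distinct environments")  # 368,640 of them
```

That's 368,640 environments before we touch a single input value of the application itself - and a real matrix is rarely that small.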

So, you see where I'm going.  And that is kind of where the conversation went at the meeting. 

Can you test your systems completely?  Really?  Completely?

Monday, May 2, 2011

What Looks Wrong May Not Be What's Wrong: Testing Lessons From House, M.D.

So a few weeks ago, I was trying to figure something out.  I was getting weird error messages, strange results, and generally odd behaviors when trying to run a test.  The odd part was that the behaviors shown, the error messages, and the data stored in the DB simply did not, well, match.

There was no way those pieces could all go together.  What could be going on?  What could be causing these really unusual symptoms? 

Now, I don't know what anyone else does in these situations.  I know that the method I use for tracking down problems like this can best be described as "Housian" - well, maybe "House-ian."

Yes, I turn to that ever-so-patient and thoughtful, gentle-tempered example for all testers, the title character of "House, M.D.", Gregory House.  The cuddly, lovable, always-cheerful soul dispensing folksy wisdom in a gentle, kind way.

OK - If you ever watched the show, you know I'm, well, fibbing a bit.  He isn't most of those things, usually. 

What he is, is methodical.  He may not be the gentlest soul, but there is something about him.  You see, sometimes the symptoms you observe simply can't co-exist - I think of it as two objects not being able to share the same space; I sort of learned that once upon a time.  The core question is what you are going to do about it.  How are you going to find the real problem that is triggering the first of the observable problems?

House uses a team of people to bounce competing ideas against.  That can work in some circumstances - like when you have a short time to figure out what is wrong with a critically ill patient on television and a commercial break is coming up.  What I do like about the approach is that it keeps ideas flowing.

Sometimes I find myself looking at a symptom or two and trying to find a relationship.  Is one causing the other?  Can the "real" problem be something less than obvious?  Maybe another set of problems is masking the real issue?  Maybe the real issue has nothing at all to do with what I am seeing? 

Yeah, I know.  It kinda depends on what the symptoms are, right? 

So, with this particular set of problems, I found myself thinking, "What would House do?"  Other than belittling all the ideas from the Greek chorus of supporting doctors, a common tactic of his is to focus on something he can control - one symptom, or set of symptoms - and carry on with that.  Maybe I'd find something related to the core problem.

So, I tried that.  Nope.  Nothing appeared to make sense.  Back to square one.

Then something dawned on me.  Maybe the clue to the real cause was being masked in the flood of stuff in the logs.  Maybe I was ignoring the one piece of information I needed.

So I took a look at the logs again.  (Folks who watch the show will see this as the "Hey, look at this little spot in this easily ignored place on the patient's body" moment.)

Bingo - there it was.  The one little error message in the midst of spectacular dumps of... stuff.  There was the clue I was ignoring all along. 
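In hindsight, the by-eye scan I did amounts to something like this little Python filter - a rough sketch only; the log file name and the "interesting" patterns are my own assumptions, not from the actual project:

```python
import re

# Skim a noisy log and keep only lines that look like trouble,
# plus a little context around each hit.  Patterns and file name
# are invented for illustration.
INTERESTING = re.compile(r"ERROR|FATAL|Exception|stack trace", re.IGNORECASE)

with open("app_server.log") as log:
    lines = log.readlines()

for i, line in enumerate(lines):
    if INTERESTING.search(line):
        # show the hit with two lines before and after for context
        print("".join(lines[max(0, i - 2): i + 3]).rstrip())
        print("-" * 40)
```

Anything to shrink "spectacular dumps of... stuff" down to the handful of lines worth staring at.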

Thanks, Doctor House.  We were able to save the patient.  Umm, project. 

Sunday, May 1, 2011

Happy Anniversary, Pete

One year ago today I posted my first blog entry in my "new" blog on software testing. 

As a result of the blog and tweeting about it, I've had the pleasure of meeting, in person and cyberly, a vast number of remarkably bright, talented, and interesting people.

No one knows for certain what the future holds.  This last year has not progressed at all as I expected.  Well, that may be too strong.  Some things went as I hoped, and others left me absolutely gob-smacked.

For those who have shared this portion of the journey with me: "Thank you."  The comments on the blog, by Twitter, by email, and in person are encouraging and very much appreciated.

Now, let us see what the coming year holds for us all.