Friday, June 24, 2011

CAST 2011 Emerging Topics Track and Deadline

No odd philosophical ramblings in this post.  No questioning definitions or other people's dearly held beliefs.  Still, I'm getting pretty excited.

The Conference of the Association for Software Testing (CAST) is coming up REALLY fast - August 8 through 10 in Seattle, to be precise. Part of the fun this year is an "Emerging Topics" track - 20-minute sessions that anyone already attending the conference can submit a proposal to give.

Yeah, anyone who is registered and who wants to give a short talk on something they do not see covered in the balance of the program can do so. 

Here's the catch: in order to make a schedule and give everyone attending the opportunity to review the abstracts for the Emerging Topics sessions they might be interested in, we will need to cut off entries, and voting on them, on July 1, 2011.

So, if you are attending CAST and want to submit a proposal, or review and vote on proposals in the Emerging Topics track, please drop an email to me or Matt Heusser.  We'll get you set up with access to see (and create) them.

UPDATE!
July 20, 2011 -
The Submission and Voting period is now closed; selections and a schedule have been made. This has been an interesting, thought-provoking and fun project to work on. I am looking forward to meeting everyone in person after communicating by email and telephone.

Thanks to all who participated - Pete





Thursday, June 23, 2011

Managing and Controlling or One of These Things is Not Like the Other

There are some interesting threads on various internet discussion forums.  Some are interesting as in "this is a thought-provoking conversation with a lot of good ideas."  Some are more along the lines of "this is a very odd series of disjointed thoughts where people cannot even agree on what they disagree on."

One was interesting in the "What is this guy talking about?" sort of way.

His assertion was that Exploratory Testing was fine for small groups of one or two testers.  However, it was unsuitable for larger or regulated environments because testing could not be controlled.  He also suggested that Exploratory Testing was not as thorough as fully scripted testing because you did not need to think about it before you did it. 

Take a deep breath, Pete.  Can we start with this, "What is the difference between Controlling and Managing?"  His response was "None. They are the same thing."

Oh dear, oh my, oh dear.

Let's see.  Being too lazy to get out one of my physical dictionaries, I turned to Google and searched for "control definition", "manage definition", "controlling definition" and "managing definition".  I very scientifically grabbed the top search results (that were not paid advertisements) and found this...

Control: -verb
1. to exercise restraint or direction over; dominate; command.
2. to hold in check; curb.
      OR...
Control: -noun
1. the act or power of controlling; regulation; domination or command.
2. the situation of being under the regulation, domination, or command of another.

Then there is this...

Manage: -verb (used with object)
1. to bring about or succeed in accomplishing, sometimes despite difficulty or hardship.
2. to take charge or care of.
      OR...
Manage: -verb (used without object)
1. to conduct business, commercial affairs, etc.; be in charge.
2. to continue to function, progress, or succeed, usually despite hardship or difficulty; get along.

Now that we have looked at the roots, let us look at the -ing words in question.

Controlling: -adjective
1. inclined to control others' behavior; domineering
       And...
Managing:  -adjective
1. having executive or supervisory control or authority

As I sit here, I think a bit on the interesting idea that Manage and Control are the same thing.  Based on these definitions, I find it hard to take seriously the assertion that they mean the same thing.  Having said that, I know certain boss types who firmly believe it.  Me, I'm far too liberal (at least in the traditional, apolitical sense of the word) to agree.

Liberal:  -adjective
1. open to new behavior or opinions
2. favorable to or respectful of individual rights and freedoms

Now, if you want to exercise restraint or direction over your people (whom I suspect you refer to as "resources") or to dominate your staff or to hold them in check, have a great time.  Your staff probably won't. 

Oh, I won't be part of that game, either.

Now, if you want to be in charge and guide and supervise your staff, no worries from me.  I'd be happy to discuss exactly what that means to you and I'd also be interested in knowing how your people perceive your style of management.

Now, to be sure, there is some overlap in some of the words.  If the intent of "Control" follows the definitions I found, I am simply not interested.  If the intent of "Manage" follows the definitions I found, I may be interested and would be willing to talk about it.  Having said that, if your use of "manage" really means "control" - I'm not going to play along.

Managing and Controlling are far from the same concept.  If you want to be a Manager, consider just what the differences are.

Wednesday, June 22, 2011

Back in My Day: Confessions of a Curmudgeon

When I first got into software for a living, the idea of "structured" anything was the red-hot burning idea that was going to save software from the horrible/bad/evil practices of people who were inept/wrong-thinking/clue-less practitioners of hocus-pocus.  Structure was going to save us.  Then it was CAD.  Then it was Object Orientation.  Then it was blah, blah.  You get the idea.

I heard some folks talking about some "New Ideas" that they had heard about.  Fantastic ideas, I thought.  Instead of centralizing everything on a host server, they could have servers in a bunch of different places and have them all talk to the host.  Then the users would get faster response times.  Astounding, eh?

Anyone else remember that new idea from, oh, 20 years ago?

Wait, that is sounding really negative.

Let me try again.

Back when I was heavily involved in bagpipe bands, there was an amused expression reserved for folks who had been involved in pipe bands some years before, and no longer were:

"The older I get, the better I was."

The fact is, our perceptions change over time as our experiences inform them.  In the pipe band world, that change seemed to inflate what people remembered of their own abilities.

What I have learned, now that I am part of the same "club," is that some folks really, REALLY don't like change.  Change in any form is bad.  At the same time, things change as they grow, or they wither and die.  You can't maintain existence without change.  Well, maybe you can, but it is not really existence; it is a museum display - almost a "living history" lesson.

Change is inevitable. 

Once it was suggested that, since I was so "set in my ways," I might not like the changes that were coming and could have a hard time adapting to them.

I resisted the temptation to look around for my cane and wave it about while calling that individual a "young whippersnapper."  For one thing, I don't use a cane or a walking stick.  For another, I sympathized with the perception, and the lack of life (and maybe professional?) experience, that would lead him to say that.

The thought that crossed my mind was "It is this very experience I have that allows me to see how he could have a view like that.  I have been around a while and I like things a certain way.  I have liked things in different ways before that, too."

When comments like the above are made, or when I think on change and flexibility, my mind sometimes wanders back to the companies I have worked for, the shops I have worked in.  No two were the same, even remotely.  Some were happier than others.  Some were more efficient than others.  Some turned out really good work.  Some were just jobs. 

Some are examples of the same things I mentioned before.  My own experiences shaped my perception of each of those organizations.  As I learned more, I wanted to learn more.  My views of each job changed while I was working there.  I learned and experienced different things in different areas.

What does all this have to do with anything, let alone each other?

Well, simply put, I read a blog entry by The Maestro a couple of weeks ago.  My first reaction was "YES! EXACTLY!"  Then it made me think on some things.

What I discerned from that thinking is that each of these "revolutionary ideas" was intended to address a problem.  Or at least, a perceived problem.  The thing is that many are just that - perceived problems.  I think the real cause is that people, myself included, don't want to do the painful self-examination that is required for real improvement.

It is easier to follow the herd and the glossy marketing people when they hold out a promise than it is to dig down and work on the problems we have.  This leaves us desperately grasping at the just-out-of-reach silver bullet.

This, I suspect, is the core issue with all the trends I have seen over the last 30 years, and more.

Unless you are willing to face and address your real problems, you will never fix them and will keep grasping at quick-fix solutions that are not.

Well, maybe I'm just being a curmudgeon.

Tuesday, June 14, 2011

What Can Be So Hard About That? Or, Why Do Some Folk Think Other Folks' Jobs Are Easy?

A funny thing happened the other day.  I overheard some "sophisticated" city-dwelling folks talking about farming.  To be more accurate, it was a group of folks from one of the "more affluent" suburbs of the city I live in.  The kids were a little uncomfortable with their surroundings: a small-town eatery that had opened a really nice "deck" by a river, in a really rural area, catering to farmers and their families.  The deck was clearly a "let your hair down" establishment where young folks, farmers' kids and younger farm hands could enjoy a cool beverage.  It was also a short drive off the expressway, which is how they, and we, landed there.

So, the lady-wife and I were observing the to-and-fro of the regulars and these self-same sophisticated folks.  One of the women made a comment that made me blink.  "People talk about farming as if it is so hard.  I don't know what they are talking about.  You've seen my garden, that is some work, but really, how much more work can that be than my garden?"

The lady-wife's eyes looked like saucers (she's a Master Gardener and regularly says she's glad not to be a farmer).  Me, I was not surprised.  I was reminded of comments I've heard other people make, like "It's just testing!  How hard can that be?"

To be fair, I've also heard testers say things like "Why don't the developers get it right so we don't find stupid mistakes like this?"

You see, it strikes me that some folks simply don't get it.  Whatever "it" is, they just don't get it.

Here's what I mean.  You've all heard that "a little knowledge is dangerous," right?  If you have a small amount of experience with a tiny portion of what someone else does for a living, there is a tendency to extrapolate from that experience and assume it represents everything those folks do for a living.

Many of us testers have run into the developer or project manager or some other manager type who sputters about how much time testers take and how it can possibly take "that long" to test - that testing doesn't add anything and just slows the project down and you could get it done faster if you just...

What comes after that varies, but you get the idea, right?

Kind of troubling for those of us whose profession and craft is "just testing."  No? 

Then why do we say the same thing about developers?  (See? I'm being polite.)  A lot of times I'll say something like "software program code writers," since there are many, many more people in "software development" than those who write the code.  Yes, I know.  I'm not nice sometimes.  Yeah, sometimes I yank chains.

I know, I'm kind of off in the weeds. 

But not really. 

When one group sets themselves or their craft above the skills of others as more sophisticated, challenging, difficult, advanced, whatever, it becomes easy to take the next step and raise yourself a notch or two over those who work with you, but are in the "lesser-skilled" trades and crafts.

I've done Project Management and Business Analysis.  I've done programming (which, at the time I started working in software, included design and requirements gathering and communicating with business users).  I'm doing testing now.  I've dabbled in DB stuff - enough to know I'd not be a good DBA - no passion or patience for it.

I do not understand how people can look down on others in a different craft.  All of them take specific skills, training, focus and discipline.  Done well, each of them is demanding and challenging, and at the same time, very rewarding.

There are a lot of instances where you see this mindset - the sense that something is easy to master because you can pick up the basics in 10 or 15 minutes.  Learning to apply the basics is the hard part.  Learning to master them takes longer still.  Just exactly how hard is it to do anything?

Want to find out? Try it.  If you are a tester without development experience, try learning a programming language, then try writing a simple program.  Then test it.  How many bugs did you find?  If you have some development experience, try your hand at project management - at least get a bit of training, then try to apply that training at work.  Let's see what happens.
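If you want a concrete starting point for the first exercise, here is a hypothetical one - the function and the test values are mine, purely for illustration.  Something as small as a leap-year check will do:

```python
def is_leap_year(year):
    # Gregorian rule: every fourth year is a leap year, except
    # century years, unless the century is divisible by 400.
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# Testing it is the real exercise: which inputs earn their keep?
# 2012 (plain leap year), 2011 (plain non-leap year),
# 1900 (century - NOT a leap year), 2000 (divisible by 400 - IS one).
for year in (2012, 2011, 1900, 2000):
    print(year, is_leap_year(year))
```

Ten minutes to write, and rather longer to convince yourself you have tested it well.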

If you dabble a little bit, or took a course in college X years ago, you're an expert, right?  Maybe the little exercise above will help you understand a tad more.  How hard can anything really be? 

After all, it is just testing, right?  How hard is that?

Monday, May 23, 2011

Where No One Has Gone Before: Exploratory Testing Lessons From Jean-Luc Picard

Great.  You started out with a plan, maybe you had scripts to follow - and before you know it you're off in the weeds.

Or worse, before you know it you've stumbled into some area that is completely uncharted and unknown.  It's as if you were navigating a sailing ship 500 years ago and realized you were smack in the middle of the area marked "Here be Dragons" or some OTHER undesirable slogan.

None of us ever intend to get that far "out there" - well, I don't normally anyway.  Most of the folks I've worked with don't normally get that far "out there" either.  Usually.  Unless we feel like - well - seeing what's out there.

When that happens you have a couple of choices.  You can punt and start over, writing this off as a weird anomaly.  Sometimes when this happens folks shrug and say something like "I don't know how this happened. It must have been something I did wrong."  Frankly, I've said that once in a while as well.  Sometimes when I do, in the course of re-tracing what I did, I find where I should have zigged but actually zagged.  I sometimes make a note of it so I can return and intentionally follow that path after finishing off what I intended to do.

The odd thing is that sometimes I find myself out in the weeds again, just as unexpectedly as I was the first time.  So, the choice we all face is to see if we have an idea what caused the event this time OR we can see where we can get from where we are right now.

I sometimes think of this as "X-Treme Exploratory Testing."  Instead of blasting our way through whatever we just ran into, we sometimes need to carefully unravel the threads that we have around us.

Do you remember the Star Trek TNG episode where the Enterprise picked up their own distress call, saw a massive explosion, and found a debris field that was their own ship?  It was kind of like Groundhog Day - they were caught in a "time-space continuum anomaly" where they repeated the same incident without knowing it.

As luck would have it, several of the recurring characters had a sense of deja vu - around the time they picked up the distress call, as I remember it.  Then Data began saying things that were, well, un-Data-like.  So, they decided to try to send messages to themselves each time they repeated the process, to let themselves know what they had tried - and they would know it worked by not blowing up.  Cool, no?

What if you don't have an Android - humanoid artificial life form, not the phone - to tell you what did not work?  What if you find yourself trying to repeat the same process and landing back in the weeds every time?

For me, the simplest tool is a legal pad and a pen - I make note of what is done.  See?  Who needs Lieutenant Commander Data when I have Ensign Note Pad?  ;)  Since I began doing this, all kinds of other cool tools have become available - Rapid Reporter is one.  Try it - I've read rave reviews, but have not had the opportunity to put it through its paces myself.  I look forward to doing so in the near future, however.
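For the curious, the software equivalent of Ensign Note Pad can be tiny.  Here is a minimal sketch, in Python - my own illustration, not Rapid Reporter's design:

```python
import sys
from datetime import datetime

def log_note(note, path="session-notes.txt"):
    # Append one timestamped line so the trail of zigs and zags
    # survives the session - including the ones that led into the weeds.
    stamp = datetime.now().strftime("%Y-%m-%d %H:%M:%S")
    with open(path, "a", encoding="utf-8") as f:
        f.write(f"{stamp}  {note}\n")

if __name__ == "__main__":
    # Usage: python notepad.py "zigged where I meant to zag"
    log_note(" ".join(sys.argv[1:]) or "checkpoint")
```

Not much of a crew member, but it never forgets what was tried.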

My point, such as it is: don't be afraid of the unknown.  We're TESTERS - that is what we do!  We find the unknown and make it known!  We head off into the weeds and chart a course out and back again.

If you stick to the path laid down in the script, you will have a fairly safe round of exercises.  It won't be testing - but it will be safely predictable, and you will be able to show nice charts and pictures of what is done and what is left to be done, and you may even find some bugs.

It is when you head out to see what you can see that you really learn the product, the application or how it works. 

It is your job to boldly go where no one has gone before.

Monday, May 16, 2011

Agile or You Keep Using that Word; I Do Not Think It Means What You Think It Means.

It's funny.  Many of the more recent blog posts have come from ideas or thoughts or reactions to comments and discussion at the local tester group meetings.  I think there's a blog post in that by itself, but this one is triggered by an idea I've had for some time.  Of course, it came together clearly during a lightning talk at the most recent meeting.

Yes, yet again the local testing group had gathered to discuss testing and eat pizza.  I don't know if it is the collection of bright people sitting around munching on pizza just talking - no slides, no formal agenda - just folks talking about testing - or if it is the collection of minds engaged in thought on the same topic that I find so interesting. 

The Trigger

One of the presentations discussed "The Fundamental Flaw in Agile" and was based on the presenter's experience around Agile environments in software development shops.  Her premise, with which I can find no fault, was that most shops "doing Agile" make the same mistake that most shops did with "Waterfall" and experience very similar results.  That is, they believe there is a single, inerrant oracle for "user information" on software development projects.

Mind you, she is no slouch and is extremely talented.  In fact, one statement she made was the key that allowed my mind to pull things together, and that, in turn, led to this blog post.  You see, sometimes (like at conferences or presentations) I use twitter to take notes.  Other times, I outline ideas, then add ideas around that outline, and that turns into a blog post.  Then sometimes that blog post turns into the foundation for a presentation or longer paper.

You see, I've worked with some really bright people in agile environments.  I've also worked with some really bright people in Agile environments.  I've also had the pleasure of working with some really bright people in Waterfall environments. 

Some of the people in the first group (agile) are also in the third group (Waterfall.)

Nah, Pete - you're kidding, right?  Everyone knows that Waterfall is not agile.

Really?

I'd argue that the way most people functioned and called it "Waterfall" was anything other than "agile."  It certainly had little to do with the Agile Manifesto.  Now, I have some theories around that but they will wait for another time. 

I might suggest that the ideas expressed in the Agile Manifesto were the extreme antithesis of how many folks "did Waterfall."  I certainly would suggest that the idea of using "Agile" to fix the software development practices of some shops is equivalent to the silver-bullet solution that gave us project managers and business analysts and other folks getting involved in software development with limited experience in the field themselves.

Now, an aside.  I do believe that some very talented people can help move a project along nicely.  They can be Project Managers.  They can be Business Analysts.  They can be Programmers and Testers and DBAs and on and on.  The interesting thing, to me, is that when I got into software development, the common title for the people doing the bulk of that work was "Programmer."  Anyone else remember when programmers were expected to sit down with business users or their representatives and discuss, in a knowledgeable way, how the software could help them do their work better?  Now, avoiding images of people getting excited and yelling "I'm a people person!" - why is it that we figure people who are good at technology stuff must be un-good with people stuff?  I don't know either.  But for now, let's leave that and consider it in another blog post.  OK?

Right.  Where was I?  Oh, yes.  Silver bullets. 

Many shops where I've seen people "doing Agile" seem curious to me.  In fact, I get curious about them in general.  I ask questions and get answers like "No.  We're Agile so we don't need documentation."  A close second is "We're Agile so we don't need to do Regression testing."  Third most common is something like "We're Agile so we don't track defects..." (now up to this point, no worries; the worries normally come after) "... because we don't do documentation." 

Thus the thought that pops into my mind...

"I do not think it means what you think it means." 

Now, I'm not the sharpest knife in the drawer.  I make a lot of mistakes and I have said some really un-smart things in my time.  Having said that, I sometimes hear folks selling "Agile" to people where neither the person selling nor the potential customer/client has a clearer idea of what "Agile" means than I do.  I mean, come ON!

Listen to what you are saying!  "Oh, you have communication problems! That is because you use Waterfall!  Agile fixes that!  You have customers not getting what they need! That is because you use Waterfall!  Agile fixes that too!"  And on and on and on...

sorry.  got excited there a moment.

Here's what I'm getting at.  There are some really smart people who firmly believe that Agile methodologies are fantastic.  I think there is a lot to recommend them.  Really, I do.  I can agree with everything listed in the Agile Manifesto - Really! 

I disagree with the way some people interpret Agile.  Why?  Because they are missing the point.  In my mind, the entire purpose - including dropping the stuff that is not needed, the stuff that does not move the project forward - boils down to one thing:  Simplify Communication.

By that I mean exactly that - help people communicate better by breaking down the barriers that get put in the way by process or by culture or by evil piskies.

It seems to me that this dependence is the greatest flaw in "Agile."

Without good communication, Agile projects will fail.  Full stop.  If you do not have good communication, nothing else matters. 

When you replace one set of burdensome processes with another and wrap it in the banner of "Agile," have you really made it better?  Really?  Is the process the key?  Really?

Do me a favor and grab a dictionary and look up the word "agile."  Go ahead, I'll wait.

OK, you're back?  I bet you found something like this...

Adjective: Characterized by quickness, lightness, and ease of movement; nimble.



Wait.  Did you look up "Agile Development" or "agile"?  Yeah, consider what the word means - not the methodology, but the word.

Now, someone please explain to me how demanding that something be done because "that's what you do when you're Agile" is really agile.  If folks are following form over function - doing something by rote, without explaining to the rest of the team why it is important (I understand that each Scrum Master, or whatever the "leader" is called, needs some leeway in approach) - then will the team see any more value in this than in the "evil" methods of "Waterfall"?

Then again, in my experience, what is the difference between teams that were successful and those that were unsuccessful in Waterfall?  Communication. 

Saturday, May 14, 2011

Incomplete Complete Testing

In March, the local testing group got together to eat pizza and talk about testing.  We tend to get together each month to discuss some aspect of testing and eat pizza.  This time, we had a fun meeting where my boss and I gave a "preview" of a presentation on starting testing groups that we were slated to give at STPCon in Nashville later that month.  We had a decent-sized turnout and a lively discussion.

One portion stuck out to everyone.  There was an animated discussion around whether the efforts of a testing group could result in "complete" testing.  This discussion was the result of a seemingly simple question: "Can you really have complete testing of an application?"  It took almost no time for us to realize we had a topic for the April meeting.

The challenge was sent out and all interested were to bring the "proof" each had cited and be sure their arguments were well considered for the April meeting.  After indulging in yet more pizza and an introduction/ice-breaking exercise, we settled down to business. 

The core question revolved around what is "complete" and what is "testing."  Could we agree on the terms?  It seems simple enough, no?  Have you ever tried to get a dozen or so people with different backgrounds, training and experience - some testers, some designers, some programmers - to agree on something so simple?  It took longer than I expected.  Testing is more than "unit" testing.  Testing is more than making sure things work.  Yes?  Well, maybe.  With a bit of discussion, we arrived at an understanding we could work with: that testing involves more than what many of us had individually thought before the discussion, and also involves aspects that others had not considered.

The interesting part of the conversation was around the idea of "proof" that complete testing was not only possible, but could reasonably be done.  With some discussion around what constituted "proof," a realization dawned on most people that a conceptual "proof" (think a theorem from math class in high school) left room for an awful lot of wiggle-room.

You see, it may be possible, in certain limited circumstances, to test every possible combination of everything impacting the system.  It may be possible to cover the full range of potential valid and invalid input data.  It may also be possible to exercise all possible paths within the code, and even the full range of potential loops and paths for each possible combination of the paths executed within the system.
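To get a feel for the scale, here is a hedged back-of-the-envelope sketch in Python.  The system is hypothetical and the field sizes are mine, picked only to show how quickly the numbers run away from you:

```python
# A deliberately tiny, made-up system: one 32-bit numeric field,
# one 10-character ASCII text field, and 20 on/off settings.
numeric_field = 2 ** 32      # every possible 32-bit value
text_field = 128 ** 10       # every possible 10-character ASCII string
settings = 2 ** 20           # every combination of 20 toggles

total = numeric_field * text_field * settings
print(f"{total:.2e} distinct input combinations")  # about 5.3e+36
```

Even this toy leaves you with more combinations than you could exercise in the lifetime of the universe.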

And then there is the reality of it.  Can you really do all of that?  Can you really do any of that?  Really?

How small is the system you're testing? 

The probability of those things, and the costs associated with them, are the issue.  Really.

You may be able to cover some things.  But all of them?  Really?

You see, an awful lot of systems have fairly complex input data structures.  Lots of potential valid input values.  And lots more potential invalid values.  If you commit to "complete" testing, will you really test all of them?  Then there's Doug Hoffman's example of calculating a square root.  Simple, eh?  Something about floating point and five significant digits and unsigned integers - and if you need to be sure the routine is right, how do you do that?

I mean, it's four billion possible values, right?  (C'mon - say it like Doctor Evil with the little finger pointed out. Four Bil-le-on...)  Can you test it?  It depends, right?  What kind of machine are you running on?  An XT clone?  A supercomputer?  Makes a difference, no?  On one it might be completely impossible.  On another, it might take 10 minutes and show that the formula works for every possible input value, except for two.
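If the mechanics of such a sweep are hard to picture, here is a minimal sketch, assuming a routine that computes integer square roots.  The names are mine, and math.isqrt merely stands in for whatever code is actually under test:

```python
import math

def sqrt_under_test(x):
    # Stand-in for the routine being checked; in real life this
    # would be the code under test, not a trusted library call.
    return math.isqrt(x)  # Python 3.8+

def exhaustive_check(limit=2 ** 32):
    # Feed the routine every unsigned 32-bit value and collect
    # any input for which the result fails the oracle.
    failures = []
    for x in range(limit):
        r = sqrt_under_test(x)
        # Oracle: r is the integer square root iff r*r <= x < (r+1)*(r+1).
        if not (r * r <= x < (r + 1) * (r + 1)):
            failures.append(x)
    return failures
```

A compiled language on modern hardware can grind through all four billion inputs in minutes; pure Python will take hours.  Which machine - and which tooling - you have really does make the difference.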

Then again, there's the question of the environment itself.  If you're running in a Windows environment, what is all that stuff running in the background anyway?  What happens if some of that stuff is not running - does it make a difference?  How do you know?  Are you certain?

Without knowing, how can you possibly say that you can test all the environmental configuration combinations?  Can you test everything?  If not, can you really say you can completely test your system? 

So, you see where I'm going.  And that is kind of where the conversation went at the meeting. 

Can you test your systems completely?  Really?  Completely?