Tuesday, February 3, 2015

On The Easy Way and the Fallacy of the Simple

I've forgotten which of the several books by Jerry Weinberg I first read this in. The gist of the lesson is that when a proposed solution starts with "All you have to do is..." it is a safe bet that this "solution" is not the solution that is needed.

It's funny how that works.

How many times has some expert walked in and told you, or your boss, or someone "important," that "The answer to your problem is {blah}" - that clearly the best way to handle the situation is to... blah blah blah blah blah? Followed by something where it looks a bit like they expect you to accept that 1 + 2 = Magenta.

Yeah.

It is kind of like "We need to test this stuff, but it is hard and complex, and figuring out the data relationships and how values interact with each other is tough. So, we'll just pull data from production and that will be good enough."

Now, I've mentioned that very idea in articles from time to time, and given presentations on how that can be done in integration testing and other forms. In a nutshell, there is no "just" about it.

If you are looking for a shortcut to figuring out test data, this is an OK place to start. But it is not the end. Except for some folks it might be the end. That is too bad. Frankly, I think it's a huge mistake to treat it as the end, but I could be oversimplifying it.

The issue isn't about the data. The issue is more subtle than that.

Where I see that to be the case is generally where people don't understand the system they are supposed to be experts in.

(OK, context alert - I understand some folks will say "But, Pete, not everyone gets a chance to learn about the system they are supposed to test." I understand, really, and I rarely see those folks being the ones making assertions like "Just use production data..." for testing. OK? So, consider this a "your mileage may vary" disclaimer...)


Ummm, yeah. That is kind of brutal. It also tends to be what I see time and again.

Why? It's complicated.

Well, duh - unless you are working on a piece of software with 1 possible value for the 1 variable in the software, it gets complicated quickly. Frankly, even then it can get complicated. Software is complicated.

Get over it!

OK. Whew. Sorry. Let me try again.

What can we do to make it a little less complicated?

We can look at the variables in question - like the possible values that will send the software down different paths. If we have specific conditions we want to exercise or recreate, what does it take to make that happen? What combinations of values do we need?

Maybe some basic data analysis to start? What are the ranges of values for the variables? What combinations lead to what paths?
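
To make that concrete, here is a minimal sketch of the kind of enumeration I mean. The variable names, values and boundaries are made up for illustration - substitute the ones that actually drive branching in your system.

```python
# A minimal sketch: enumerate combinations of interesting values.
# The variable names and value lists are hypothetical examples.
from itertools import product

order_types = ["standard", "rush", "backorder"]
customer_tiers = ["retail", "wholesale"]
quantities = [0, 1, 999]  # a few boundary-ish values, not exhaustive

for order_type, tier, qty in product(order_types, customer_tiers, quantities):
    # Each combination is a candidate test condition. For each one, ask:
    # which path should this exercise, and what result do we expect?
    print(order_type, tier, qty)
```

Eighteen combinations from three small lists - and that is before we prune the ones that don't matter, or add the ones production data never happens to contain.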

We can check the business rules, right? Maybe have coffee with the people who actually use the software? Maybe spend some time sitting with them and seeing how they do the things we're trying to figure out how to test, right?

Maybe we can evaluate logic within the existing code to see what it does - and see what the changes might do, right?
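
As a sketch of what that evaluation might look like - the fragment below is hypothetical, not from any real system - each branch in the code implies a combination of values we need to cover, including boundaries that a pull of production data may never happen to hit.

```python
# A hypothetical fragment of the kind of logic worth reading before
# choosing test data: each branch implies a condition to cover.
def shipping_fee(order_total: float, customer_tier: str) -> float:
    if customer_tier == "wholesale":
        return 0.0                      # needs a wholesale order
    if order_total >= 100.0:
        return 0.0                      # needs retail, total >= 100
    return 7.5                          # needs retail, total < 100

# Three branches mean at least three conditions - plus the boundary at
# exactly 100.0, which production data may never contain.
assert shipping_fee(50.00, "wholesale") == 0.0
assert shipping_fee(100.00, "retail") == 0.0
assert shipping_fee(99.99, "retail") == 7.5
```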

Pretty straightforward, isn't it? So why do so many people punt and say "We'll just run some transactions from production through"?

Data and transactions "from production" might be a start. Then look at the transactions that are weird - the ones that cause problems, or odd behavior, in the wild. Of course, I've found that chatting with the people using the system on a regular basis can give us insight as to what these types of transactions are.
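
One way to start hunting for those weird transactions is to profile the production data itself and surface the rarest combinations. This is only a sketch - the file name and field names are hypothetical - and the output is a list of conversation starters, not an answer.

```python
# A sketch: count combinations of fields across production transactions
# and surface the rarest ones as candidate "weird" cases.
# The file name and field names are hypothetical.
import csv
from collections import Counter

counts = Counter()
with open("production_transactions.csv", newline="") as f:
    for row in csv.DictReader(f):
        counts[(row["order_type"], row["customer_tier"], row["status"])] += 1

# The ten least common combinations - a starting point for questions
# to ask the people who use the system, not an answer by themselves.
for combo, n in counts.most_common()[:-11:-1]:
    print(n, combo)
```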

Doing that might give more information than sitting with them and chatting about what they do. I've found that asking "What makes things go pear-shaped?" helps - or maybe "What kinds of things do you run into once in a while - maybe every six months or once a year - that take intervention by someone?" It could be something odd, or something really, really common but with a twist that makes it uncommon.

Having a coffee with them - or buying them a bagel and a coffee - might get you a little extra help in finding information. It might get the experts to spend a little time with you working through the problem transactions. It might get you some really deep insights into how people actually use the software.

I find that to be valuable on many levels.

So, what about the people who find this too hard? The folks who are always surprised when there are problems even after "it was tested?"

Maybe because they "are busy"? Maybe because there is a pile of stuff to do, and all those other things take time and get in the way? I'm not sure. Maybe all of these things.

Maybe coffee or a cigarette was more important.

Simply put, figuring things out takes time and right bloody hard work. If it was easy, anyone could do it. If it was easy, they would not need to pay someone to do it. 

Of course, I could be wrong. I could be making things more complicated in testing than I need to. I might be uncharitable to the people who don't go through the effort I suggest might be a good idea.

What I do know is that many of the systems people "test" with the "all you have to do is..." approach tend to have issues that get in the way of people actually using the software. Of course, it may not be their fault. After all, they used data from production to test it. Is it their fault that the data from production they used did not cause the problems other data from production caused?

Sorry about shouting earlier. I'll go have a tea and calm down a bit. I get a little upset when people put out rubbish as if it is a revelation from beyond...

Thursday, January 15, 2015

On The King's Speech and Testing

I was sitting with the lady-wife watching "The King's Speech" earlier this week. Did you ever see it? It's an interesting study. The thing is, most people watching it saw it as the story of one man: a Prince who would eventually be King, even though he really did not want to be, and whose elder brother, the heir, was tied up in interesting "relationships" with people that were, simply, inappropriate... the whole abdication thing in 1936.

Most people who see the film take away a story of a triumph of will on the part of the man who became King George VI, with the assistance of the speech therapist, of course.

As a tester, I noticed one distinct thing, early on - Albert (not yet George) and his wife, Elizabeth, pay a visit to the speech therapist (Lionel Logue). After a brief, unsuccessful visit with Albert alone, this second visit consisted of Albert and Elizabeth telling Logue, in effect, how they wanted him to do his job.

It was interesting, because he had been asking questions that made Bertie/Albert (not yet George) uncomfortable - questions where Bertie/Albert did not understand their purpose. Logue explained that the answers might hold information he needed to help Bertie/Albert. But Bertie and Elizabeth would have nothing to do with such flim-flam, unimportant silliness.

They wanted him to fix the physical problem of his stammer.

Ummm, Right.

How many times does someone - a PM or a BA or whatever - come in and demand that we, as testers, do something that will not serve the needs of the project, team or company, and tell us to do what we are told, that it is our job, so just do it?

So, we should just do it their way. Right?

Clearly, we should focus on finding bugs. Unless we should focus on making sure "the software works". Or we should focus on ensuring confidence.

We should focus on ALL of these things.

CODSWALLOP.

We might do those things, on some projects, in some contexts, when they are the right thing to do. Here, by "right" I mean within a reasonable professional code of ethics. Of course, it might boil down to "keeping your job," but that has never really held much sway for me - at least not in the last 15 or 20 years or so.

So, how do we approach or respond to someone who is telling us what we should be doing, with no real expertise or experience behind their argument, other than "I'm your customer and this is what I want"?

I am reminded of the philosopher-poet who wrote:
You can't always get what you want
But if you try sometimes you just might find
You get what you need

What is Wanted vs What is Needed

Loads of people confuse wants and needs. There really is a difference, no matter what the marketing folks will tell you.  The hard part in sorting out what is needed from what is wanted is that, well, it is hard.

Really hard.

There is the noise/buzz/clamour that they "need it NOW!" Then there is the voice in the back of the head that says "something does not quite feel right; something is out of sorts."

So, how do we do that? The apparently easy way is to tell them "that won't work." Of course, I'm not sure that works either.

The apparently not quite as easy way is to say "Well, I'm not sure that will work, and here are my concerns..."

The response that is wanted from us, as testers, is the one where we say "OK! I'll do precisely that!" That will make the requester/demander go away happy, at first. Odds are, they'll be back not so happy, but that won't be for a while, so it is just fine for now. We can figure out something to tell them later. Not today.

The option that Logue used was simple. "OK, we'll do it that way." He then proceeded to do "physical exercises" and "training" to deal with the stammer - knowing that the chances of it working were... ummm... improbable.

In the process of working through these exercises, they conversed. They talked. At one point it became clear that Bertie was actually left-handed, but had been trained to act right-handed. This led Logue to comment that this was not uncommon. There were questions posed as "interesting ideas" that Bertie answered, simply because he was relaxed. His guard was down - and he did not seem to mind.

In the end, Bertie came to rely on Logue - even apologizing for being a jerk. Well, not quite in those words, but that was the gist.

In the end Logue did what was needed - by being willing to help. Oh, and letting Bertie and Elizabeth know he wanted to help - and being willing to "do what they wanted" until it became clear that was not working.

And Testers?

We can push back, gently. We can offer help. We can set the conditions. We must also know what to push back against. We must know why.

I'm not sure if I can do what Logue did - at least on a regular basis. I've tried, with various levels of success. Some folks were OK with that. Other folks wanted something like "that and only that" - do precisely that one thing, exactly what they said.

I have a hard time with that, particularly when they can't answer basic questions around the intent of the software.

Granted, working as a consultant or contractor, you may have a bit of leeway that an "employee" may not have, at least on the surface.

Know how you can contribute, then do so.  Or not.

You may not get a CVO out of it, but I expect you'll be able to sleep at night.




Monday, January 12, 2015

On Conferences and Differences and Speaking

I've been watching a lot of discussion the last year or so on conferences - particularly in the realm of software conferences. Whether it be Approach, or Methodology or more niche areas like Agile, Lean, specific development styles - or, my favorite - Testing.

Wow. That was a long run-on sentence. That is what happens when I type what I am thinking and not worrying about format too much. I'll try and do better the rest of this post.

While looking at how people perceive conferences and ideas and sharing and learning, I often wonder what people are looking for in a conference. I know what I am looking for. I suspect there are others with similar criteria. There are also people whose decision about going to a conference rests on totally different points.

I get that. Not everyone thinks in a similar vein as I do. It would make my day-job a lot easier if they did.  However, I suspect it would be quite boring (as in, not stimulating in any significant way) and I probably would not learn much.

For me, a conference counts as something I am interested in going to if, in looking at the program and reading the abstracts, I can say "This is something I am interested in, and a) I want to learn something about it; b) I want to get different insights on this topic; c) I want to see how this speaker approaches something I am familiar with, but don't necessarily agree with."

There are other things - but those are the big ones. Some of those other things are - do I know any of the presenters? As in, have I met them? Engaged in conversation? Exchanged emails? Maybe I've read some of their writings.

The thing is, I am fairly certain that a fair number of people don't use the same measures. I know some who look to see where the conference is. Some folks are more interested in the activities around the conference than in the conference itself.

Well, if asked directly they would say no, but... well - if the conference is at a famous resort, are they drawn by the content or by the non-conference activities? Now, in fairness, there is usually very good content at some of these. Others? Well, for me, there is nothing in the program as a draw.

What makes me different from still others is that I don't really need to look for people who look, and sound, and think like me.

Because the majority of speakers at the majority of conferences look a lot like me: Male. Caucasian.

I get it. I really do.

Other folks, I'm not so sure that they get it. I also am not so sure that they see a problem with that.

I am working on a conference this year. I am the Conference Chair for CAST - the Conference of the Association for Software Testing. This year it will be held in downtown Grand Rapids, Michigan. Where I live. The venue is nice: a conference center hotel that emerged out of a historic hotel - one of the grand old-style ones that have become rare in the US. The nice thing about being downtown is that there are loads of activities, events and nightlife within walking distance.

I am working with a group of people who are working hard on putting together the best program they can. Ummm - I don't get a vote on the track talks and workshops.

They are building that part of the program from the proposals that are submitted by people: testers, developers, other software professionals and, maybe, students and people who are interested in software and testing.

The thing is, if you are a person who is looking to see speakers who look like you at this or any other conference, there is nothing my colleagues and I or our counterparts at other conferences can do if people do not submit proposals.  I can reach out to individuals and encourage them to submit proposals. The members of the program committee can encourage people to submit proposals.

But people must submit proposals. 

If you want to see more women speaking, or more people of color speaking or more - whatever.

Submit your proposal.  I'd be very happy if you submitted a proposal for CAST. 

But - Submit your proposal.

Thank you. Other people will be happy you did.

Submit your proposal.   Today.

Monday, December 29, 2014

On Looking Back and Forward, 2014-2015

As the year wraps up, I'm spending a bit of time sipping coffee and looking back on the last year.

There have been some changes, sure - I actually took a full-time position in July (for those who missed the subtle notice then) with Gordon Food Service in Grand Rapids, Michigan. Still, my role remains much the same - help them do better testing. In doing so, I've been on a deep dive looking at how testing is done - now I need to figure out what the next steps need to be and work on making them happen.

Simple, eh?

Looking Back -

The local tester meetup is continuing to help me learn and share ideas locally. I am grateful to all the usual suspects who get me to think about things I often would not otherwise consider.

By sifting through my blog posts from the last year, I expect you'll get an idea of some of the things I've been thinking about or trying to work through in my head.  It is probably easier, and shorter, to simply point you there.

So, I guess that brings me to Conferences. Most of them I captured in live blogging (or almost-live blogging) at the time.  If you want to go back and read them, by all means, do.

Let's see - the first conference I participated in this past year was in June - Nordic Testing Days in Tallinn, Estonia. It is a very good conference in a very nice, historic city. Loads to see and do - the National Parliament building, museums and many fine shops, restaurants and the like.

The most important thing I remember from this, and other conferences, is the people. Folks like Huib Schoots, Dan Billings, Stephen Janaway, Peter Varhol, Gerie Owen, Gitte Ottosen, Rob Lambert, Raimond Sinivee, Raji Bhamadipati, Helena Jeret-Mae, Ruud Cox, Bill Mathews, Kristjan Karmo, Irina Ivanova, Aleksis Tulonen, Andrei Contan, Rikard Edgren, Martin Nilsson... loads of people worth meeting, getting to know in person, renewing acquaintances with...

August had me participating in CAST. The 2014 edition of the Conference of the Association for Software Testing was in New York City. This conference, for me anyway, was unlike most conferences I've been to or participated in. I spent three days at the registration table, checking people in, answering questions, directing people to where they were trying to go - and running the odd errand for speakers. Really, it was a pile of work and great fun all around.

One thing that makes CAST interesting is that people attending like hanging out with other participants - even when they don't know them (at least when they start chatting). The Sunday night before the conference, there were 15 people from the conference hanging out, talking, having pizza and other food, and having a laugh or three over beer, wine or... whatever. The fun part was, none of that was planned.

This is topped only by the hallway meetings and conversations that can last 3 minutes or 3 hours. And CAST this year was no exception for me. Let's see: time with Fiona Charles, Erik Davis (and his crew from Hyland Software - these folks get it), Huib Schoots, James Bach, Griffon Jones, Matt Heusser, Karen Johnson, Michael Bolton, Selena Delesie, Michael Larsen, James Christie, Richard Bradshaw, Smita Mishra, John Stevenson, and... the list just goes on.

The simple thing is, solid ideas can be discussed and shared with colleagues, new and old, given half a chance.

In October, I was at StarWest, in Anaheim, California. Yeah - Disneyland. I had not been there before and was not quite sure what to expect.

Aside from the lovely weather and nice venue (the Disneyland Hotel - yeah, it's nice), it was very nice seeing familiar faces - Michael Bolton, Rob Sabourin, James Christie, Jon Bach, Griffon Jones, Martin Nilsson, Ben Simo, Paul Holland. It was also good to meet some new folks, like Julie Gardner and Rob's lovely and charming wife Anna.

There were many people I had great conversations with - like the nice people I shared a glass of wine with one night, talking about recruiting and training and motivating people. I'd tell you their names except, well - OK folks - I don't remember all your names - BRING BUSINESS CARDS AND HELP ME REMEMBER!!!!!!! (ahem - sorry)

November (sheesh, sounding like quite the jet-setter!) found me in Potsdam, Germany for Agile Testing Days. This was an amazingly rewarding trip; it was unlike any conference I've been to before. I was crazy busy - like CRAZY busy. Some of it you can see from the blog posts from the conference. Some of it - well - to quote the great, inspirational tester Conan the Barbarian, "Time enough for sleep in the grave." Let me just say that 15-minute naps help a great deal.

People - Oh my, where to begin - The wonderful staff from the conference - Jose Diaz, Madeleine Greip, Uwe Gelfert, Maik Nogens - they were everywhere doing everything - and still made time for you to feel like the most important person at the conference.


I had conversations with loads of people - some planned, some unplanned. Chats, unplanned but not un-hoped-for, with Bob Marshall (@Flowchainsensei), Meike Mertsch, Jean Paul Varwick, Huib Schoots (AGAIN!), Darryn Downey and the crazy folks from Paddy Power, Dan Ashby, Emma Armstrong, Chris George and the folks from Redgate, Tony Bruce, Selena Delesie (and her wonderfully charming son!), Markus Gaertner, Andreas Grabner, Bill Matthews, Carl Schaulis, Lars Sjodahl, George Dinwiddie, Richard Bradshaw, Alan Richardson, and, and... and more. Loads of people (can ya tell?)

{Updated - More, like, Danny Dainton (how could I have forgotten him?) and Maria Kedemo... that will teach me to try and 'power through' when I am tired...}

Then there were the planned chats. I was asked to "interview" selected speakers and participants from the conference. What would you chat about with Janet Gregory, Lisa Crispin, Joe Justice (from Scrum, Inc.), Matt Heusser, Maik Nogens, Jose Diaz, and the members of Cesar Brazil?

Who? Team Cesar Brazil! The winners of the Software Testing World Cup (STWC) championship! Smart, bright, intelligent people - and very, very nice. An excellent combination. The team consisted of Alessandra Cursino, Jose Carrera, Melissa Pontes and Rodrigo Cursino. A lovely conversation.

So, 15 to 20 minutes or so of chatting over three days was loads of fun. The interviews were recorded, and I expect they will be available shortly.

One other thing - I tried to have informal chats with ALL the competitors in the Finals of the STWC. The folks were friendly and smart - if they reflect the future of testing, I have hopes that future will be bright:
Army Ants, from Romania, consisted of Ileana Brodeala, Lavinia Cazacu, Sanda Cristina Pop and Irina Savescu. The team name was inspired by The Big Bang Theory - very fun.
Teststar, from China, consisted of Nicole Niu, Humphrey Chen, Eva Hao and Harris Wei. A very bright and talented group of testers.
Quadcore, from Canada (Kitchener-Waterloo, ON), was made up of Shivani Handa, Shuman Ip, Persis Newton and Richard Bouffard.
The Annunciation, from New Zealand, consisted of Joshua Uriele, Joseph Walker, Henry Ashton-Martyn and Mark Tokumaru. Very outgoing gents with good thinking skills, and very friendly.
Open Box, from South Africa, was Andrew Thompson, Carin Eaton, Delicia Oliver and Ryan Hill. These folks were outgoing, welcoming, ready to learn and have a great time.

To each of the people from the teams, particularly those of you I was able to have longer conversations with, remember that you have good skills and wonderful gifts. Share them. Write, speak, help others learn. I look forward to hearing great things about you all.

Looking Forward - 

For me, this coming year, I can safely say things will be a bit different. I am planning on fewer conferences this year, and I am looking forward to some new adventures.

One of those adventures is chairing CAST - the Conference of the Association for Software Testing.  The 2015 edition will be in Grand Rapids, Michigan, August 3-5. Look for more on that in this blog as we get closer.

There will be more articles coming, and more sharing by other methods.

It is the conversations and connections that mean so much to me, and help me learn more about testing every day. Thank you for sharing my journey so far.  Won't you continue along with me in the future?

I look forward to the company -

P.S. The UNICORN! Dude - I'm so sorry - I don't know how I left you out. Yeah, I know, I know - I'll buy you lunch tomorrow? Thanks for everything - P.

Wednesday, December 24, 2014

On Symbols, Metaphors and Understanding

I had an interesting conversation today. A group of us were talking about testing over coffee this morning - talking about how to get messages across to people. How do you explain things that people are not understanding, in a way that lets them "get" it?

We talked about how visual representations can help some folks and how some people need more, well, linear representations.  I talked a bit on how I have used mind-maps in the past - to track requirements and look at impact and risk and, well, stuff.

Then we talked about how we can present ideas to other people, and how core ideas and the questions around them can sometimes be explained in other ways.

For example? Well, Christmas for example.

So, for a long, long time, Christmas has been celebrated in December. Now, the stories around the "Nativity Event" all point to Jesus - you know, the guy whose birth is commemorated by the celebration/holiday of Christmas - being born in the Spring. Likely around April. How is this clear? Well, consider that in Judea in the 1st Century (BC/AD - whatever), shepherds did not watch their flocks at night in the dead of winter - they did that when the flocks were sent to pasture, in the Spring. April, likely.

Yet somehow, celebrating the birth of this fellow in December came to seem a reasonable idea. Of course, that has naught to do with reality. Rather, there were large celebrations and festivals in Rome in the 1st Century AD, and well into the 2nd and 3rd, at this time of year, honoring Saturn - the Saturnalia. Now, Saturn was an interesting character in the world of Roman mythology.

He was a complex figure thanks to his multiple associations, history, and stuff. He was the first god of the Capitol, known since the most ancient times as Saturnius Mons, and was seen as a god of generation, dissolution, plenty, wealth, agriculture, periodic renewal and liberation. At some point, he also became a god of time (not a Time Lord, significant difference there.)  The Temple of Saturn in the Forum (the "city center" of Rome) housed the state treasury.  Cool, eh?

Well, the big celebration of Saturn was, as I mentioned, the Saturnalia.  Good party, that, I expect (my grandson's questions to the contrary - I wasn't there to actually participate, mind.)

Included in the celebration was gift giving. Sometimes the gifts were quite extravagant - other times, simple and fun. And children got games and toys and fun stuff.

And somehow, the idea of visiting people and sharing meals and gifts seemed to fit.

And, according to a couple of versions of the story, followers of this Jesus fellow joined in - at about the same time, in December, around the Solstice, they began sharing meals and gifts. Instead of honoring Saturn, they honored Jesus' birth.

Then there is the word Christmas itself, derived from "Christ's Mass" - the religious commemoration of the "Nativity Event." There we have it - a celebration commemorating a thing that happened nowhere near the actual date of the events commemorated, but that was similar to other celebrations happening at a given time of year.

And you know what? I don't think it matters.

Much like how we explain things - like, how we get people to look at other symbols - this is a symbol that people use to teach loads of things. Like, peace and charity and love.

Look at the things used as symbols to teach these lessons. The hammer of Mithras (who was born about this time, according to 'myth', grew to adulthood quickly and died to save his people, his followers - to rise from the dead and lead them to victory), the sun of Saturn (who brought wealth, prosperity and health to the people), St Nicholas/Father Christmas/Santa Claus who brings gifts/rewards to the worthy. All are symbols associated with this time of year.

All taught lessons at different times to people, who passed them on and taught lessons to live by.

Explaining fundamental ideas to people is a challenge, particularly when those people are young children. Sometimes the ideas around testing are things that are hard to explain any other way without symbols and metaphors, particularly with those who have no real connection with good testing.

We use terms that encompass what we mean - but then we don't always know how they relate to other people or their experience. So we need to try and explain, somehow - and we find ourselves looking at the idea of bags full of gifts and magical mutant caribou.

These symbols are representations of aspects that are important (or were important in some respect at some time) to the holiday or festival being celebrated. We use symbols and representations for what we do as well - mind maps, requirements documents, test plans, design documents, process flow diagrams, state diagrams, transition diagrams, and (dare I say it?) bug reports. These are not the thing; they are explanations and representations of the thing.

So, while I consider the testing equivalent of  magic bags and reindeer, let me wish you the greetings of the season -

Io Saturnalia - Happy Solstice - Happy Hanukkah - Happy Kwanzaa - Merry Christmas.



 

Saturday, December 20, 2014

On Waiting and Testing

A couple of weeks ago I wrote a post about childhood views on the world and how we need to grow beyond those simple concepts we learned once upon a time, and develop an adult view of the world.  This allows us to continue growing and learning. 

It seems fitting that, as I write this, it is one week until Christmas Day - that much anticipated, looked-for and hoped-for day for people all around me. Children and adults alike look forward to the wonder of the season.

Yet it is in this season that so many people rush in too soon and too quickly. The anticipation of childhood - the waiting, the excitement - is part of what makes the "magic of Christmas", well, magic, and I think it is missing for many people these days.

It is the sharing of this feeling that makes things so... wonderful for so many people. Still, it is the idealized memory of what Christmas "should" be that causes so many people so much stress. I suspect that at least part of this is related to the ever-earlier start of the "Christmas Shopping Season" - this year it seemed to be mid-October when Christmas displays went up in shopping malls.

Except I am not writing about Christmas.  I've been thinking about testing a lot lately.

If there is one thing I'd suggest to people when it comes to testing, it is this: take a deep breath and wait a moment. Make a nice cup of tea. There are practical reasons for what so many non-tea drinkers look at as rituals around tea.

Things like heating fresh cool water, warming the tea pot with hot water, pouring out the hot water, adding the tea leaves to the pot (I prefer leaves in an infuser - some use tea bags, and that's OK; at the office I do the same thing), then pouring the not-quite-boiling water into the pot and... tea.

But that takes waiting a moment.  It takes knowing the right time to do the right thing.  Not too soon and not too late.

The thing with testing I've seen lately - so many people want to charge in and test stuff.  NOW! Just DIVE IN!

I might suggest making a tea... or a coffee if you prefer. I like a really good cup of coffee, too - I blogged about that once upon a time.

Ask some questions. I might start with asking something like "Why are we doing this? What do we hope to learn from testing this? If you are telling me how you want the software tested, will that 'how' answer the question 'why'?"

This might seem obvious to some people I often associate with. To others, I think they might not understand why.

Another set of questions might start with something like, "Is there something we should probably know about this that we have not considered?"  Another way to ask that might be "Is there something acting against the system, or the data that is used by the system, that is important? Maybe that we have not asked about? Is there something that 'everybody knows' we have not thought about?"

If you are testing a system that you have been participating in developing, some of these questions may not be so important - if you have been in the discussions around how the software should work, and why. Of course, when it comes to that, it might be important to ask yourself if you have built up an immunity to such things. Your certainty and understanding might be of value. Then again, if you clear your head and have a tea, then what happens if you look again with fresh eyes?

When you are handed software and told to "just test it" then remember that sometimes waiting a bit allows you to discover something you had not considered. Asking questions of people might reveal something important to you.

Those questions might tell you something about the software.  The problem I see time and again is that people want to try and force the issue. They, or their bosses or developers or managers or PMs or someone, want them to jump in too soon.

Take the time to see what things look like after a tea, or a coffee.  You might learn something about the software. You might learn something about the attitude of people you work with toward testing.

You might also learn something about yourself.


Tuesday, December 9, 2014

On Learning, Growth and Leaving Childhood Behind

Of late, I've been thinking a great deal on how people learn - how they "stay current" in their profession, to resurrect a buzzword from longer ago than I wish to recall.  That got me thinking.

What have people done to learn something recently?

Let me see if I can explain where part of this thought developed from. 

In the "Liturgical Year" I am writing this in the Second Week of Advent - the time of religious and spiritual preparation for Christmas. The Sermon/Homily this last Sunday was one that featured a note of irony.  The gist of it was that Advent, the season of preparation, is one of waiting.  People get ready for something they know is coming, but are not sure when it will actually come.  The priest's point was that in an era when Christmas music starts on the radio and in shopping malls a day or two after Hallowe'en, when society is rushing toward a fixed date, the Church is asking people to pause and consider what may be coming. 

On top of this, a few years ago, before the previous pastor retired, he gave a sermon about this same time of year.  He stood in the middle of the church, the main aisle, and pointed at the stained glass windows that ran the length of the church - both sides.  They really are lovely to look at.

Anyway, the pastor made a comment that many people's idea around matters of faith are those of a small child - the same basic things one learns in school, maybe age 7 or 8. People like the idea of Christmas and the child in the manger and the shepherds and the lights and tinsel and what-not.  But they don't like the "opposite bookend" as the pastor described it.  They don't like the story of that same child beaten, flogged and executed in a manner that boggles the mind of most people in this day and age.

As he was talking about this - he pointed to two windows.  One, on his right, depicted the Christmas story. The other, on his left, exactly opposite the first, depicted Good Friday - the death of the same baby.  His statement was simple. We can't accept the simple, childhood story without looking at the hard truth that accompanies it.

As mature adults, we need to step beyond the things we "learned" early on - either as children in school or as fledgling software professionals.

When was the last time we challenged our own beliefs and presumptions? When was the last time we critically considered what we were about? Have we become complacent?

Are we still operating based on our childhood understandings? Are we still operating on things we learned years ago and have not thought about? Are we passing on this same "wisdom" to people without a deeper understanding? 

Why?

Should we simply accept the pronouncements of people whose learning and understanding stopped when they were 7 or 8 years of age? What about "professionals" whose learning stopped 10 or 15 years ago? How about 25 years ago?