Wednesday, August 28, 2013

Testing Ideas CAST 2013 pt 2

Wednesday morning dawned, well, early in Madison.  Warm already, but nothing to deter the hard core from heading out to Lean Coffee.

There is a Keynote address by Dawn Haynes this morning, more track sessions, ice cream in the afternoon, SIG meetings in the evening.  Did I mention there will be ice cream in the afternoon?  :)

And here we go!  Lean Coffee is underway.  Fewer people than the last few days - of course, there have been some late nights.  First question is related to "motivating" people to do more.  Ohhhh, my. 
Problem is, of course, it is really hard to "motivate" someone.  A better question might be "is there a problem with that person's work?"

Next question - Loyalty to test or loyalty to a product?  Getting people to "engage" in something that goes against the company culture?  In silo'd orgs, can we count on people to look at the craft of testing beyond what we need them to do?  Should we call on people to consider more than is needed for their job - right now?  Forcing change or "cross-team communication" by fiat is not likely to work unless there is an underlying interest - and the directive is essentially a nudge to move forward.  If there is no real interest, then mandates won't work.

Next question - How to "sell testing" to job candidates?  Oh - did you sit in Erik Davis' talk yesterday? Yeah, but I wanted to talk about it anyway.  Interesting question.

How often do we do Lean Coffee?  As often as we can?  Really - that was kind of the answer we came up with.  Talked about LeanCoffee.org for ideas.

How do you interact with "customers"?  Hah!  What's a customer?  What is the difference between a customer and a consumer?

===

OK -  In the main room.  Loads of announcements - Education initiatives supporting non-profits (Summer QAmp and Per Scholas); micro-conferences - one day affairs to give "a taste of CAST" and build the testing community; webcast training sessions - for members and the broader community; things we are already doing: mentoring; grant program for local groups; BBST...

Election.  I was honored to be announced as having been re-elected to the Board of Directors of AST.  Thank you.

Dawn Haynes gave an extremely personal "retrospective" on success, failure, what is good, what is not good.  Her framing: "I'm not an expert but I'm going to give you advice."  OK - I can live with that.  The curious thing was the relationship she drew between introspection, retrospectives and what we do with software.  In her case, it was learning to skate as an adult - and compete as an adult in skating contests.  She ALSO had some video from 1995 of her skating in the "US Adult Figure Skating" championships - or some such.

What counts as failure? Is it falling down failure? Is having no idea what happens next failure? Is having no plan or no concept of what you can do failure?  It kind of depends on what is going on and what you are expected to be doing, no?

In reviewing these ideas, what does that do for us?  When we look at testing, what can we draw from that?  The thing is, most people find it really hard to examine themselves.  It is really uncomfortable for many people.  (Pete - yeah, it is extremely hard to do.)

OK - this is too good to write and listen at the same time.  Read Michael Larsen's live blog here: http://www.mkltesthead.com/2013/08/larger-than-live-cast2013-day-3.html


---

Break and then in Cindy Carless' presentation on lessons for software testing from the book The Elephant Whisperer (Lawrence Anthony).  Game preserves in South Africa tend to be wide open expanses of ... wide open expanses.  If you're like me you've some idea from shows like "Wild Kingdom" and things of that ilk.  The interesting thing is that there are areas within the reserve that are set aside as "safe" places - fenced/controlled areas that are perfect for rehabilitating injured or traumatized animals, like juvenile elephants.  These closed areas, called "boma," are bits of reserve within the reserves.

The thing is, for them to work, the environment needs to be handled and managed carefully, or you risk compounding the problems you are trying to fix.  This includes animals not part of the herd - there are critters present that are not elephants but are part of the broader ecosystem.

To make things work, the matriarch (dominant female) of the herd needs to trust Lawrence for his work to succeed.  There was a huge variety of elephants involved, and the ones in the boma needed help.  That would not happen if the matriarch did not trust Lawrence.

Relationships are funny things.  We run into problems, or terms of the relationship, that we have not anticipated.  That creates the potential for some level of conflict on projects, even though we are dealing with people and not elephants.

One way to build trust, and consider how relationships are being established, is a debrief.  Cindy used a daily debrief - even when she was the only tester on the project.  This was done through journaling and logging.

Giving feedback is important - really important.  PMs are looking for feedback all the time, as are devs and other project participants.  Anthony's book describes how he encountered various forms of feedback from the elephants - and struggled to learn to give feedback to the elephants.  This is really similar to what a tester (or new tester/test lead) needs to do with project participants. 

Her model is similar to what others have described, yet is worth reiterating here:

Get Context;
Describe/stabilize environment;
High Level Sanity (definition);
Areas of Interest (tracking progress through the known area);
Risks/Opportunities.

Cindy's background includes a degree in commerce - translated - she did not intend to be a tester.  Hey - that is fairly common at CAST!  She also found ET to make sense.  The question of blending structure with application understanding/knowledge came up as well.

"Bush Lore for Testing"
* The Boma is a starting point - the closed environment is only a safe place to begin, you must move out from there.
* Poachers are and will be a problem.  (She refers to them as Political Poltergeists.)  Same idea.  Something happens you did not anticipate - sometimes this is done intentionally by others.  Learn to deal.
* Change creates discomfort - Change means setting things aside and doing something else, the familiar is replaced by unfamiliar.
* Whisperer - openness to joint discovery - partly controlling, partly trusting.
* Respite is often found in unusual places - brilliant example of this, the buck chased by cheetahs that jumped into the back seat of an SUV w/ its window open. (Any port in a storm)

Afterward: On Anthony's death, the herd of elephants he had protected and rehabilitated went to his settlement and paid homage (in elephant fashion), similar to what they do when an elephant dies.  They stood quietly around his home.

Open Season: Sometimes we don't know what is happening.  We see behavior that is wrong and fail to understand what the cause was.  In Anthony's case, a rehabilitated elephant had a change in personality for unknown causes.  The elephant became violent and extremely aggressive.  The decision was made to put the elephant down.  Afterward, in the process of dealing with the body, they discovered an abscess under a tusk.  This was the cause of the elephant's change in behavior.  They (Anthony) had not identified it in time to save the elephant.
==
LUNCH TIME!
==

Peter Varhol's session on software failures was the session I went to after lunch.  It's an interesting walk through some significant failures over the last 20+ years, including the Mars Climate Orbiter, the MS Azure failure, and the 2003 power outage.

Central lessons from these -
* Testers are an essential part of the project team;
     - thanks to a unique perspective software projects need;
     - as long as they exercise their skills in that pursuit;
*  Testers must try to break the system
*  Testing against requirements is needed (my take is that...)
*  Test like people may die (in some contexts, they might.)

Nice summary.
===

Next up - Testing when software must work, by Barbara Streiffert.  Barbara is with JPL - yeah - the Jet Propulsion Laboratory.  They do Space stuff.  How do you test stuff and not rely on simulators?  When it goes into space.  Yup.  One shot - it better be right.  How do you do that?

She's starting with a way cool computer generated simulation of the deployment of "Mars Science Laboratory" - the rover Curiosity.  (Which, by the way, is a great name for an exploratory device.)  She's explaining the functions in the rover, which is really pretty cool.

After explaining the basics of terms for their context, she's off into the hard stuff.

Right - this is the classic formal review process - extremely rigorous.  They have a variety of software classifications.  Class A software is human-related - stuff people interact with.  (Her group does not work on that.)  Class B is Mission Critical stuff - as in if this fails, the craft crashes.  Seems pretty straightforward to me.  There are others, but you get the idea.  Test software is Class D - it's related but not critical.

In their environment - if a critical bug is found in the course of a mission, everything stops except work on fixing that bug.  Time is truly of the essence.  Their core rule is to fly as you test and test as you fly.

There are some things that are constant though - each project is unique.  Software is developed as a service for each mission.  There are unique rules for each mission based on the nature and characteristics of the craft - the platform it's running on.

Fun fact: commands sent from JPL / Flight Control to an in-flight craft are sent in binary, with appropriate handshakes, etc.

Another fun fact - JPL's "test scripts" are SOFTWARE - Yeah - code.  This is stuff that doesn't have a UI, right?  Testers are considered an integral part of the team.  They are developing their code alongside the devs and working together.  (This is something I believe Jerry Weinberg would recognize from his days at IBM & NASA.)
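
For anyone who has only ever seen "test scripts" that drive a UI, here is a toy sketch (mine, in Python) of the general idea of a test written as code against a headless, binary command interface.  This is purely my own illustration, nothing to do with JPL's actual tooling - encode_command and FakeRadioLink are invented names for the sake of the example.

# A toy sketch only - my own illustration of "tests as code" for something with no UI.
# Nothing here reflects JPL's real tooling; encode_command and FakeRadioLink are invented.
import struct

def encode_command(opcode, argument):
    """Pack a command into a small binary frame: opcode, argument, one-byte checksum."""
    body = struct.pack(">HI", opcode, argument)
    checksum = sum(body) & 0xFF
    return body + bytes([checksum])

class FakeRadioLink:
    """Stand-in for the system under test: accepts a frame, answers ACK or NAK."""
    def send(self, frame):
        body, checksum = frame[:-1], frame[-1]
        return "ACK" if (sum(body) & 0xFF) == checksum else "NAK"

def test_valid_frame_is_acknowledged():
    assert FakeRadioLink().send(encode_command(0x0042, 7)) == "ACK"

def test_corrupted_frame_is_rejected():
    frame = bytearray(encode_command(0x0042, 7))
    frame[2] ^= 0xFF  # flip bits in the payload without touching the checksum byte
    assert FakeRadioLink().send(bytes(frame)) == "NAK"

if __name__ == "__main__":
    test_valid_frame_is_acknowledged()
    test_corrupted_frame_is_rejected()
    print("both checks passed")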

Quote - "Code that is dependent on unstable third party software is extremely difficult to test, if not impossible." Somethings are always true.


OK, this is interesting.  The development methodology is Scrum.  Yeah, who'd have thought?  OK, for all the Scrum-iphiles, guess who the Scrum Master is.... come on, guess.  If you said "Barbara Streiffert" you'd be 100% right.  Yeah, a tester who is scrum master.  What's not to love?

Now, they are not "pure" scrum, so the whole hard-core scrum folks are screaming "You're doing it wrong!!!!!" - No, sorry - They are serving their context.  Lives can be lost if a critical error is not addressed.  That means a 2 to 4 week sprint can be interrupted to fix a much bigger problem.

The solution to that is really simple - schedule people at 50%.  No really - it works.

There are some really important ideas she is giving way too fast for me to type.  "Testing is a continually changing process and updating that process is crucial to its being successful."  "Test task isn't to do testing but to plan testing."

Understanding software and its purpose - the application - is crucial.  Without that you will not succeed.  (Pete note: That applies to so many places, including every place I have ever worked - EVER - not just as a tester.)

AND - she's wrapping with a video of the physical test of the skycrane.  This is the thing that lowered the rover to the Martian surface.  Pretty cool.

Open Season: HUGE comment: OK, Process weenies - pay attention.  "We talk about bugs every day.  When there is a critical one, there are phone calls or people find you.  There is a formal reporting communication system but it takes too long."  (Emphasis Pete's)  When something is important enough, you MUST drop the process and fix it NOW.  Really - waiting for the bug report to cycle through will take too long. 

===
Rob and Scott Summary
===

A fun bounce through their favorite bits of the last couple of days.

Ilari's presentation - via Rob - You do not have to fight for every bug.  Lack of priority not time. Absence of Evidence

Michael Hunter's presentation (via Scott) Describing tests at a high level (vs scripting them at a detail level) is a defense against inattentional blindness. 

Michael Larsen - via Rob - What we see vs what we think we see ; terms we take for granted -

Justin Hunter - via Scott - I'm not the only one who believes that "Design of Experiments" is a more valuable model for testing than QA.  (I may be wrong but I'm not alone.)

Geordie Keitt - via Scott - Alternate model for "Know your mission 2 levels up":
*  What is context;
*  Does my context encompass the task;
*  Does my boss' context encompass my context;
*  How can I better match my context, my boss's context & tasks;


Heather Tinkham - via Rob - Build Bridges ; Dare to disagree ; Being wrong ; Build on Limitations ; Seeing as forgetting the names of things ; Testers make mistakes


Dawn - via Scott - Dawn can deliver one hell of an experiential talk with far less prep than she believes - and so can you.
Dawn - Via Rob - Life can be understood backwards but can be lived forward;  I wasn't a tall child  ;  Judge less, explore more. 

Rob and Sabina - via Scott - *True* mentoring is a 2-way lifetime contract predicated on trust, respect and a mutual commitment to continuous improvement.  Cool point - Mentor & Mentee side by side without...

Cindy - via Rob - Learn in a safe haven ; model testing on ecosystem ; regression as a swear word ; be vigilant like a sponge ; build trust w/ matriarchs ; daily debrief even when alone

Dee Ann and Manuel - via Scott - CEO - Don't throw Excel spreadsheets at me.
Tell me what I should be concerned about.
Don't whine.
Articulate your strategy or replace yourself.

Anna - via Rob - Use focus groups w/ users to support testing both UX and functional;
Role of quality leader in agile transition;
"My Theory" - anne elk - Monty Python;
Agile Teams should have quality advocate ;
If I tell a man what to do, he freaks out ;
Whole team should be on the same page across the board ;


Markus - via Scott - Schools concept continues to offend many ;
Discuss culture instead.

Tuesday, August 27, 2013

Testing Ideas CAST 2013 pt 1

OK!  Here we go.  Saturday, this past weekend, was the "Test Retreat" organized by Matt Heusser (Excelon Development) and was a fantastic kickoff to the event that is CAST.  Michael Larsen wrote about it - look for his stuff  here:

http://www.mkltesthead.com/2013/08/live-from-madison-its-testretreat.html
http://www.mkltesthead.com/2013/08/cast-2013-day-1-live-and-loud.html
http://www.mkltesthead.com/2013/08/day-2-at-cast2013-live-and-dangerous.html

==

The next day was a meeting of the Board of Directors - a long affair that was challenging for the participants.  I was there as a member of the Board.

Monday was a full day of tutorials/workshops.  The evening featured Matt and me presenting our Tester Round Tables exercise.  More on that in another blog post.

Which brings me to This Morning -

0.  Lean Coffee - We started Monday and Tuesday with a Lean Coffee session at a local coffee shop.  We had 15 people on Monday and 10 people on Tuesday.  I presume that was the result of extremely late nights, as we started at 7:00 AM.  At breakfast, there were a fair number of very tired looking people who mumbled something about being too tired to make it to Lean Coffee.  That's fine.  We're going to launch a Lean Beer tonight.

If you're at CAST, look for tweets on that announcing when and where.


Jon Bach's keynote on Argument.

His take - in summary - reasonable people can disagree.  Some do it bluntly, some do not.  Citing Stuart Reid's keynote from 2010 at EuroStar.  Interesting thing about that presentation - Jon's fight went viral.  His brother James described it not as an argument or a debate but an ideological war. I think there's something to that.

One observation (really important) - One thing we testers should be good at is argument.  Every bug we file is an argument - or a possible argument.  We need to advocate why that bug is important.  If it gets dismissed, we resist.  If we don't, are we really good testers?  Or are we faking it?

Can we deal with problems or fight with people who don't like what we say?  Who knows? Can we debate within our community?  Can we go "looking for a fight"? Can we jump to conclusions?  What about jumping to conjectures?  Jon (I agree) suggests that conjecture is what we do all the time.

Dale Hample - Arguing: exchanging reasons face to face  -  Yeah - something to that quote.

In short, Jon is presenting an idea similar to Bruce Lee's "fighting without fighting."

Interesting consideration, "root cause" of bugs - Humanity.  We're people.  Testers, Developers, Designers are human - they are fallible.

All Desire Causes Suffering - not in the "don't want anything" sense; instead, consider what it is that you want and don't attach excessive importance to those wants.  That is what causes harm.  Know what your values are, then stick with that.

There are several very good points - most grabbed in twitter.  Fear and concern and problems and conflict and ... stuff are related.  I think there's much to that.  Then there is the "Test is Dead" thing from 2011.

Really?  Yeah - HUGE conflict.

Now, consider the craft of testing as an investigative process.  When the argument seems to have totally jumped the shark, consider reframing the argument.  In that, we have a consideration: testing is related to journalism.

Why?  Testing is interrogation & investigation in pursuit of information to aid evaluation.  That is what journalists - when they are at their best - are doing.

And now - open season!

- interesting opening question - argument / discussion - can they be interchanged?  Sure.  There are ways to make a case for something where you are aggressive and confrontational.  Other times, a more nuanced approach is of value. 

- Sometimes argument is not bad - It can serve to educate others on your position.  You may not convince your opponent, but others observing may be enlightened in what you value. 

TRACK SESSION

What's Software Craftsmanship - with Jim Holmes and Matt Barcomb

Right - sitting in the first track session of the conference.  Jim tweeted a pair of links last night - I intend to find them.  And here they are: 
Code: https://github.com/jimholmes/PresentationDemos/tree/master/CAST2013
Slides: https://speakerdeck.com/jimholmes/cast-2013-software-craftsmanship-for-testers

Opening argument is that "software craftsmanship" grew from a reaction to a couple of things, including the Agile community - and the programming community.  They launch into a discussion on 'what is it?' as in how do the people in the room think about it.

A variety of views from "becoming the best you can" to passion for the work.  An idea is presented that there is a community around this.  There are some things that are common and some are a little more interesting.

What is Test Craft?  How does one grow "craftsmanship"?

Why do we care?

* Maintainability;
* Continued Delivery of Value to Stakeholders (not customers);
* Sanity of the team;

These apply to testing as well as the code. 

Who cares?
* Testers
* Coders
* Whole Team

Generalization - if one is looking at the "whole team" - really that means people have a clue what the other specialists do.  So, yeah, testers, get an understanding of what some basic terms mean.  You don't need to be able to write production code, but understand some of the fundamental concepts.  ADDITIONALLY - Coders should learn fundamentals of testing beyond what unit testing they do.

Learn the meta-knowledge associated with your specific trade within the craft (e.g., testing within the world of software development).  The heuristics we keep handy let us stand in for the years of knowledge it takes to understand programming - just as coders can learn core heuristics in testing.

It is not just OK to be specialized.  It is fantastic - AND - that means you can then address the needs of people without your experience.  They should be willing to do the same thing to help you understand their craft.  THAT is where "the whole team" kicks in.

Concerns - Don't confuse "gilding the lily" with being a craftsman.  Too much of any one good thing is ungood.  Beware of being drawn into the trap of "this isn't good enough."

How do we care?

* A tester who works alone but wants to improve;
* During code reviews;
* During tester-coder pairing;
   * Tester as Navigator;

Right - so much of this is related to the idea of sharing information.  Testers don't need to critique the code ("oh, this thingamajob looks incorrect and potentially inefficient; if you tweak the do-higgy it would work much better").  Instead, look for the broader themes.

Pairing is hard work - as humans we don't do well sitting 2 and a half inches away from someone working while that person is looking over your shoulder ALL THE TIME.  You need to take the approach of "we're here to help each other and our end goal is to make good stuff."  Jim makes the observation that no pair starts out fantastic or even good.  You each must work to build trust in each other.

Matt made the point that a tester who has learned some solid coding heuristics can actually be helpful to developers.  (Pete note: absolutely - been there, done that)

Coding Smells &  Anthony Marcano's Testing Smells (via Rob Sab)

Think of smells as the "spidey sense" that you have - be aware!  Observe!

Jim's "smell priorities" - WRONG -

Preferred order of these things:
* Optimization / speed
* correctness
* readbility

In code - Readability TENDS to trump everything else.  Unless it doesn't - hence the idea that heuristics are useful and might be wrong.  Sometimes things will trump readability - except sometimes readability will help those other things...

Jim is now citing ideas I learned (mumble mumble mumble) years ago.  (Pete - while learning BASIC, FORTRAN, PASCAL and COBOL.  Yeah, some things are still true.  You young whippersnappers aren't the first ones to struggle with this.  Argue with me and I'll get out my cane and beat you if you stand still.)

For example - Don't Repeat Yourself - DRY - as a rule for designing code.  Simple wins.  Simple is great.  Unless there is not enough abstraction - you've simplified it so much, it is of little use.

Not Enough Abstraction - Abstraction is a pretty common idea - take one thing away and deal with it somewhere else.  This can make things easier to understand.  Unless you take this to a bit of an extreme - loads of helper methods, wrappers within wrappers ... yeah ... yeah, you get the idea.  Pushing things off is helpful - unless it is introducing problems - e.g., it's not clear what is going on.
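
To make the DRY and abstraction points concrete, here is a small sketch of my own (not from the talk), in Python - the same sanity check copy-pasted into a test, then pulled into one helper, plus a note on where too many layers start to hurt.

# My own illustration (not Jim's or Matt's).  Before: the same sanity check
# copy-pasted into every test - change the rules and you must find every copy.
def test_new_order_has_no_discount_before():
    order = {"status": "new", "total": 100, "discount": 0}
    assert order["total"] >= 0
    assert order["status"] in ("new", "paid", "shipped")
    assert order["discount"] == 0

# After (DRY): the shared check lives in one helper whose name says what it means.
def assert_sane_order(order):
    assert order["total"] >= 0
    assert order["status"] in ("new", "paid", "shipped")

def test_new_order_has_no_discount_after():
    order = {"status": "new", "total": 100, "discount": 0}
    assert_sane_order(order)
    assert order["discount"] == 0

# The caution above still applies: pile helpers on wrappers on more helpers and a
# reader can no longer tell what a failing test actually checked.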

Complexity - so... what is this to us?  Well - let's see - 45 IF statements nested 8 levels deep.  1 case in the middle doesn't work - Why not?  WHO KNOWS?  Split it apart.  Make sure that each "test" is stand alone - not a conglomeration of multiple tests.  Keep it clean.  Keep it clean by refactoring.

Refactoring - Make small changes to implementation - NOT THE BEHAVIOR.  Look at what is happening and clean it up a little at a time.
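
Here is a tiny made-up example (mine, not Jim's) of what that looks like in practice - the same nested-IF logic before and after a behavior-preserving refactoring, with a quick check that nothing changed.  The shipping-cost rules are invented purely for illustration.

# Before: the logic is buried in nesting - hard to tell which branch a failure came from.
def shipping_cost_before(country, weight_kg, is_rush):
    if country == "US":
        if weight_kg <= 1:
            if is_rush:
                return 15
            else:
                return 5
        else:
            if is_rush:
                return 20
            else:
                return 10
    else:
        if is_rush:
            return 40
        else:
            return 20

# After: same behavior, smaller named pieces - the implementation changed, not what it does.
def _domestic_cost(weight_kg, is_rush):
    base = 5 if weight_kg <= 1 else 10
    return base + (10 if is_rush else 0)

def shipping_cost_after(country, weight_kg, is_rush):
    if country != "US":
        return 40 if is_rush else 20
    return _domestic_cost(weight_kg, is_rush)

# A quick check that the refactoring really preserved behavior:
for c in ("US", "CA"):
    for w in (0.5, 3):
        for rush in (True, False):
            assert shipping_cost_before(c, w, rush) == shipping_cost_after(c, w, rush)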

Testers can help Developers (Coders) by asking questions - often around the scenarios of the code.

Jim walks through an exercise that was just too fast to capture - find it in the code link above.  It's fun.  In short, in about 10 minutes people in the session gave 4 test ideas he had not encountered, even though he has used the same example for several years.

Matt is wrapping up with a story around "It takes a village to make software" - told via a reconstructed "Peat Bog Village" he and his wife visited in Ireland.  Kinda cool story - the gist was that the two /specialists/ in the village - the smith and the thatcher - needed to contribute to the greater community in the mundane tasks as well as exercise their specialties in the context of what needs to be done based on the season.  Related to that: if there is a cobbler (a shoemaker) in the next village, the smith may need to spend some time working with that craftsman - they each learn something and each benefits from their mutual work.

Sharing ideas can and will benefit both (or all) participants.

Find Jim on Twitter at @ajimholmes and Matt at @mattbarcomb

LUNCH TIME!

Pretty good conference lunch - a little pasta, a nice salad, some chicken-like stuff - there was also pizza and a nice caprese salad (you know, the slices of tomatoes with slices of mozzarella in between, drizzled with olive oil...)

And now in Erik Davis' session on finding testers "in the rust belt." 

Erik works at a company outside of Cleveland, OH, which, as he observes, is not one of the "cool technology" centers of North America.  He observes that they have an extremely difficult time identifying people who want to do testing - and want to do testing - and have really solid experience.  AND since CS majors tend to NOT want to do testing as a gig (after all, they'd rather do development for games and make a pile of money)... AND then there are problems with recent grads not really knowing how to test, because most colleges and universities in that area don't do a good job.

SO, here we are - what to do?  Erik's solution was to hire entry level people.  No, not some entry level - everyone is an entry level tester.  THEN they learn - they are trained and worked with and learn their product, then they learn to act and THINK like a tester.

That is a really important point.  How many companies treat testers as if they were not issued a brain when they were born?  When they think of "testing" they mean "click these buttons and enter these values and you will get this for a response - if you don't, let us know."  That's not testing - not even checking - it's rubbish.

So, they don't get many CS majors - and when they do, they get these weird requests to join as a tester from people who really want to be developers... well, that may be good or it may stink.  If they want to learn the product and learn the product set for a year or two - AWESOME - then they can - no worries.  If their mission/intent is to get into dev AS SOON AS THEY START - they don't do well.

Erik's take is that some people with certain degrees tend to demonstrate certain traits.  He warns folks that certain people may have different experience. 

Interesting observation - Music Majors, Instructors, Fine Arts People - prove interesting.  They look at things differently.  Very differently.  They come in without the huge bias problems that "technical trained" people do - they may have problems with picking up computerish stuff - BUT - when they dive in they find new ways to look at testing.

This leads to issues with training people who "are not comfortable" with computers.  The thing is, fairly few people really are.  They do things that can compensate for that.

Consideration - If you train people when they come into the company, in a way that will benefit the company (not just test), then you are helping people AND the company.  Everyone benefits.

One drawback - if you are calling people in for an interview and the candidate applied for a different position - tell them when you call.  That way, if they hate the idea of being a tester, they have the chance to bail beforehand and save everyone's time.

Don't try and screen resumes.  For testers, there are not really good filter points that work - really.  Blind screens on certification or C# or C++ are not a good idea.  Having HR screen the resumes might not work - it did not for him at all.  BUT - HR can screen for "people problems" - with a funny story about a guy who stabbed a Pringles can because he was mad - and stabbed the hand holding the can.  Ouch.

He found interns really helpful - both college/uni level and high school kids (like strong STEM students).  It gets them in, AND - they gain interest (maybe in testing)

Follow Erik on Twitter at @erikld

OPEN SEASON -

Interesting - questions on training and Per Scholas come up.  How do people interact?  Communicating via your network is helpful - INCLUDING finding who might know someone who could fit in.  This is true in and out of companies (that was my comment.)

===

OK - I hid for a while.  I admit it.  Actually, I went to my room and swapped contents of my bag, put on a fresh shirt, washed my face and got back in time for milk & cookies.  YEAH!  COOLNESS!

And now in Erik Bricarp's session on Making Learning My Top Priority.

Very good opening/introduction on his personal experience, limitations and the like.  Perspective matters in all things.  He's describing his journey in self-education and learning.  The principle is fairly straight forward - using ideas and discoveries (self) to motivate yourself and develop thinking and work up the courage to take the next step.  Really good stuff -

Practice, Socialize, Present, Gather Feedback, Reflect (on all above), Read (stuff) / Watch (videos, webinars), Take Courses, Share Ideas, Mentoring, Organize.  The thing is, we do much of this every day.  The hard part, from my experience, is to actually go public with these ideas.

These items/ideas can contribute - by driving thinking, using them as a heuristic for working on each task.  Having said that, if you consider how to keep going, how to continue learning (Pete: continuous learning is an indicator of a true master) you can internalize these ideas and make this a perpetual loop - Pete: rather like Wash/Rinse/Repeat.

The challenge is to keep moving forward - to keep using this concept as a brainstorming exercise.  This is crucial for a self-identified skills/training list.  Using these ideas, one can strive (struggle?) to improve.

In Erik's case, he expects much of himself.  His goal is to demand much of himself, not necessarily others.  His goal was, simply, to get better.  Part of this includes establishing and raising personal standards.  When one does that, there are risks - professional and personal.

When you stand up, raise yourself, your standards and expectations, you will make waves.  People may not like it very much.  There may be costs.  When you do this, however, you will feel better about yourself.  You will grow - which is uncomfortable.

What do you do then?  Do you need some form of a personal manifesto?  Some form of guideline for your professional work?  The temptation is to become so "motivated" that you can't refuse anything - references "Yes Man" with Jim Carrey.  (Learning to say yes is good; learning to say no is also important.)

Doing things that are scary is... scary.  It is part of how we develop.  It is part of embracing the challenge of what we have set for ourselves.  In turn, great things can happen - like speaking at CAST.


Pete: Yeah.  This was a good session for me to sit in. Follow Erik on Twitter at @Brickuz.

Open Season!

Thursday, August 22, 2013

Considerations on Change: AST and Elections and CAST 2013

Last year I was elected to the Board of Directors for the Association of Software Testing (AST.)  I was elected to complete a partial term.  Instead of the normal two year term, I would stand again for election in one year.  That is this year.

I have been nominated again for a seat on the Board of Directors.  I am deeply honored.

There has been much progress this year; however, much of it has been in the nature of laying a foundation for future things.  There have been deep discussions on how the Association can grow and develop.  More importantly, there have been deep discussions on what the Association can do for professional testers in general and members of the Association in particular.

The development of the Leadership Special Interest Group, the Leadership SIG, is part of the direction the Association is taking up - the question of reaching out to managers with the message around Context Driven Testing.  Many times, line managers - the actual managers of testers - will see the improvements and may make suggestions on how testing can be better.  The greater challenge is to go up the next level of management, then the next and the next. 

In small companies this is challenging.  In larger companies, with hundreds, not dozens of testers the task seems insurmountable.  That is what we are looking into.  How can we demonstrate and encourage leadership among test professionals while bringing upper management into the fold?  How do we do better work?  How do we help the managers understand where we are going?  How do we get them to join us?

In other areas, we are looking into how we can improve training for software testers.  How do we make things better?  How do we teach the essential skills that are needed?

These are two of the challenges we are working on. 

There are the "normal" things needed to keep a non-profit, professional organization functioning.  Website maintenance, LinkedIn groups and forum monitoring, and... stuff.  Newsletters with information of use to members.  Spreading ideas and sharing them - this is at the heart of why we exist.

These are among the challenges I took on when I accepted a position on the Board last year.  I see this coming term as an opportunity to build on the foundations that have been put down this year. 

If these seem like things you think the AST should do, I would appreciate your vote.

Thanks so much.

Thursday, August 8, 2013

One Year On, Is It Worth It?

Last August 9th I sent a tweet that left some people perhaps a bit confused.  Let's face it, unless you've sailed a topsail schooner or perhaps something larger, or are reasonably well versed in such things, you may not understand the meaning of "All hands, shorten cables, loose topsails and jibs, prepare to weigh anchor.  Prepare to get underway."  A little later I tweeted something about "obscure tweets."  Heh.  Yeah.

In short, that translates roughly to "get ready to leave."

For some time I had been considering my role in the organization I was working for - ever since the small company was bought by a much larger company.  The smaller company had made progress and advances - away from heavily scripted, heavily controlled management practices toward a more responsive approach to testing, taking steps toward a broader context-driven approach.  The results were pretty clear - the calls to customer support dropped, behavior of the applications we supported became far more consistent and, overall, there were fewer emergencies to be handled.

In testing, we focused our work away from scripts and into broader variable combinations which found instances similar to what was being reported in the field.  We could get the developers to focus on what we found and make incremental improvements.

Then we were bought.

Back to heavily scripted "best practices" that simply had no bearing in the context of our environment.  Loads of metrics to track - meaningful ones like total test cases, test cases passed and failed, bugs per test case - loads of spreadsheets and graphs.  We'd ask questions and were essentially told "You'll understand after you've done a few of these projects.  Then everything will make sense."

Change Your Employer or Change Your Employer

I dove in with the attitude of "I've changed the way companies have done testing before - I can do this."  And promptly ran into a wall of bureaucracy.   Conformity to standards counted more than anything else.  So I asked questions.  I went from asking questions to ranting.

Really. Loud, unyielding, ranting.

To no avail.

Well, that isn't really true.  I was commended for being so engaged in making the company a better place.  Whatever.  I then found myself in several games of "bring me a rock" - you know the game.

"These sound like interesting ideas, are there any examples in our industry you can point to where we can talk with people there and see how these ideas are working out?  Oh - not them, they're Canadian, we're in the US.  Not them, they are a commercial bank, we're in retail.  Not them, they're not the right kind of retail and they're in Europe."

I knew I could not stay.  I was losing that fight and I was not willing to do what was expected of me.  I was not willing to sell my soul to the Borg.  (Note to large firms that buy other firms, do NOT use the word "assimilate" (in any tense) when speaking to a room full of computer geeks.  It won't fly.)

I began looking.  A few opportunities were available.  One looked really interesting, one was a possibility (but fell through) and one looked like a great opportunity to mold a team - but there was a catch:  They did not want to hire the person in, they wanted a contract position.

Contracting.  Hmmm.  Something I had done briefly but only in a contract to hire situation.

Thoughts on Contracting

Unless you really want to be your own boss, I don't recommend doing that.  Also, unless you really like a LOT of paper work, I don't recommend being a contractor.  In fact, in the US, here are the things I would suggest you consider:

* Taxes - self employment, quarterly filings, writing big checks to the Federal, State and (if you're like me) Local Governments;
* Paper work - loads of it. Evidence of what you did, billing, statements, balancing accounts - yeah software can help, you still need to take the time to a) figure it out and b) do it. 
* Insurance - Unless your spouse (if you have one) or partner (if you live in a State that recognizes relationships other than heterosexual unions) has insurance coverage you can be covered by, get ready for an interesting journey - medical, dental, life, disability, homeowners riders (you're now running a business out of your home, does that make a difference for your policy?)  All that stuff.  It's a mess.
* Banking - Remember those direct deposits that magically landed in your account when you were an employee?  Ummm - you the business gets money periodically - how much of that gets paid to you the employee?  Out of that BIG check comes money for, well, the stuff above.  On top of that, your savings for short and long-term stuff.
* Vacations - Speaking of short and long-term stuff, salaries are awesome - you get paid even if you don't work.  Contracting?  You get paid as long as the meter is running.  You need to be disciplined enough to set up a salary for yourself - the banking thing above is involved in that.  I use a credit union for my primary bank - they were friendly and straightforward in how to set stuff up.  I like credit unions a lot.  So, you want to take a vacation?  Better make sure your income structure supports that.
* Paperwork - No, not the paperwork above - this is stuff like receipts - LOTS of receipts.  What was spent for the business - REALLY spent.  Taking the significant other out on the town and charging it to "the company" doesn't count.

There's more, but that is enough for now.  You get the idea.  Back to the story.

The Process

So, I went and talked with people, interviewed and the like.  Since I had a gig, that changed the dynamics.  I was not the guy whose unemployment was running out and had a house and wife and car-note and what not.  I was the guy with a gig, and if the one we were talking about did not seem a good fit, no worries.

Translated, I interviewed them perhaps a bit more aggressively than they interviewed me.  In one case, the interview team found that rather off-putting.  Too bad.

You see, 20 or 25 years (heh - who am I kidding?) 30 years ago, I would have been the nervous fellow I suspect that company wanted. No.  Sorry.  I know as much about software development as the interviewers did, and frankly, more than a couple of them.  I also knew more about software testing than all of them interviewing me at that company.

No, me showing up cap-in-hand is not going to happen. That company wanted that.  Nope, this won't work.

Another company talked with me and we had a really nice conversation.  But that is all it was.  I did not get the feeling they had any idea what they wanted and were looking for people to tell them what they needed.  So, we talked about software testing for an hour or so and that was that.

Then the third company.  Phone calls.  One face-to-face conversation - "Are you sure you want to leave this salaried position to be a contractor?"  Frankly, I wasn't at the time.  Remember the list of ideas above?  Yeah, that seemed fairly daunting.

So we talked about testing.  We talked about software.  We talked about what they were looking for and how they wanted to grow the group and develop it.  We talked about experience and what I had done where.  We talked about how things are unique everywhere.  At one point xkcd and Monty Python both came up within moments of each other.  It seemed like a good fit for mutual needs as well as cultural.

Money came up at some point.  A range was stated - always leave a little leeway. A few days later the terms of the contract were hammered out.  As we were in the process of finalizing the terms, I sent the tweet I mentioned above.

The morning of August 9, I submitted my resignation to my manager.  My mind was made up.  In it, I included a suggested transition plan for the systems I was responsible for as well as what seemed a logical sharing of information across the broader team.

The funny thing was he asked if I would take the time to think about it and be certain.  We would speak Monday morning and see if I had a change of heart.  Also, there was a team meeting the next Monday - would I wait until then to say anything to anyone on the team or in the building where I worked?  No problem.

That Monday, August 13, we spoke again.  I had not changed my mind.  I sent another tweet:

Alea Iacta Est

If your education did not include four years of Latin, or history of the Roman and Byzantine Empires, consider this.  That is what Julius Caesar is supposed to have said when leading his legions across the Rubicon River into what was considered the Roman heartland, starting the civil war that doomed the Roman Republic and put him in power.  His great-nephew and adopted heir, Octavian, would become the first Emperor of Rome.

The die is cast - there was no turning back.

My friend and colleague Matt Heusser sent this rather excited tweet a few minutes later (quick study that guy - catches on sometimes faster than other folks do)


Two weeks flew by as I completed the hand-off of information, then looked to the future.

The Change

I admit - the transition was bumpier than I thought.  Money coming in almost when I thought it would.  Legal junk - yeah.  Ewwww.

In the end, it was the right move for me.  It may not be for everyone, but it works for me.

Why?

I am my own man.  
A growing number of companies are laying claim to the intellectual property rights of work done by their employees.  Now, don't get me wrong - work done at the day job, or for the day job, or on the day-job's equipment - agreed.  That is theirs.  More and more, companies are laying claim to all intellectual work their employees do and create.

Work I do on my own, to better myself as a craftsman, that is mine.  That includes this blog.  That includes presentations I put together for conferences.  That includes work I am now doing in support of AST and my local testing group.  That is mine.  No employer can claim that.

Now, when I was active in pipe bands and teaching pipe bands, if I had an employer who wanted to lay claim to rights - they could claim the music I wrote.  They could claim the instructional materials I wrote for drummers.  They could claim the pictures I took.  

No, I'm not making this up.  

My time is my own.

Part of the contract is an agreed time per day or week I will spend at the client office doing work for them.  Any extra time needs to be approved in advance.  Any time short needs to be negotiated.  Pain?  Well, it is different than being on a salary.  

However, that also means that I can build into the schedule time away for conferences or training seminars (either presenting or receiving) or vacation or - whatever.  The phone will not ring at 3:30 AM unless it is something that has been agreed to in advance.  
My plans with my family will not be disrupted by an "emergency" at the office. No, sorry, I just stuck my head in my office and there are no emergencies there.  I'm going camping with the family - see you Monday.

I can work on other projects when I'm not at the client site.  I can write reviews of technical books.  I can write blog posts extolling the virtue of whatever I want to extol.  If I want to write articles and try and get people to buy them - I can do that.  If I want to clean the house - I can do that too.

No Butt-Head Rule

Unlike a full-time "permanent" position - I know this is going to end at some point.  I can decide how much time and energy I can put into dealing with people who are suffering from broken-spirit-corporate-itis and want everyone to have the same disease.

You know the types - the ones who are always predicting disaster.  The ones who patronize you because you don't really understand how software is made.  The ones who tell you to do your job.

Responding or not is up to you - and it is also up to you to see if renewing the contract contains enough incentive to look past people like that.  If so, then stay.  If not - there is another contract waiting.

Do the Right Thing

Now, if the contract is project specific, that may be an issue.  Is this project one that is good for you as well as them?  Use your filters and decide.

If the contract is more duration-based, long term (or short term) this can help with some things.  Are you doing interesting work?  Are you learning things that make you better?

Are you doing things to make the client better?

In my case, I do believe I am contributing to improving the methods and practices at the client.  Yes, it is frustrating to have the same conversation many times when explaining testing.  I see that as a good thing in this instance - there are many test groups here - I am working with one.  People are asking questions.  This I see as progress.

In the End

My lady-wife and I were having dinner a week or so ago.  She asked me if I thought I made the right decision last year.

For me, the answer is "Absolutely."

If anyone else follows the same path, do so for your own reasons.

As for me,

Steady the helm, Set topsails and jibs.  Another turn on the fore-topsail if you please.
Loose and set the courses.