Sunday, October 26, 2014

On Gardening, Tools, Testing and My Dad

This weekend I celebrated my birthday.  Well, many people, my grandchildren included, would not understand why "celebrated" is the right word.  I mowed the lawn, trimmed some bushes, weeded, cut down a bunch of plants growing in the back garden - then mowed that area with the lawn mower.  I then swept out the sitting area near the back of the house and shifted the potted herbs there.  They will be easier to get to in the winter.

In short, I spent a good deal of time working on putting the garden to bed, ready for the winter.  There is still more to do, but this was a good amount of work done and the weather was glorious. 

So yes, I celebrated by doing work in the garden.

Also, my lady-wife took me to lunch at one of our favorite restaurants (free meal on your birthday, up to a $20 value), then we went for a bit of a drive looking at the leaves in the last of their autumn gold and red glory.  Then we went out for drinks with a daughter and her fiancé and listened to a band where a friend of ours was playing.

All in all - it was a wonderful weekend.

One thing about the garden work.  I used many tools to help me get the job done.  Or rather, I used several tools that were very specialized to do some extremely fundamental things that make it easier to do basic tasks.

There were lopping shears for cutting branches and roots.  An 8-lb hammer to remove some posts that I could not work loose without "tool assistance."  There was the lawn mower - very handy since I don't have sheep to keep the lawn at a reasonable length.  A hand saw for cutting branches that needed to be removed but were too large for the lopping shears.  And then there was the grass whip.

Of all of them - this was probably my favorite tool.  You swing this thing one-handed with a straight-arm swing - it cuts through brush and brambles and long, thick grass like nobody's business.  The neighbors wonder what I'm doing with this - the area I was working was roughly 10' x 25', so it took a bit of doing to clear.  Why did I not simply use a brush cutter? One of the big monsters that would run through this stuff like a lawn mower?

Two reasons - One, I really like using the grass whip.  Two, I can't imagine spending the money to rent a brush cutter, let alone take the time to drive there (to the rental shop) and back, operate it, then drive there and back to return it.  I could - and did - have the area cleared in much less time - 25 minutes to be exact.

The third reason is more personal.  My dad taught me how to use a grass whip when I was around 11 or 12 years old.  It was cool.  Swing this thing just so and you've cut through weeds and thick stuff faster than you could wrestle a lawn mower through it.  Hold it right, give yourself enough room to swing it right, and watch the cut stuff fly.  Something about simple tools designed to do one thing and one thing only - and they do it really well.

My dad had lots of wisdom in him.  As I approach the age when he died, I think back on the things he shared with me.  It amazes me how some of the simple things he taught me have stuck with me. 

He'd shake his head sometimes and say "Peter, you're making too much work out of this.  Do it this way..."  And then there were other times when it would be "Watch it, that's a lazy man's load. They want to avoid work so they take too big a load to make fewer trips. Except they spend more effort keeping everything in place than they would to make one more trip with a smaller load in each." 

One thing I remember he said - "Make sure you are using the right tool for the job.  If you don't have a single tool to do what needs to be done, see if you have two that you can use together to get it done."  Like, loosening a nut that does not want to relax.  You might spray it with a bit of lubricant to start.  Then, using a good crescent wrench, see if you can loosen it.  Sometimes a bit more leverage is needed - so a wrench with a longer handle might help.  Sometimes a good solid "whap!" with a hammer will get things going.  Sometimes a long-handled wrench with an "extension" - a piece of pipe over the handle to help with leverage - will answer.  Sometimes all of these together.  There's a lesson in there for testing.

Anyway, sometimes he'd explain things in a way that made sense at the time - then a few days later I'd wonder what was meant - really - by what he said.  I'd lost the context of the comment.  Then I'd forget that bit and it would be shoved away to be replaced by something else.  Sometimes they come back.  Sometimes they float back years later - like today when swinging the grass whip.

He'd say something like "Using a tool right - in a way that shows you know how to use it, that is something that is worth doing and knowing how to do."

How many times have I seen people struggling with a job because they did not have the right tools - or, if they had the tools, had no idea how to use them.  I bet most of us would not be impressed by someone who clearly had no clue what the plethora of tools around them were for.  Why do we accept so many people with shallow or no understanding of what their tools are capable of? 

Are we concerned about hurting their feelings or some such?  We can always gently point out that they are doing something in a sub-optimal way without being total jerks.  We can offer the possibility that there might be other ways of using the tools they have available - if they are open to suggestions.  Usually I have seen that openness when people have been flailing and it is becoming clear they will not achieve what they are trying to do the way they are trying to do it. (This is absolutely true for me - when I am most desperate for a solution, I find myself most open to suggestions.  Other people may have varied experience.)

One other thing - If they are such fragile narcissists that a well-timed piece of gently offered advice will harm their psyche - or at least hurt their feelings - should they be doing the thing they are trying to do in the first place?

I was thinking of my dad this weekend when I was swinging that very simple tool - that grass whip.  Knowing the right tool for the job and using it is half the mark of a craftsman.  Using it well is the other half.

Friday, October 24, 2014

On Data and Testing and Inquiry: StarWest Retrospective pt 1

I was at StarWest last week, deep in conversation and thought with a huge number of people.  Back at the office this week and many people asked if it was a good conference.

Yes - it was!  I found a large number of people who were good thinkers and conversationalists.  Among these were some of the usual suspects - Michael Bolton, Jon Bach, Griffin Jones, Rob Sabourin, James Christie, Jon Hagar and more.  Added this time were Paco Hope, Lee Copeland and more.  I had the extreme pleasure of meeting Rob's wife Anna.

We had lovely conversations, shared some nice wine and some good beer, and had an excellent exchange of ideas.

Some of them were about testing.

One interesting thing: there were some awesome discussions around testing-ish stuff.  For example, we had some discussions around motivating people - and how that does not work. This was an interesting discussion - we had some points that were worth considering.  Among other things - can we "motivate" people, or can we only remove the obstacles so people can discover their own motivation? 

That may be worth consideration another time.

What had me thinking this week was not that conversation - nor the conversation around how teams can so easily retreat within their cells and fail to work together in a meaningful way.

No, what had me thinking was a comment made during a keynote, then conversation later that day.  The comment was something to the effect of "needing production data" to do good testing.  Check out my Day 1 blog with the live blog notes here:

When I was a mainframe jockey, making systems do things that I was told were impossible to be done with the technology we had available, I used "production data" for testing.  I still advocate using "production data" in some contexts for testing.  What I caution people over is relying on said "production data" for all your testing.

I find it can be helpful to mimic production data for some instances of acceptance testing.  If we want to emulate what is happening - for example, to compare the "old system" with the "new system" - it has proven helpful.

In other instances, it can be less than helpful.  It can give a false sense of security.

Instead, I strongly recommend people consider what the data itself looks like.  What data is needed to exercise the application?  What happens to other applications or systems using that data when there are changes?

What happens when we make changes to how the data gets handled?  What happens to other systems that use the data made or updated by the system we are testing?  Sure - there was an impact analysis done - but those are not as thorough as we'd like to think, right?

So, what about production data?  In those situations, does it really help us?  Is there something more we can do? 


Examine what the data elements are - their characteristics - and how they are used. 
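To make that concrete, here is a minimal sketch of what I mean by examining characteristics rather than copying rows.  Everything here is hypothetical - the field name, the sample values, and the helpers are invented for illustration, not taken from any real system:

```python
# Hypothetical sketch: profile a field's observed characteristics,
# then generate test values at and beyond those boundaries, instead
# of relying on production rows to cover the interesting cases.
from dataclasses import dataclass

@dataclass
class FieldProfile:
    name: str
    min_len: int
    max_len: int
    nullable: bool

def profile_field(name, values):
    """Derive simple characteristics from sample values."""
    non_null = [v for v in values if v is not None]
    return FieldProfile(
        name=name,
        min_len=min(len(v) for v in non_null),
        max_len=max(len(v) for v in non_null),
        nullable=len(non_null) < len(values),
    )

def boundary_values(profile):
    """Generate test data at and just past the observed boundaries."""
    cases = [
        "a" * profile.min_len,        # shortest observed
        "a" * profile.max_len,        # longest observed
        "a" * (profile.max_len + 1),  # just past the boundary
        "",                           # did production ever send empty?
    ]
    if profile.nullable:
        cases.append(None)
    return cases

# Made-up sample standing in for a production extract
sample = ["49503", "49505-1234", None]
prof = profile_field("postal_code", sample)
print(prof.min_len, prof.max_len, prof.nullable)  # 5 10 True
```

The point of the sketch: production data tells you what *has* happened; the boundary cases tell you what *could* happen, which is where the false sense of security lives.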

How do we do that?  Well, that is another blog post.

My point? 

The idea of production data being a cure-all for testing and the "go-to best practice" is a fallacy.  Don't be lured by the simplicity of it.  The world is rarely that simple. 


Thursday, October 16, 2014


Thursday morning at StarWest is lovely.  Some clouds, comfortable temperature (for me) and we started out sitting drinking coffee and talking about testing.  What a great start to the day.

Heading off for some day-job work stuff, will be back for the morning keynote.


So, the work stuff took longer than I thought it would, so I'm a bit late to this keynote (which is disappointing to me as I wanted to hear it.)  Alas - 

Ben Simo is a really nice guy, mild and soft spoken.  Do not be mistaken into thinking he is anything but an extremely skilled tester.  He described his experience trying to get health insurance coverage through the site as "The Power of an Individual Tester."

Summary - It ain't pretty.

You can see his thoughts, comments and learnings here: This is different from his regular/normal blog, here:

The problems were Legion: poor security questions, poor account verification handling (as in "We sent you an email with your new password!" except it can take 3 days to get delivered and the link for the password expires in a couple of hours).

Then there were problems with how information was returned and displayed - anything from sensitive and personal information posted behind the screens, not visible unless you trace what is actually being sent to the browser.

The "skilled hacker" comment came as a result of Ben noticing stuff and looking to see how far the thread went.  With a series of fairly simple steps, a person with bad intent could get everything they really needed simply by monitoring cookie streams, etc.

The data included in what was being sent to the front end included income and other sensitive data that should not really be returned to the browser in a secure system.  There were other issues around when and how data was collected from the screen, what was retained, and what was passed back and forth.
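This kind of check is something any tester can script.  Here is a minimal sketch, not anything Ben showed - the field names and the payload are invented examples of the sort of thing that should never reach the browser:

```python
# Hypothetical sketch: scan a JSON response payload for fields that
# should never be sent to the browser, however deeply nested.
import json

SENSITIVE = {"income", "ssn", "password", "security_answer"}

def leaked_fields(payload: str) -> set:
    """Return sensitive keys present anywhere in a JSON response."""
    def walk(node):
        found = set()
        if isinstance(node, dict):
            for key, value in node.items():
                if key.lower() in SENSITIVE:
                    found.add(key.lower())
                found |= walk(value)
        elif isinstance(node, list):
            for item in node:
                found |= walk(item)
        return found
    return walk(json.loads(payload))

# Invented response body for illustration
resp = '{"name": "Pat", "profile": {"income": 52000, "ssn": "000-00-0000"}}'
print(sorted(leaked_fields(resp)))  # ['income', 'ssn']
```

Point it at what is actually on the wire (not what is visible on screen) and you are doing a piece of what Ben did - noticing stuff and seeing how far the thread goes.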

Lesson: Technical people - be straight when talking with non-technical bosses.  Be absolutely certain that their understanding of how the system works is correct.

When there is bad news - be very precise.  Don't get involved in turf wars, give people information and be straight.  No matter how BAD the information is, get it out and then figure out how to deal with it.

Ben gives an excellent overview of problems in the site - starting with fundamental design flaws through transaction handling that had clearly not been considered.  The probability of success was low for such a complex system - then they made it worse by not making certain that public officials could give real, accurate information about the state of the software.  Whether they choose to use it is beyond your control.  Be honest.

Ben does note that there are now some improvements - for example the actual ability to send feedback - and the form actually opens.


Next up – Rainmaking for Testers (and managers) with Julie Gardner.  She gave the opening keynote yesterday.  She’s with RedMind in the UK.

Julie talks about “trusted advisers” and “rainmakers” – and the image a “trusted adviser” projects.  The issue is – what makes an adviser trusted?  How do people view you?  Are you the guy from Office Space?  How about a used car salesman?  How about Wormtongue from Lord of the Rings? 

Each of these was trusted, to some level, by some people.  They were not trusted by others.  This depended on several things, including the way that messages are delivered. 

It is important to be straightforward.  It is also important to frame bad news in a clear way – not overstating the problems, not understating the problems (note Ben’s keynote) – but give a concise evaluation.

Be willing to help, even if it is hard and outside of your normal realm, role and behavior. 

AND! She used one of my favorite words!  “Your manager does not want you to be a sycophant.” 

Don’t be a “yes man” or a “suck up” – be straight with them.

(Pete Comment: OK – slippery ground here –)
What does “development” produce – CODE!

What does “testing” produce – (varied things shouted) Information – Documentation – Bugs

In general, too much detail will produce information overload, which will distract people from the message you want to deliver.  Be succinct.

Trust entails risk.  To be trusted is not a right – it takes time to develop trust over the questions at hand.  Speaking clearly and making sure that “the proof of the pudding is in the taste” – Make sure you deliver on the words you choose to use – the things you say mean things.  Make sure that when you say something – you can actually deliver.  People are taking on a risk by trusting others. 

What gets in the way of trust?  Bad manners – dishonesty – exaggeration – arrogance – empty promises – on and on and on…    Don’t do that…

Building strong relationships requires you to ask questions and then listen to the answers – carefully.
Concentrate – listen (you have 2 ears & 1 mouth – you also have 2 eyes – listen with your eyes as well.) 

When you need help – ask.  You can’t be perfect at everything.  You can empower someone else by recognizing them.

Give advice effectively – Be prepared (as much as you can).  Advice is rarely a perfectly logical process – so be straight, don’t hide bad news and don’t forget the good news.  Don’t bury the bad news, be balanced when you can.

Know your audience – different people need different degrees of information.  Don’t give them the control panel for a jumbo jet, when the manager needs the information on the car dashboard. 

Now she’s talking about predictions- Number of tests you have to run, the number you expect to run by X time frame.  Then there is the question of predicting bugs… (Pete comment: can you really predict bugs? How?)  Possibly in some instances the information found in testing over time may help. (Julie refers to this as “defect measurement rate” which was developed by a colleague of hers.)
(Pete Comment – I know people have different views – I am not convinced that predictions of bugs to be detected is terribly helpful.)

OK – Moving on – Developing a helpful mindset is important.  Don’t place blame, don’t slag someone for not being perfect (you aren’t either.) 

Julie had a survey – “What do senior managers look for in a test manager?”

Top 10 common responses:
10 - Pragmatic
9 - Understand Testing
8 - Identify and Manage Risk
7 - Cooperation and Team Players
6 - Understand the issues and the politics around them
5 - Trend Analysis and forecasting
4 - Flexibility
3 - Able to communicate at all levels
2 - Honesty & consistency
1 – Understand the Business

To move from “trusted adviser” to rainmaker – consider:
“People do business with people because they choose to, not because they have to.  We can always find others doing the same thing or selling the same product.  It’s the personal connection that makes that happen.”

Even when marketing yourself – retain your integrity. Don’t “sell out.”

Testers can be the “Jiminy Cricket” for the team, project, organization, and act as the conscience.  We can be honest with people and give straight information – and ask if we are doing the right thing.  We must have the courage to do that – and speak truth, even when people may not want to hear it…
Adding the elements of “trusted advisor” with credibility, you get integrity.

Annnd – power is gone – it was a very good session…
Hallway (well, coffee line) conversation with a couple of people on testing, exploratory testing, applying ET in their environment and leadership.  Wonderful!
LUNCH!  Be back later with updates from James Christie's presentation...
Ummm- one thing - we don't have wifi in most of the session rooms - and limited power sources.  So, updates come as I can post them -

We’re BACK! 

After a lovely lunch table conversation with many people, including Griffin Jones, on Exploratory Testing, I’m now in James Christie’s session on “The Unfortunate Triumph of Process over Purpose.”  

James is best known, lately, for his opposition to the new ISO 29119 standard on software testing. I rather find this unfortunate – not his opposition, but that most people see him as a controversial figure.  I can say, after many conversations, James is an excellent thinker, writer and has very good ideas.  I have never heard him present and I am looking forward to this.

His talk is based on his career as a (process) auditor and test manager, with examples from specific projects he worked on.  One example, the first, was a project done for a division of the British Government.  This project was politically sensitive at the time, and included significant expectations around accessibility (for the visually impaired in particular.)

In a sense, this was normal.  The only problem is that when you have a risk-averse organization the likelihood of failure tends to go up.  Instead of “Keep Calm and Carry On” the motto tended to be “Keep Calm and Ignore Reality.” 

ANNNNNNNNd… James refers to non-existent documentation – 70 pages of stuff that made no sense.  James’ role was to document and ensure that the items worked as intended.  Thus he made lovely documents, but much was of little value. 

Test plans were irrelevant because the documented requirements and the software developed had little to do with each other.  There was no rejection of any form of Agile methodologies; they simply did things the way they always had.  (James is speaking really, really fast – banging points out – it is hard to keep up.)

The project was declared a success, even though they had just barely delivered something that sort of worked. 

By comparison, the crash of banks in 2008 resulted in several findings – the Parliament report on the failure of Royal Bank of Scotland had a dramatic point.  (See photo) It is an excellent summary of what happens when people worry about what the steps of the process are, and forget about what happens around the reason the processes exist. 

Thorstein Veblen described what he called “trained incapacity” (1914), where people are given such detailed, prescriptive methods that when the situation changes, they cannot adapt.  People are trained not to be flexible, and the result was people who were easily and often bypassed because they had been locked into one and only one way of working.

Isabel Menzies Lyth (1959) described problems resulting from an organization’s structure and processes: task, technology and the social/psychological needs of the people.  Lyth was looking at nurses, and at the growing view of them as fungible, interchangeable resources. 

The problem was that these changes were not helping the people who needed to do the work, they were designed to help managers manage those people. (They also led to entrenched, fixed thinking similar to the Maginot Line built by the French in the 1930's against a German invasion.)

David Wastell demonstrated that structured methodologies did not actually advance the job, but they served as social defenses for the managers.  They showed no value in and of themselves.

James slides into the ideas around “Rules versus Principles,” citing FD Roosevelt: “Rules are not necessarily sacred, principles are.” Principles are what we use as fixed points to hold ourselves to – we use them to measure ourselves against our own values. 

Brenda Zimmerman describes the difference between complicated and complex.  Consider the space program, compared to raising children.  We can follow the same steps and get people to the moon and back.  We can follow the same steps with each child and, somehow, come up with completely different “results.”

Cynefin – Welsh word meaning “whole man” – from Dave Snowden.  This is a framework to help people make decisions and choose a response.   Hence, “complex” situations are less amenable to repeatable processes (best practices) because, like raising children, they are difficult to master and predicting outcomes is pretty much impossible.

Snowden’s point is that there is no real clarity to what is needed. 

The center of the diagram is disorder.  This is where we normally are.  Most people will try and find a way to migrate to a situation that is easier to understand, so we tend to impose those values from “Obvious.”

The boundary between Chaotic and Obvious is a mess.  You can’t really balance between them.  People who find themselves on that boundary are, as Snowden describes, in an “Oblivious” state – they think they are in Obvious, but are not – and will tumble into Chaotic. 

Most of the time, software projects are bouncing between Complex and Complicated.  This leads us from believing that “best practices” will save us – until we realize we are actually dealing with Chaos.

When you deliver software in spite of the process, you are subverting the process, not adhering to it.   
It is the crash into reality that ends the delusion.

James cites Weinberg’s “Second Law of Consulting” – “No matter how it looks at first, it’s always a people problem.”

(Pete note – this is around the 20th time in the last 4 days that a different Weinberg reference was made – all different quotes – and just the ones I have heard, or made myself.  Thank you, Mr. Weinberg, for being so inspirational.)

“Any approach to testing that ignores “people problems” and tries to tame human nature with rules, standards and rigid processes is doomed”

Thus, any attempt to force people into a box will fail.  Full stop.

When you are terrified of something, unless you focus on the people, you are almost certainly going to make your worst nightmare come true.

Questions – the first is around how one responds to people who say things opposing all standards and processes. 

James’ response is that people don’t understand.  He also clarifies the difference between standards and conventions.  These are similar – but not the same.   (Pete comment: and this is a simple concept many people fail to realize.)

Well done, and well said, James. 
Right - after a brief break and more hallway conversations, I'm in the room for Paco Hope's keynote "Softwarts: Security Testing for Muggles," which he refers to as a course in "Defence Against the Dark Arts" - against black hats, in this case.  Paco has been walking about the conference off and on the last 2 days wearing... I think they are "sorcerer robes?"  Except given the title of the presentation, I'd expect they'd be more "Harry Potter" style than "Sorcerer's Apprentice" style.  Ah - well - it's all good.

We begin with prizes from the test lab.

And Paco is off to a running start... well - maybe not running.  He starts by admitting that he is "mixing genres" which is kind of appropriate for security testing.  The thing is, it is hard to identify who the good guys and bad guys are - without some work.  They don't really wear black hoodies when they are doing their thing (to the best of our knowledge.)

Generally, security defects are not that different from other defects.  They CAN be different, but often they land in the realm of unintended consequences.  "We meant to do X (which is good) except Y also happened (which is bad.)"

The thing is, just because the security defects appear to be different does not mean that they automatically are.

Similarly - there is a myth that "security testers" are inherently different.  They do stuff that no one else can understand, let alone do.  It's like Tom Cruise in Mission Impossible, hanging in the middle of the room to get to the keyboard.  Except he's probably a "functional" tester that needs to climb into this silly rig to trigger some branch of the code.

There are things functional testers deal with that security testers do not need to (like "code coverage".)  There is an idea that security guys have this mystique - this wand or something.

Except testers actually have test inputs & harnesses.  They have user stories, use cases and documented requirements (which serve as the magic map of the kingdom.)  And there are logs and profile info that can act to support you.

And he relates a story of how he found a hole in a system dealing with interest rates & large purchases (financial stuff.)  He found a hole in the system where he could change the rate (increase it) in the confirmation message.  He thought this was a big deal - apparently it was not to them (of course, they never checked to see if...)

Four Principles -
1 - Orcs not Elves
An Orc is a dumb brute with a primitive weapon - you can point them somewhere and something dies.  One orc is not a problem - 50,000 orcs could be.  Think - bot-nets, which can be launched in denial-of-service attacks or other malicious manners (and if you know where and how, you can rent time on one of these without actually needing to build it yourself.)

Offline attacks are a powerful instance of this, allowing an attacker to brute force a system - sure, it's hard.  BUT - it can be done.  Think of sniffers capturing encrypted credentials; decrypt them offline (brute force) and walk right into the account.

Orcs can be applied to our own intents.  Like, write a program to create input data; check it works; then run an attack to see what happens.  Can you withstand the demo attack you created?
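Here is a minimal sketch of the orc idea - mine, not Paco's - showing why an unsalted, fast hash falls to a dumb-but-numerous offline attack.  The PIN and the scenario are invented for illustration:

```python
# Hypothetical sketch: brute is dumb, but numerous.  An offline
# attacker with a captured, unsalted hash can simply try everything.
import hashlib
from itertools import product

def hash_pin(pin):
    # Unsalted, fast hash - exactly what makes offline attacks cheap.
    return hashlib.sha256(pin.encode()).hexdigest()

def brute_force(target_hash):
    """Try every 4-digit PIN until one matches the captured hash."""
    for digits in product("0123456789", repeat=4):
        candidate = "".join(digits)
        if hash_pin(candidate) == target_hash:
            return candidate
    return None

captured = hash_pin("4821")   # pretend this was sniffed off the wire
print(brute_force(captured))  # recovers "4821" in at most 10,000 tries
```

Ten thousand hashes is nothing to a modern machine - which is the whole point of the orc metaphor, and why salting and slow hashes (and your own demo attacks against your own system) matter.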

2 - No Gold Required
You don't need to take your bag of gold pieces somewhere to buy magic devices.

There is stuff like OWASP with tools and resources available - free.  There is CVSS (common vulnerability scoring system) to help score security risks.  There is Kali Linux - which is pre-built and bootable.

These tools are pretty useful - people will help you - and it does not really need that much money to make happen (many are free, or can have help obtained for the cost of beer.)

You can get this stuff by doing a little bit of work.  Sometimes you can get information from various sources - like the experts themselves - via Twitter, mailing lists, things of that nature.

3 - Reverse Alchemy 
Instead of taking ordinary stuff and turning it into gold, we are going to take gold and turn it into ... crap.

HTTP proxies are powerful tools we can use - to do our thing - by watching what happens when information passes through it.  Pretty much everything runs through proxies - why not set up one of our own?

Secure connections? Well, if you're running your own proxy then you can see what is happening.  Cool, no?

This allows us to monitor, intercept then rewrite traffic in your own proxy.  You can capture stuff and tweak it.  The data you change is now what you want to exercise against, or through, your system. 
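A tiny sketch of what "capture and tweak" can look like once a request body is in hand.  The account, rate, and amount fields here are invented for illustration - nothing from any real system:

```python
# Hypothetical sketch of "reverse alchemy": take a captured form body
# (gold) and rewrite one field into something hostile before replaying
# it through your own proxy.
from urllib.parse import parse_qs, urlencode

def tamper(body, field, new_value):
    """Rewrite one form field in an intercepted POST body."""
    params = parse_qs(body, keep_blank_values=True)
    params[field] = [new_value]
    # Flatten back to the wire format the server expects.
    return urlencode(params, doseq=True)

# Invented captured request body
captured = "account=12345&rate=3.75&amount=10000"
evil = tamper(captured, "rate", "0.01")
print(evil)  # account=12345&rate=0.01&amount=10000
```

The interesting question is never the tampering itself - it is whether the server re-validates the rewritten field, or trusts whatever the browser (or proxy) sent back.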

4 - Use a Spell Book
You don't need to memorize everything - have a spell book to help you recall the right command for fireball (or whatever.)

Keep a reference handy for what you need to do - make things happen by retaining links, setup information, whatever.  Spells -> Commands; Animating the Dead -> Simulations, and so forth.  The specific things you need to do are the contents of your spellbook.

Equivalence Class partitioning may relate to XSS or SQL injections - these may be "classes" of attacks or hacks.
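To make that parallel concrete, here is a small sketch of treating attack payloads as equivalence classes, the way a functional tester partitions any other input.  The sanitize() function is a stand-in for whatever defence the system under test actually has - it is not a recommended filter, and the payloads are standard illustrative examples:

```python
# Hypothetical sketch: attack payloads grouped into equivalence
# classes, with one or two representatives per class, run against a
# stand-in defence to see which classes get through.
import html

ATTACK_CLASSES = {
    "xss":  ["<script>alert(1)</script>", "<img src=x onerror=alert(1)>"],
    "sqli": ["' OR '1'='1", "1; DROP TABLE users--"],
}

def sanitize(value):
    # Stand-in defence only: HTML-escape, then strip semicolons.
    # A real system would use context-aware encoding and parameterized
    # queries, not string surgery like this.
    return html.escape(value).replace(";", "")

def check_classes():
    """Return (class, payload) pairs the stand-in defence failed on."""
    failures = []
    for cls, payloads in ATTACK_CLASSES.items():
        for payload in payloads:
            out = sanitize(payload)
            if "<" in out or ";" in out:
                failures.append((cls, payload))
    return failures

print(check_classes())  # [] means every representative was neutralised
```

The partitioning is the point, not the filter: if one representative of a class gets through, the whole class is suspect, just as with any other equivalence-class analysis.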

Think about the things that "can't happen" or places where "this could not happen."  It is rare that systems have no vulnerabilities.  E.g., a 3rd-party service provider sending data can (and will) make mistakes.  That is a vulnerability.

Combine those ideas and voila!  You can begin to be a security wizard, too!


Some announcements - and I'm almost out of power.  more updates and pictures will be loaded.
And - pictures loaded - including this one of the ice cream that was available for the afternoon break on Thursday....  Yes, for those who have not been to the Disneyland Hotel, Mickey is impossible to escape or hide from (look at the pattern in the carpet.)


About to head to the airport.  I intend to post a summary/retrospective of my week from there.

Wednesday, October 15, 2014


It is Wednesday morning and I'm sitting in Anaheim, California, at the Disneyland Hotel for the 2014 instance of StarWest.  Like many conferences, there are a couple days of training and tutorials before the conference itself.  Monday and Tuesday were the tutorial days for this conference.

Monday morning I spent working on day-job stuff, that afternoon I spent preparing for the presentation I am giving later today.  Tuesday morning, more day-job stuff in the morning, then met and sat in some conversations with testers around ideas about testing.  (Sorry, Paul, when I tried to get back into your tutorial, the nice lady at the door said I had the wrong code on my name tag.)

Wednesday started with a small but doughty lean-coffee sitting at tables outside - near the coffee shop & Goofy's Kitchen (yes, Goofy is here and stops by to say "Hello.")  A bit more day-job work to accomplish, then off to the keynotes!


Lee Copeland gave opening remarks for the day - and Julie Gardner is launching the first keynote of the conference.

Julie tweets at @cheekytester and is talking on "glueware" - the stuff that puts systems together and helps them stick.

Central to that is the idea of communication - and the difference between information and communication.  AND - my battery is going.  Summary later!

And BACK!  I have power AND a wifi connection!

I rather liked a fair amount of what Julie had to say.  I have issues with some specific points made, and those tend to be things I have issues with that many presenters say and write.  More on that later (maybe.)

In general, she covered the costs of distraction (note - multi-tasking does not work.) She examined what happens when people face various constraints and limitations - and noted that people developing software to be integrated with other systems are operating under the same constraints as others are - conflicting incentives, unclear expectations, unclear approaches to development, uncertainty of use and customer needs and expectations.

These things give us customer/client responses like these:

As it was, I thought she gave a reasonable introduction to problems given the audience at the conference.  What do I mean?

In the last three days I have met many, many people attending a conference on software, testing or technology for the first time.  There have been many of them who are attending their first StarWest conference (as I am.)  There are many who are new to testing.

This is important to me.

People are here to learn, and in some cases, unlearn practices they have some exposure to and will be looking to advance what they do.


Since I've been dealing with other issues, like finding power, I have no notes on the keynote by Rob Galen.  (I could hear him through the open doors, but missed too much to give a concise description.  I hope to address that as soon as I can.)

And now, I must go give my presentation - then lunch! :)

Right.  My presentation came and went – people were there and generally seemed to enjoy it.  Of course, no session is for everyone.  We’ll see what the reviews say.

Lunch was something interesting.  I volunteered to participate in a “Lunch with the Speakers” function – where speakers have lunch, while fielding questions from people sitting at the table with an interest in the topic of the table.  We had an excellent discussion that grew out of questions people had from my session.  I thoroughly enjoyed both.

Ran into Rob Sabourin in the hall after lunch.  I should note that I ended my presentation with “If you liked the session, I’m Pete Walen and this has been ‘Growing into Leadership.’  If you did not like it, my name is Rob Sabourin and this has been ‘Testing Lessons from Sesame Street.’”

So I wandered back into the same room where I had presented to catch Martin Nilsson giving an excellent presentation on being a “social tester.”  He models this idea on the “types of testers” description James Bach wrote some time ago, and uses his own career as an example of how people do things and did things in the past – like walking around with a coffee cup, talking with people.

He reaches out and discusses things – asking questions about things OTHER than projects and problems – like reading the Times of India when he needs to work with organizations with staff in or from India.  By being aware of colleagues’ potential areas of concern, in the handling not just of information but also of interactions, testers can change the dynamics of a project through how they handle interpersonal connections.

Martin then described team behaviors and, again drawing on his own experience, looked at how inserting himself into effectively siloed teams completely changed the dynamics in standups and elsewhere.  Improving communication can sometimes be as simple as sharing a contact point, so that information from developers, project managers, and everyone else is simply shared more openly.

This is an important point. 

If we consider Weinberg’s “All software problems are people problems,” we get a tool for understanding processes and project blocks. 

An example of an informal model: Martin had weekly meetings with the test leads from different teams – informal meetings, like having coffee together in the morning.  He then began inviting other people to join them – development managers, support managers, and others.  The result was that managers began going to the test leads for the projects they were interested in, because the test leads often had a better understanding of what was happening than the project managers did.

This was an excellent insight for me.  I found that extremely valuable and a very good idea.

He mentioned, briefly, a person who kept cookies at his desk – anyone could have one, but he’d say “hello” and speak briefly with them, getting to know each of them a little better.  This way, he had a bridge he could use when he needed information or was looking for insight into something he did not understand.

Kind of like keeping chocolate at my desk.  (I expect the chocolate to be totally gone when I return from StarWest…)

After finishing the above update (there is no wifi access in the session rooms, which makes live blogging a bit challenging for me), I visited the expo and had conversations with people I needed/wanted to chat with.  (Yes, folks at work, I'm bringing you stuff...)

I also had some interesting discussions with tool providers that I need to investigate further.

Now there is a reception and more networking stuff happening.  It has been a good day thus far - and the music is starting up in "Downtown Disney" - although it could be a sound check for the show later tonight...

Note: Most of my pictures of keynote presenters simply were not very good.  Sorry folks.  The people sitting in the middle of the room snapping images got much better pics than I did.