My Wednesday begins in Lean Coffee - yeah there was one yesterday, and I missed it completely (yeah, I overslept.) So, here we are, roughly a dozen people sitting around a table. Folks suggest a topic they'd like discussed and everyone votes.
This format is not unique; in fact, it is used fairly often at various conferences and, well, meet-ups.
So far, it is far too fascinating to be able to describe adequately, so I'm not even going to try. It's good stuff.
---
Right, so I'm sitting in the main hall before Jurgen Appelo's keynote this morning. I have a moment to summarize the Lean Coffee discussion. So here we go:
Brilliant.
Yeah, that probably is too short, but you get the idea. Conversation was flying fast and furious around a handful of topics (we got through three, I think - they kept getting extended...) We talked about such topics as the challenges in training testers, training in soft skills and ... hey, wait - two on training? Yeah. And they are related. There are a couple of questions that all of us struggle with - mostly because it is hard. Deep discussion with a lot of good ideas.
And now it is time for Jurgen's keynote.
---
Let's Help Melly: Changing Work Into Life
Right, he starts out with some interesting stats. A recent study concluded that over half of American workers hate their jobs. It seems that this is a global problem, as over half the people in the world hate Americans as well. Wait. No. He was kidding. (I think.) The problem is that most workers in the world hate their jobs. People seem to think this is a new phenomenon. Jurgen suggests that this problem began some time back, like roughly, oh, 3000 BC (he may have said BCE, don't know - there is a guy with a veritable planetoid for a head sitting in front of me and I can't see the screen so well.)
He is going through some common ideas and tossing them aside. Gotta love it when people do that.
Let's see - BPR, TQM, Six Sigma and the like, are extensions of the good old days of, oh, the Pharaohs. There are loads of studies that show they don't really work, but that does not seem to change the attempts to "make things happen."
Partly because organizations tend to be organic - they are forms of communities that are ever changing. Until people look into reinventing SOLUTIONS (not processes), things will never really get better. This is essentially what the Agile "Community" is meant to do (per Jurgen).
He's moved into setting aside Scrum and Kanban - and is discussing Beyond Budgeting and Lean Startup. OK - I'm liking the logical progression I'm seeing... This leads to this, which conceptually leads to this, which... yeah - and now Design Thinking.
The thing is, these are all attempts to correct problems where things are not working. How do we make them work? As labels, the stuff really is not that important. The ideas are important.
How do we make things succeed? Jurgen suggests we run safe-to-fail experiments, steal (collect?) and then tweak ideas, collaborate on models and shorten feedback cycles.
The PROCESS is what we see - Scrum/kanban boards, discussions, people working together and collaborating. The stuff in binders, sitting in the closet - those are MODELS. All that formal, by-the-book stuff does not promise success!
Jurgen is now citing Drucker, and is suggesting that just as "everyone should be involved in testing," so too "everyone should be involved in management" (which, by the way, is similar to the word in Dutch, German and other languages meaning a place to keep horses.)
These ideas are core to his work on Management 3.0 (which is pretty cool by the way.)
For example, he cites a company with a "Kudo Box" where people can drop a note to recognize someone's work, effort, help... something. This can help lead to recognition - and acknowledgement of individuals (yeah - groups are made of individuals - so are teams and companies and... everything.)
Yeah, I can't type as fast as Jurgen is throwing out good ideas - too fast to really do him justice.
People are creative when they are playing - that's where they learn and get ideas - and wow - a highly modified football table (foosball for my former colleagues at ISD). Lasers to detect scores, electronic score keeping, card readers to identify the players (which allows for cool score metrics), video cameras to record goals and play back the BEST goals in slow motion. Yeah - there's a company in Norway that does that. Cool. They also get really bright engineers.
Short feedback cycles are crucial - not just from customers but also from staff. Jurgen now throws out a slide with a "Happiness Board" that is updated DAILY - so folks (like managers) can have a sense of the way folks are feeling RIGHT THEN - THAT DAY.
As opposed to "Yes, we are interested in how our people feel, so we will send out surveys by inter office mail every 3 months." Riiiiiiiiiiiiiiiiiiiiiiiiight.
So, in short - we CAN address problems, and make life and work-life better. But we need to try, not just whinge about it.
---
In Dawn Haynes' presentation on Agile and Documentation... with UNICORNS.
Dawn is opening with a brief story of her professional development progression. "I was a massive test team of ONE... surrounded by lots of hardware. It felt good." Many folks I know have been in that situation. Some do really well. Some folks have a harder time.
It is frightening when Dawn's stories are soooooooooooooooo similar to mine. Oh yeah, this sounds familiar: "When will this be done?" "Done? NEVER!!!!!!!!!"
Well, ok, maybe that is a little harsh. It just struck me as an important point. Most folks need to learn to figure out how much has been done, and generally what still remains, according to some model. Dawn then applied some thinking (way cool great step!) and realized that you need to be able to remember more than one or two things at a time.
Her memory sounds a bit like mine - you can handle two or three things in the "memory stack" and anything more than that scrolls off. So, she began with a simple spreadsheet that contained information on stuff to be tested - in a prioritized order. (Pete Comment: Yeah, this is getting crazy spooky because she's describing some of the stuff I did early on as a tester.)
Her documentation allowed her to remember what she had done and what she still needed to do and sometimes dream up NEW things to try. Now, this sounds good, except she managed to "melt" Unix devices. Yeah, she rocks.
This is what documentation can do - help us remember and explain it to other people.
Now, some Agile pundits advocate NO documentation. Real-world people (not unicorn-land people) advocate "There is no need for documentation" and things of that ilk. Dawn explains, quite clearly and simply, that sometimes you need to be able to demonstrate what you did. Hence, documenting the work that needs to be done.
Mind Maps, simple spreadsheets, and other tools can record what is done - the thought process behind it - and then the results. Hey Folks! There is a lightweight test plan/test result combination. Keep an eye on what REALLY is needed. You know, to communicate and ADD VALUE - that is kind of a big part of what Agile is supposed to be all about, no?
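(Pete aside: to make that concrete, here is a minimal sketch of what such a combined plan/result record might look like. The field names, priorities and sample items are my own invention for illustration - not Dawn's actual spreadsheet.)

```python
# A minimal sketch of a lightweight "test plan + test results" record,
# along the lines of a prioritized spreadsheet. Everything here is
# hypothetical - an illustration of the idea, not anyone's real tool.
from dataclasses import dataclass, field

@dataclass
class TestItem:
    area: str            # what part of the product this covers
    idea: str            # what we intend to try and why (the thought process)
    priority: int        # 1 = look at this first
    status: str = "not started"                 # "not started" / "in progress" / "done"
    notes: list = field(default_factory=list)   # what actually happened

backlog = [
    TestItem("login", "expired passwords and locked accounts", priority=1),
    TestItem("reports", "exports with very large data sets", priority=2),
]

# Work the list in priority order; record results as you go, so the same
# sheet answers both "what did you do?" and "what is still left?"
backlog.sort(key=lambda item: item.priority)
backlog[0].status = "done"
backlog[0].notes.append("locked account gives a blank page instead of an error")

for item in backlog:
    print(f"[P{item.priority}] {item.area}: {item.status} - {item.idea}")
```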
OK - I really liked this explanation.
And now... the questions/answers are getting interesting... Can these techniques be used for more than one person? Yes. Yeah. They can.
Q: How does this work for inexperienced, maybe unskilled testers that don't know the application?
A: Non-skilled, untrained testers? I don't do that!
yup.
---
So I scooted over one room to pick up Simon Morley's (yeah, @YorkyAbroad) take on CI and multiple teams and people. And he's underway NOW.
You get complex systems that are difficult to test which gives us slow feedback, because there is so much and it takes a lot of time and ... yeah. You get the idea.
When stuff gets really hard to deal with, you invite other problems. Simon is pulling in the idea of "Broken Windows" - which was a popular measure in crime prevention circles in the late 1980's and well into the 1990's. Simply put, if a window is broken and not fixed, or at least not patched, then the message is that no one cares about it. This leads to more vandalism, property damage, etc. If this is not addressed, then more violent crime will creep into the area and things quickly spiral downward. So, fix the windows, even in vacant houses, paint over the graffiti, etc., quickly. Deal with problems when they crop up.
In Software, the equivalent is "It was broken when we got it" or "That was broken 2 (or 3 or 10) releases ago." If these things do not get fixed, the message is "No one cares." If no one cares about something, then getting it fixed later, and getting similar things addressed WHEN they happen, will likely get kicked down the road a bit, until it is 10 releases later and obviously no one cared, or it would have been fixed when it cropped up. (Right?)
(Pete Comment: Some folks seemed a little confused over what the Broken Windows thing had to do with software quality in general or testing in particular. Ummm, think it through, please?)
So, when you have many people/teams working on interconnected products - or maybe multiple aspects of the same product, sometimes, folks need a tool. Simon developed a series of spreadsheet-like tools to track progress and effort and results of tests. These summary pages allowed a user to drill down into the individual reports that then gave you more information... yeah. Pretty cool.
Then - by doing this, everyone in the organization could see the state of testing and (important point here) the Delivery Readiness of the product. Thus, integrating information - and changing the overall message - can help direct thinking and get people acting on cues. Sort of like saying "We are not doing Continuous Integration because we are exercising Delivery Readiness artifacts." Yeah, it might be considered semantics, but I like it. It brings to mind that the real key is to get people thinking in advance: "When will we be ready?"
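(Pete aside: as a rough illustration of that summary-plus-drill-down idea - the team names, result format and the crude "readiness" rule below are assumptions of mine, not Simon's actual spreadsheets.)

```python
# A rough sketch of rolling several teams' detailed test results up into
# one "delivery readiness" summary that still lets you drill down.
# All the data and the readiness rule are invented for illustration.
detailed_reports = {
    "team-billing":  [{"test": "invoice rounding", "passed": True},
                      {"test": "refund flow",      "passed": False}],
    "team-platform": [{"test": "failover",         "passed": True},
                      {"test": "upgrade path",     "passed": True}],
}

def summarize(reports):
    """Roll each team's detail up into one line the whole organization can read."""
    summary = {}
    for team, results in reports.items():
        failed = [r["test"] for r in results if not r["passed"]]
        summary[team] = {
            "total": len(results),
            "failed": len(failed),
            "ready": not failed,   # crude "delivery readiness" signal
            "drill_down": failed,  # the detail behind the headline
        }
    return summary

for team, line in summarize(detailed_reports).items():
    flag = "READY" if line["ready"] else "NOT READY"
    print(f"{team}: {flag} ({line['failed']}/{line['total']} failing) {line['drill_down']}")
```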
That is an idea I find lacking in many organizations. It is not that we don't ask that question, but we ask that question well after we should. We ask it too late (in many organizations) to address the stuff that REALLY needs to be considered before any code is written.
Why? Just as a tool (like CI) is not THE solution (although it can help), the "Complex System" is more than just software - it is the PEOPLE that interact and impact the development of the software that make a difference. It is the people who are the solution, not the tools.
(Pete Comment: That last point is something that can bear some thought, and that is probably worthy of more consideration. I find this particularly true at "technical" conferences.)
---
LUNCH TIME!!!!
---
This afternoon, Markus Gartner kicks off his keynote with Unicorns & Rainbows... then shifts to Matrix memes. Cool - of course his theme is Adaptation and Improvisation.
Right - All good rules should not be followed on faith. He's 3 minutes into it and people are going "O Wow."
Using a common theme on mastery, Markus presents the concept of 10,000 hours of deliberate practice to achieve mastery. Reasonable, no? Add to that the self-discovery needed to work through ideas that enable you to write meaningfully, engage in various exercises and ... well, think.
Citing ideas from Weinberg's Becoming a Technical Leader (Pete Comment: excellent book) on how to be successful:
1. Learn from Failure;
2. Learn by Copying other people's successful approaches;
3. Combine ideas in new ways.
Now citing Computer Programming Fundamentals - which Jerry co-authored in 1961:
"If we have mastered a few of these fundamentals along with the habit of curious exploration, we can rediscover special techniques as we need them."
(Pete Comment: OK, re-read that statement then check the date - yeah - 51 years old. Yeah. I'm not sure we have reached that level yet.)
Markus slides into questions on Management and the difference between Managing and Controlling (Pete Comment: and Dictating). The problem, of course, is that some folks have a hard time dealing with this because - well - with Controlling practices we may not actually allow people to learn and master and grow; these controls are self-limiting. The negative feedback loop that ensues will doom growth - personal, professional & organizational. (See Jurgen's keynote.)
Clarity, Trust, Power and the ability to Search/look are all needed to resolve problems and allow the team to grow.
Consider that delegation relies on intrinsic motivation, pride of workmanship (Pete Comment: Yeah, you can be a craftsman, really) and desire to please customers.
Allowing for accountability is the key to allowing teams and organizations to grow. Yeah. That is pretty significant. The approaches to testing reflect the culture of the organization and the world-view of the management.
You MUST have trust among and between the team members. Without this, there is no improvement. Fear of Conflict is the next concern. Conflict CAN be good - it helps us understand and resolve questions around our understanding. Lack of Commitment - not the Scrum thing, but the question around what we do and commitment (professionalism?) to getting the best work done we can. Related to that is Avoidance of Accountability - yeah - the "It's not my fault I did this wrong." This is - well - silly. Finally, the Inattention to Results - the question of "What did you DO?"
In an Agile environment, this kills chances of success. If your team has these things (any of them) going pear-shaped, you get dysfunction.
Consider the idea that Testers need to be able to Code/Program. If we can grow a mixed-skills body, where people have similar skills but in different combinations, we can move from Testers & Programmers to Programming Testers and Testing Programmers.
The difference between them can be negligible - when we are open to it. Communication and mutual support are needed.
In the end, we each must decide whether we will take the Red Pill, and become fully Agile (or agile) or the Blue Pill and stay as we are.
---
After taking a break, chatting with people and getting a bit of writing done, BACK ONLINE - in Henrik Andersson's presentation on Excelling as an Agile Tester. Yeah. Sounds like fun AND WE'RE OFF!!!!!
Henrik is a good guy - Member of AST, has attended/presented at CAST, participated in Problem Solving Leadership and AYE - and presented at a pile of conferences. Smart, outgoing and a good thinker.
In discussing Agile Testing, Henrik tells us that the acronyms tend to carry a fair amount of baggage with them. TDD, BDD, ATDD, and other acronyms tend to fall into the realm of Checking - in the model described by Michael Bolton. In his estimation, the techniques embodied by those terms are actually design principles to facilitate clean, clear, simple design - and inspire confidence. Frankly, it helps programmers know what to do and when to stop coding.
Why take those away from programmers? Why add an extra feedback loop to get a tester in the middle?
Henrik's key point : DON'T TAKE THE VALUE AWAY FROM THE PROGRAMMER.
But don't worry, there is still a need for testers in an Agile environment (Pete Comment: or any environment for that matter.)
Citing the works of Weinberg - A Tester is one who knows that things can be different.
Citing the works of Bach and Bolton - Testing helps us answer the question "Is there a problem here?"
A tester should be driven by questions that haven't been answered, or asked, before.
(Pete Comment - OK: Henrik is currently describing my views really closely, I am hoping there is not some confirmation bias on my part.)
So, what can an Agile Tester do? How do you become EXCELLENT?
Pair: with a Product Owner on the design of Acceptance Tests; with the PO when doing ET sessions; with the PO to understand the customer; with a programmer on checking; with a programmer to understand the program.
(Pete Comment: Yeah, I find nothing to fault in this logic - see my above comment.)
And out comes James Bach's comment - "I'm here to make you look good." And Henrik's corollary: "I'm here to make us look good."
It is a subtle distinction, but an important one. A tester in an Agile environment can cross multiple conceptual walls (Pete's description) across the entire team. By working with the Product Owner as well as the Programmer(s) and the rest of the team to gain understanding, they can also help spread understanding and help manage expectations and guide the team toward success.
Sometimes gently, sometimes, well, kinda like Mr T on the A-Team television show. (Pete Comment: Is that still running somewhere?)
When testers are participating in Sprint planning, they also need to remember to engage in Daily Test planning. Henrik recommends planning these by Sessions (as described by Bach) - 90-minute blocks of uninterrupted, guided/dedicated testing. With Charters to guide each session - guidelines for what functions/portions are intended to be exercised in each session - these charters can be placed on the Scrum board (or Kanban or... whatever) as you plan them. Then also - put your BUGS on the same board.
Don't let them be HIDDEN - particularly in a bug system with limited visibility. Look into recording them accurately - then get them out in front of the world. Other people may notice something or may possibly be able to provide insight into the results. Most important, it lets everyone know what you are finding.
Taking the results/findings of the sessions back to the group helps us understand the state of the product, the state of the project. This can be done at the daily Scrum (in addition to the notes on the board.) Grouping information needs to make sense. The point is to give a nutshell - not the whole tree. Make things clear without bogging things down in detail.
Understand how the User Stories relate to which Functional Areas and the corresponding sessions for each, along with the "Health" of the system.
Generally, Henrik finds that giving information on the Health of the system is of greater value than the typical "this is what I did, this is what I plan to do, these are the problems..." This report may include session burndown charts and the TBS (Test, Bug Investigation and Setup) metric - along with the time spent on learning, performance testing, business-facing work and integration. These pieces of information give a reasonable image of the state of the testing right then. Getting to know the rhythm (Pete: there's that word) of the application and the project is crucial to understanding the application and the speed at which information is being obtained.
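(Pete aside: for the curious, here is a toy version of what a TBS split might look like when pulled out of session notes. The session data, charter names and report format are invented - this is only my reading of the metric Henrik mentioned, not his reporting tool.)

```python
# Toy calculation of a TBS (Test, Bug investigation, Setup) split from
# session notes. All data below is invented - a sketch of the kind of
# "health of the system" summary described in the talk, nothing more.
sessions = [
    # minutes spent on each activity during a 90-minute session
    {"charter": "payment retries", "test": 55, "bug": 25, "setup": 10},
    {"charter": "account imports", "test": 70, "bug": 5,  "setup": 15},
    {"charter": "search filters",  "test": 40, "bug": 35, "setup": 15},
]

totals = {"test": 0, "bug": 0, "setup": 0}
for s in sessions:
    for activity in totals:
        totals[activity] += s[activity]

grand_total = sum(totals.values())
for activity, minutes in totals.items():
    print(f"{activity:>5}: {minutes:4d} min ({100 * minutes / grand_total:.0f}%)")

# A rising "bug" share says something about product health; a rising
# "setup" share says something about the testability of the environment.
```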
Whew.
---
Closing Keynote for the day by Gojko Adzic - Reinventing Software Quality.
Opening Gambit - We are collecting all the wrong data, reporting it in all the wrong ways and wasting time with both, when there is much more reasonable, better data to collect. (Pete Comment: That big CLANG was the armoured gauntlet being thrown down.)
Using his Specification by Example book as an example, Gojko launched into a story of how he found 27 problems that he considered a P1 - as in parts of words could not be read or stuff was mangled or... yeah. Problems. SO... He spent MONTHS going through the book looking for every possible problem and recorded them. Then, he went back to the publisher with his "bugs."
"Don't worry about it."
WHAT? Look at all these bugs! - And they showed him the reviews posted online, etc. All but one gave it full marks and rave reviews. The one negative review had nothing to do with the production quality of the book itself.
Why? Because he was looking at this in a linear way - looking at bugs in PROCESS - not performance in the marketplace. Not customer satisfaction - not whether they liked the product or whether it did what they needed it to. In his estimation, ET, SBT and the TDD, ATDD stuff are all detecting bugs in the process - not bugs in market viability.
Now - one MAY impact the other - but do we have enough information to make that decision? Maybe - Maybe not.
His comparison then moved to driving a car. The lane markers and seat belts help us move the car safely and make sure that we are not injured if something goes wrong. They don't provide any information on whether we are going the right way. Other things may - road signs, for example - and sometimes those are missing or obscured. GPS systems can also help - and generally show your distance from your destination - IF you know where you want to go in the first place.
Another Example - Blood pressure is generally an indicator of human health. Generally, reducing blood pressure, if the blood pressure is too high, may be a good idea. The way you reduce it may or may not help. For example - a little red wine may help. Maybe some medication may help. Chopping off the head will certainly lower blood pressure, but may negatively impact overall health. (Pete Comment: I wish I had dreamed that up.)
We talk about using User Stories and ... stuff to promise "good software." Yeah, well, some user stories are good, some are... rubbish (Pete edited that word.) A good User Story talks about what is going to be different. And how it will be different. Measure the change!
And moving on to the self-actualization table - as he applies it to Software.
Deployable / Functional - Yeah - the stuff does not break. Good. Does it do what we want it to do? That is another question. Is it a little over-engineered - like, being capable of scaling to 1,000,000,000 concurrent users may take more work than is really needed.
Performant / Secure - Yeah. Does it work so it does not spring leaks everywhere? OK, consider Twitter - ummm, Gojko is pretty sure they do some performance testing - how much, who knows. Still, the fail-whale shows up pretty frequently.
Usable - OK, so it is usable and does not blow up.
Useful - Does it actually do anything anyone wants? If not - WHY DID YOU DO IT???
Contribute to Success - The Hoover-Airline example is one to consider. Some years ago, Hoover Vacuum ran a promotion where if you bought certain models of their vacuum, they'd give you airline tickets. Except they made it pretty open-ended - like - intercontinental. And they were giving away airline tickets worth far more than the vacuums they were selling. On top of that, they were getting scads of complaints - and between trying to buy airline tickets and dealing with problems, the UK subsidiary went broke - and they were in court for years.
Another problem - the "success rates" for training. Typically we simply do it wrong and look for the wrong things to give us the information. He cites Brinkerhoff & Gill's The Learning Alliance: Systems Thinking in Human Resource Development. Generally, we fail to look at where the impacts are felt in the company and what the actual results, of training and software, actually are. We are deluding ourselves if we are tracking stuff without knowing what data we need.
So, Gojko provides a new Acronym for this consideration: UNICORN: Understanding Needed Impacts Captures Objectives Relatively Nicely.
The Q/A is interesting and way too fast moving to blog - check out twitter.... ;)
whew - done with engines for the day.
Thanks folks.