Thursday morning - Breakfast & Lean Coffee. Loads of tired people sitting in the Fritze at the Dorint Sanssouci in Potsdam. Good energy though - the coffee helps!
Setting up for today's opening keynote -
NOTE! I will try REALLY HARD to clearly flag my own reactions to assertions made in the presentations I attend. Going as fast as my tired fingers & brain allow... Will clean the blog post up later.
Keynote: David Evans - Visualizing Quality
The product of testing is..... ? Evans launches into "Value" questions - asking what "more and better testing" will actually do for the product.
The product of testing is confidence - in the choices and decisions we have to make in the steps to understand the product - not the product itself.
The testing service is not the judge, and not on trial - it is the expert witness. How credible are we when it comes to providing information about the product? In the end, someone must balance the risks and benefits of implementing the project - launching, as it were.
Evans then launches into the Challenger shuttle disaster (28 Jan 1986). He describes the meeting the night before - the subject of which was "Will the O-Rings Suffer a Catastrophic Failure Due to the Cold Temperatures?" Of course, now, we know the answer was "yes."
Many pages of actual copies of the meeting agenda and technical notes - yeah, these guys really were rocket scientists, so there are loads of technical data here. They launched - the shuttle blew up.
"Failures in communication... resulted in a decision to launch based on incomplete and sometimes misleading information, a conflict between engineering data and management judgements." - William Rogers, who chaired the investigating commission.
Evans - We need to make sure that we are making it as clear as possible what the information we are presenting means and what the likely consequences are to the decisions based on that information.
Consider the McGurk effect - when there is a conflict between what we see and what we hear, the tendency is to rely on what is seen, not heard. Is this what happened with Challenger? Is this what happens with software projects?
Now - the US military budget in 2008 was $607 billion. (Pete: that's a bunch of cash.) However, a single data point conveys not terribly much information. Adding other countries gives more context. And when comparing spending to GDP - the total output of a country - the picture shifts again: while US military spending, in gross terms, equals the next 8 countries' spending combined, as a share of GDP it looks far less exceptional.
BUG COUNTS! Any negative must be expressed in the context of "what are we getting that is positive."
In response, Evans posts the first clearly identifiable infographic - with charts, lines, numbers, etc. This was Charles Minard's graph, made in the 1860s, of Napoleon's 1812 invasion of Russia. The line represents the size (numbers) of the Grande Armée - starting out at full strength as a very wide swath and gradually narrowing over time as the army shrinks through attrition (death).
Consider how we are building/using/exercising testing suites compared to the actual documentation
This is in contrast to the London Tube map - which is fantastic for giving an idea of how to get where in London, yet tells you next to nothing about the actual street layout.
US election maps - red state/blue state - make it look like Obama should not have won - except the land doesn't vote. Adjusting at the STATE level you get something DIFFERENT. When you look at each state by COUNTY, you see something different again - straight "results" versus results adjusted geographically by population density give us a series of interesting views for consideration.
Then there is the Penfield homunculus, where the difference between sensory and motor representation is - remarkable.
All of these boil down to one important thing - a diagram is a REPRESENTATION of a system - NOT THE SYSTEM. There are other considerations around how that same system can be represented FOR A DIFFERENT PURPOSE.
Be careful of what your data points are PERCEIVED to represent.
Information radiators can help us visualize the process. Simple low-tech stuff is still visualization. He suggests representing people on the board AS WELL AS tasks - so, not just what is in flight, but who is working on, or taking point on, that task. (Pete: Good addition.)
Citing James Bach's "low-tech testing dashboard" to express the state of testing - and note, IT IS NOT COMPUTER GENERATED!
* What is the thing I am doing - what do I want to represent (and why)
* Stay focused on the information you wish to present - what message is given by a bar chart vs a "mountain range" of the same data;
* Transitions - if a unit is a unit, how does the reader know what the unit is? Save the x-axis for time - most people presume that is what it is. Order your data so people understand the point YOU WANT TO MAKE - the sequence on the chart need not match the sequence in the app/software dashboard/whatever.
Remember - Testing is Decision-Support - it does not do "quality" - it gives information for people to make decisions on the product.
Track Session - Ajay Balamurugadas - Exploratory Testing in Agile Projects: The Known Secret
Ajay is a testing guru (OK - that was my personal comment) - with interests in many things. He begins by asking if the Agile Manifesto has a bearing on testing in projects in an Agile environment. He then presents a definition of ET from Cem Kaner - slightly different from James Bach's definition - and discusses what this means -
Important point: If your next test is influenced by the learning from your previous test, then you are doing exploratory testing.
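That "next test influenced by the last test" idea can be pictured in code. This is a toy sketch of my own, not anything Ajay showed - the buggy function and the follow-up heuristic are both invented for illustration:

```python
# Toy sketch of the exploratory idea: each new test input is chosen
# based on what the previous test revealed. (Illustrative only.)

def buggy_length(s):
    """Stand-in system under test: miscounts strings containing spaces."""
    return len(s.replace(" ", ""))

def explore(start_inputs, rounds=3):
    """Run tests; when a result looks suspicious, generate follow-up
    inputs that probe the same area more deeply."""
    findings = []
    queue = list(start_inputs)
    for _ in range(rounds):
        next_queue = []
        for s in queue:
            expected, actual = len(s), buggy_length(s)
            if expected != actual:
                findings.append((s, expected, actual))
                # Learning from this test shapes the next tests:
                # vary the suspicious input instead of moving on.
                next_queue.extend([s + " ", " " + s, s * 2])
        queue = next_queue
    return findings

findings = explore(["abc", "a b"])
```

A scripted suite would run the same two inputs every time; here the failure on `"a b"` spawns a cluster of new tests around the weak spot - which is the whole point of the definition above.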
AND HE PROCEEDS TO LAUNCH INTO A MINDMAP based on stuff (words) from the audience. (Pete: Nice - mob mind mapping?)
This process is an interesting exercise in "testing" the session while he is presenting it. By getting people to contribute aspects of ET, based on their understanding, this is drawing people into the conversation. (Pete: photo taken from Ajay's tablet, to be tweeted or otherwise - Hey - LOOK!)
Whatever you do, if the customer is not happy, oh dear.
Problem - much of what is on the picture is "positive" - what about problems?
* Mis-applied/misunderstood how to apply
* Hard to explain to management
* Hard to ensure coverage
* Miss important content (and context)
* Perceived as "monkey testing"
Modelling; Chartering; Generating/Elaborating; Recording
Resourcing; Observing; Refocusing; Reporting
Questioning; Manipulating; Alternating;
Consider that "Exploratory Testing is a mindset using this skillset." (Jon Bach)
The Secret of ET -
Track session: Venkat Moncompu - Transforming Automation Frameworks to Enable Agile Testing: a Case Study
(Pete: I'm Missing points. This guy is speaking really fast - bang, bang, bang - challenging to keep up!)
Agile automation by itself can't bring about faster time to market. Traditional automation (outside an Agile environment) has representative problems: it is (usually) UI-dependent, accrues tech debt, and builds up waste.
Function/behavior attributes are central to ensuring software quality. Challenges in automation include variances in language from tool to tool - and automation is often restricted to "regression."
The question becomes how do we make this stuff work -
Make the features scriptless and self-documenting tests;
Develop scripts before the UI is ready;
Options to run tests across multiple layers;
Intuitive, business-friendly behavioral testing.
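The "develop scripts before the UI is ready" and "multiple layers" points boil down to checking the business rule below the UI. A minimal sketch of my own - the `AccountService` class and its methods are invented, not from the talk:

```python
# Sketch: the same business rule checked at the service layer, below
# the UI, so the test can run before any UI exists. Names invented.

class AccountService:
    """Stand-in for the application's service layer."""
    def __init__(self):
        self._balances = {}

    def deposit(self, account, amount):
        if amount <= 0:
            raise ValueError("deposit must be positive")
        self._balances[account] = self._balances.get(account, 0) + amount

    def balance(self, account):
        return self._balances.get(account, 0)

def test_deposit_at_service_layer():
    # The business check ("deposits accumulate") needs no UI at all.
    svc = AccountService()
    svc.deposit("alice", 100)
    svc.deposit("alice", 50)
    assert svc.balance("alice") == 150

test_deposit_at_service_layer()
```

The same check could later be repeated through the UI layer, but running it here keeps it fast and available long before screens exist.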
Presenting known/reasonably common (Pete: at least to me) multi-layer coverage ideas. (Pete note: some of the slides are really hard to read between the font & background colour; points spoken but not on the slides are fine, but referring to stuff on the slide makes it challenging.)
Flexibility in test automation is needed to reduce/minimize two concerns: 1. Beizer's pesticide paradox, where methods to find bugs will continue finding the same types of bugs; 2. James Bach's minefield analogy - where if you follow the same path, time after time, you clear a single path of mines (bugs) but learn nothing about any others.
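The minefield analogy is easy to simulate. A toy sketch of my own (the 100-input "field" and the numbers are invented): a fixed suite re-covers the same ground on every run, while a varied suite keeps reaching new ground.

```python
import random

# Toy illustration of the minefield analogy: a fixed test path clears
# the same ground every run; a varied path keeps covering new ground.

FIELD = range(100)  # 100 possible inputs ("squares" in the field)

def fixed_path():
    return list(range(0, 100, 10))      # the same 10 inputs every run

def varied_path(rng):
    return rng.sample(list(FIELD), 10)  # 10 fresh inputs each run

rng = random.Random(42)  # seeded so the run is reproducible
fixed_covered, varied_covered = set(), set()
for _ in range(20):                     # 20 test runs of each suite
    fixed_covered.update(fixed_path())
    varied_covered.update(varied_path(rng))

# After 20 runs the fixed suite still covers only its original
# 10 inputs; the varied suite has explored far more of the field.
```

Nothing profound - but it makes the cost of running the identical script forever very concrete.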
Balancing Automation & ET - the challenge is to keep things in touch with each other. (Pete: there seems to be a growing trend that there is a clear dichotomy between automation and ET. I'm not convinced this is still the case w/ more modern tools. Need to think on this.)
Cites "No More Teams" - the act of collaboration is an act of shared creation and/or discovery.
Keeping the balance between behavior and function - and reflecting this in the test scripts may help with making this testing clear and bring the value to the test process. (Pete: I'm not certain the dichotomy between "Agile" and "Traditional" automation is as clear - or valid - as the claims some people make about it.)
Keynote: J B Rainsberger -
BANG BANG BANG - If we're so good at this why aren't we rich yet? (Kent Beck, 2003)
The fact is, we tend to sound zen-monk-like when people ask what Agile is about. Well, it's like... and it's a mindset... and we need to stay focused on... OK. We get it.
Part of the problem is the mindset that improvements will cost something. Some of this is our own doing - being a pain in the butt with our own touchy-feely thing. We argue against restrictive dogmatic rules and land in "we need to change this and focus on the mindset" - followed by papers, books and whatnot that date back to 2005 or 2006.
Etudes - a particular piece of music that is intended to incorporate specific exercises (Pete: in context of music). Why don't we do that with Agile? Practice skills while doing our work?
For years, we argue stuff - and still need to make things work and - somehow - something is missing.
The problem is "They" have no real reason to change - so "They" will work to the Rule - Translated, they'll do the absolute minimum based on the "rules."
Citing "New Strategic Selling" for the reasons why people don't buy. The idea of perceived cost vs perceived value is the crux of the problem. We fail in that manner.
Cites Dale Emery - A person will feel motivated to do something as long as they have a series of elements and see value in what they want to do. Sustaining change is HARD -
The most we can really do is support each other - we know the answers and what needs to be done - so hang in there and help, listen. We can either storm off and never deal with the idiots again. Or - We can pity ourselves. Or...
We can look at things in other ways. Shows a video from MADtv with Bob Newhart as a therapist of some sort - and a "patient" who fears being buried in a box. His solution - STOP IT! In fact, every problem has a solution of STOP IT!
Let us consider what our most "well advertised" popular practices are... stupid.
If you haven't read "Waltzing With Bears" - then do so. There's a chapter/section to the effect of "Agile project management is risk management" - which is kind of part of what ET does. Why? Maybe for the same reason that we have daily standups and still manage to completely miss stuff that gets stated but can't be resolved in a few seconds - it gets lost. Maybe this is what
Cucumber - what most people associate with BDD. Consider this... GAH! THIS JUNK ENDS UP LOOKING LIKE COBOL! BAD! We get so much baggage that stuff gets lost because we're worried about the baggage -
Rule - INVOLVE THE CUSTOMER - DUH! (And Joe says he's been saying that for 15 years.)
DeMarco/Lister's "Lost, but making good time" - excellent point - AND the tree swing cartoon (meh, I'll dig a copy out and post it here.)
RULE - Talk in examples - eg, lost luggage - Which of these bags are closest to your bag? followed by - How is your bag different from the one in the picture? This allows us to get common examples present, and trigger information to get important details that may be missed/overlooked.
One problem with this is we forget to have the conversation - we want to record the conversation, but forget to have it in the first place. (Cites Gojko Adzic's communications book.)
The problem with recording stories on a card and doing the "this is how it's done" thing: cards are fantastic for figuring out how to create stories - yet many years later we have not examined any other way to do things - we are simply shifting the form of the heavily documented requirements. Translated - you should not still be limited to "As a..., I want..., so that..." to describe stories.
This gives us some really painful problems in how to
Promiscuous Pairing and Beginner's Mind: Embrace Inexperience
Awesome paper on beginning/learning to do pairing.
Angela Harms - "It's totally OK for you to suck... that means you can walk around and listen to people criticize your work and tell you it sucks. It means 'that's a stupid name' is OK to hear and to say."
The point of a RETROSPECTIVE is to be part of a continuous improvement process. Usually that part gets ignored. The reason it gets ignored is lack of trust. The thing is, trust is the key to things working - and it comes from the willingness to open yourself up to vulnerability.
Consider James Shore's paper "Continuous Integration Is an Attitude, Not a Tool." CI is NOT a BUILD!
When we impose Scrum on the world without them understanding WHY and THE POINT - we get it relabeled as stage/stop-gate stuff.
Part of the problem is EGO - WE DON'T LIKE TO FEEL ANYTHING OTHER THAN AWESOME!
AND - since he's running out of time, Joe is BLOWING through the remaining handful of slides. (Pete: he's got loads of visuals - I hope the slide deck becomes available because there is no way I am capturing more than 1% of the balance.)
One thing - consider throwing test ideas up on the wall and playing the "This is what I hate about this test idea" game. See what happens.
AND WE'RE DONE!!!!!!!!!!!!
Consensus Talks -
OK -Lost connection to my blog's hosting server and THAT is a problem.
Pete note: I am finally reconnected. I got in late to a very nice woman's presentation - I met her in line for lunch and wanted to hear what she had to say - except I've forgotten her name and got nothing recorded except one really, really important point she made.
Monika Januszek - Automation - a Human Thing (it's for people)
Tools, processes, approaches, process models, STUFF that we want people to do differently or change - including automation - in order to be adopted and accepted - and then used - address ONE BIG QUESTION - "What's in it for me?"
If we can't address that - INCLUDING FOR AUTOMATION MODELS - don't expect to see any meaningful change.
Next up - Lindsey Prewer -
Pete note: So busy getting connected and following up on the last talk - now trying to catch up here!
Here we go. You can't fix problems by throwing people into the fray. When hiring runs at several people a month, and the goal is to make things happen cleanly, THEN there is a cycle of hire/train/start testing/expand training.
Because they cannot simply bring more people in, smart automation approaches were needed -
Start with the values and goals
Make hiring scalable
Have clear priorities
(Pete: at least 1 more I missed)
Mind the points in the Agile Manifesto - Particularly the FIRST one - the idea of PEOPLE.
Without people - you are doomed.
Agile Performance Testing
Started with the difference between stress and load testing. (Pete: OK - I get that.)
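The load vs stress distinction can be made concrete with a toy in-process simulation - my own sketch, not from the talk, and the "service," its capacity, and the numbers are all invented:

```python
# Toy distinction between load and stress testing. The "service" is a
# stand-in with an invented capacity limit; no real traffic involved.

CAPACITY = 500  # requests/sec the toy service can handle

def service_ok(requests_per_sec):
    """The service stays healthy up to its capacity."""
    return requests_per_sec <= CAPACITY

def load_test(expected_rate, duration_secs):
    """Load test: hold the expected production rate for a while and
    confirm the service stays healthy throughout."""
    return all(service_ok(expected_rate) for _ in range(duration_secs))

def stress_test(start_rate, step):
    """Stress test: ramp traffic until the service breaks, and report
    the last rate that still succeeded (the breaking point)."""
    rate = start_rate
    while service_ok(rate):
        rate += step
    return rate - step

passed = load_test(expected_rate=200, duration_secs=60)
breaking_point = stress_test(start_rate=100, step=50)
```

Load testing answers "does it hold up at the rate we expect?"; stress testing answers "where does it fall over?" - two different questions, often conflated.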
Made significant point that not everything needs or should be automated (Pete: Almost an after thought? possibly the result of trying to fit too much in from a much larger presentation.)
Anahit Asatryan -
SAFe - Scaled Agile Framework
Tool - AtTask -
Pete: And I finally have connection to my blog again. (hmmm)
Lars Sjödahl - Communication
Loads of stuff that is really good - so here's a summary (lost connection thru most of the presentation)
Realize that positive questions may impact results - e.g., Would you like coffee? vs Would anyone like coffee? vs I'm getting coffee can I get some for anyone else?
Silence may imply false acceptance - OR the presence of groupthink - encourage full discussion.
Major fail - 1991 - Christmas cards from a company called Locum.
Except someone wanted to change the o in the name to a heart - reasons unknown.
And no one said anything - and they became famous on the interwebs as a result.
Gojko's Hierarchy of Software Quality
Verify assumptions - except first you must learn/find/discover the assumptions.
And the Tree Swing cartoon shows up AGAIN!
Without getting inside the heads of the customers, you may not know the real expectations/requirements.
"You have to start with the customer experience and work backward to technology." - Steve Jobs
If you have multiple paths, with a channel each, consider cross channel testing - (Pete - but don't cross the streams - Ghostbusters)
Consider user based analytics - who/what is your customer - then you can begin to derive possible paths.
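Deriving paths from analytics can be as simple as counting which navigation sequences real users actually take. A minimal sketch - the session log below is invented sample data, not anything from the talk:

```python
from collections import Counter

# Sketch of the analytics idea: count which navigation paths real
# users take, so testing effort can follow the most-travelled ones.
# The event log below is invented sample data.

sessions = [
    ["home", "search", "product", "checkout"],
    ["home", "search", "product"],
    ["home", "search", "product", "checkout"],
    ["home", "account"],
]

# Count each complete path (tuples so they can be dictionary keys).
path_counts = Counter(tuple(s) for s in sessions)
most_common_path, hits = path_counts.most_common(1)[0]
```

With real analytics data the same counting approach (perhaps over n-gram slices of longer sessions) ranks the paths worth covering first - your customers tell you where to test.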
Pete: So now that I can get to the blog - yup, lost connection AGAIN. Trying to resume for the final keynote of the whole show - bare-bones notes on that, as the connection is sporadic.
Lisa Crispin and Janet Gregory - On Bridging gaps.
After a tremendous skit bringing testers and developers together with customers - which included a tremendous narration performance by a tester who happens to live in Michigan (ahem) - they launch into a series of experience reports around false communication models, restrictive models within the organization, and the lack of any recognition of contractual understanding.
Simply put - without Trust, this stuff does not happen.
They do a fine job of referring to keynotes and selected track talks that have been presented during the week.
-- Lost connection again - and its back --
Dividing work into small pieces, baby-steps, is one way of influencing work and making things happen. It makes it a bit more palatable. It makes it easier to work on small pieces. It makes it easier to simply do stuff.
And there is a new book coming out from the Agile Testing Wonder Twins.
This wraps up the conference. I have more notes coming from Day 0 - the Tutorial Day. We have some interesting notes to finish...
AND - The conference organizers announced that next year's conference (November, 2014) will have a kindergarten to mind the kids of attendees. (Cool idea.)
Good night -
Finished with Engines.