Thursday, August 11, 2011

Again, I had intended to write this last night. It is amazing to me how mentally and physically drained I am by the end of each day at conferences. There are so many smart people that it seems impossible to keep up.
Right, so, people. I had some really nice hallway conversations with Elana Houser, who was in the BBST Foundations course with me. We did not always agree with each other in the course; she is, however, a very good thinker. Lynn McKee, Nancy Kelln, and Selena Delesie had nice chats with me and gave great insights on discussion topics. I also briefly met Karen Johnson - OK folks, she is smart and wise, and the two don't always come in the same package.
Amazing talk(s) with Michael Hunter - yeah, the Braidy Tester guy. He really is as good and inspirational as his blog posts make him seem. Oh, now then, let's see. I had some fantastic chats with Ajay Balamurugadas. Ben Yaroch is crazy smart and a hard worker - really. Michael Larsen really DOES have as much energy as his podcasts make it seem like he does. Let's see. I also had some good visits with Justin Hunter, Paul Holland, Bill Matthews and Johan Jonasson - Phil McNealy is a good person to know as well.
One of the highlights for me was seeing the Emerging Topics track come together and become a reality. Some of the speakers had a bit of a rough go; many had never presented outside their own company before - WHAT a daunting task! Yeah - present a 20-minute idea in front of some of the best testers around. YEAH! Still, everyone made it through the experience, and good information and ideas were shared - even if folks were a little nervous.
I had a chance to drop in on the tail end of the Open Season of the BBST Experience track. Cool Q&A session, lots of energy. The Lightning Talks, which I dropped in on after the BBST session ended, were interesting - quick hits of ideas. Fun.
I ended up having an interesting conversation with Felipe Knorr Kuhn, Gary Masnica, Phil McNealy and Lanette Creamer. Job Titles, Job Roles, What to Do, How things work... highly enjoyable, mentally invigorating. This set me up for a good session in the EdSIG - Education Special Interest Group.
Michael Larsen, me, and some dozen other people talked via Skype with Rebecca Fiedler and Cem Kaner (who could not be at CAST). Good ideas, much meaty discussion - look for another blog post on that before too long.
It was an amazing day.
Oh, I did not get elected to the Board of Directors for AST. Now, some folks tried to console me, but I was inconsolable. Well, technically, there was nothing to console me about! I believe that each of the five candidates was eminently qualified to serve on the board, and three were selected. This is good.
So, this morning, I found myself sitting at a table (starting this blog post, actually) when Michael Hunter sat down to chat and have a little breakfast. Griffin Jones dropped his pack and went for a little breakfast, but got tied up. As it was, Michael and I had a great visit before we headed off to Michael Bolton's workshop on Test Framing. That, too, is another blog post.
Monday, August 9, 2010
CAST Curtain Call - Part 2 - Conversations
I was very fortunate this last week to have had extended conversations with several people, some "movers and shakers" and some "well respected testers" and some "regular folks." Rather than sort out which is which, I'm going to focus on some of the great conversations I had, starting Sunday evening, through Thursday.
The hard part is picking out the bestest ones. So, I'm going to summarize a few of them, along with the mental meanderings that resulted.
Monday, while I was chatting with Griffin Jones, he asked about the mission and charter of the group I work with. We had been talking about techniques and some of the online forum conversations around exploratory/ad-hoc/fully-scripted testing, in light of Michael Bolton's blog entry Testers: Get Out of the QA Business. He asked about this after what I thought was a tangent around the question of "what works best for what kind of environment?"
His simple question got me wondering: other than the slogan about the QA/Testing group on the company's internal wiki, what is it that we are about? For some time, we have been working toward getting more people involved in the idea of tangible requirements, of QA helping define requirements and acting as a bridge in the design process. But that raised the question - what is our mission?
I wonder how many testing groups (or whatever each group calls itself) have a "slogan" but no "mission" or "purpose" statement that can be pointed to, one that everyone knows about. If people don't know about it, is it reasonable to expect them to act towards it? It's a goal, right? How do you achieve a goal if you don't know what it is? (I feel another blog post coming on, but not right now!)
I had several brilliant little chats with Scott Barber. It helps when you're sitting next to each other at the registration table. We talked about a bunch of stuff. For those who have read his articles, or his postings in various online forums for that matter, he really is as smart as he seems - Holy Cow!
We got onto the "mission" of testing groups and "doing things right" vs. "doing things well enough." What most theory-centric folks sometimes forget is that there is a cost to "doing things right." If the product will ship two weeks late because you want to run a four-week system load test costing approximately $1M, what will the company gain? What are the risks? If you're extremely likely to see significant problems within the first 8 to 12 hours, and the likelihood decreases over time, what is that extra two or even three weeks going to get you - other than a delay to delivery and a dissatisfied customer? That, in itself, is one reason why testers should inform and advise but not make the final go/no-go decision.
Yeah - there's another blog post on that in detail.
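As an aside, that diminishing-returns argument is easy to make concrete. Here is a minimal sketch in Python - with entirely invented numbers; the defect count and the half-life of the discovery rate are my assumptions for illustration, not data from Scott or any real project - that models defect discovery during a long load test as an exponentially decaying rate:

import math

# Toy model: the rate of new findings during a long load test decays
# exponentially. All numbers are invented for illustration only.
TOTAL_DEFECTS = 50        # hypothetical findings the test could ever surface
HALF_LIFE_HOURS = 6.0     # assume the discovery rate halves every 6 hours

def expected_found(hours: float) -> float:
    """Expected findings surfaced after `hours` of load testing."""
    decay = math.log(2) / HALF_LIFE_HOURS
    return TOTAL_DEFECTS * (1 - math.exp(-decay * hours))

for label, hours in [("12 hours", 12), ("1 week", 7 * 24), ("4 weeks", 28 * 24)]:
    print(f"{label:>8}: ~{expected_found(hours):.1f} of {TOTAL_DEFECTS} expected findings")

Under these made-up numbers, the first half-day surfaces about three-quarters of everything the test is ever likely to find, and the run is effectively exhausted within the first week - the remaining three weeks buy almost nothing except the late ship date.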
Other people I met included Jeff Fry - where DOES he get all that energy? Then Selena Delesie was holding court on lightning talks in the lobby. WHOA! Crazy-smart and as nice as the day is long. Selena gave two really good presentations - unfortunately, while I read the abstracts and supporting papers, there were not enough of me to get to all the presentations that I wanted to see. I think that's a sign of a fantastic conference - too many good simultaneous presentations.
Other folks I met included Michael Hunter, the Braidy Tester - What a guy, although he's now braidless. Paul Kam from DornerWorks is another really smart guy. DornerWorks was one of the sponsors of the conference. They did a lot to make this happen.
Tuesday night the "Rebel Alliance / CASTAway Social" was a hoot. Tester games and chicken-wings and varied and sundry edibles and drinkables - Thanks to Matt Heusser for making that happen. He's another one who is just crazy-smart and really friendly. If you have not seen his TWIST podcasts, check them out.
After the social, a bunch of folks went to dinner and had a fantastic time. If I recall correctly, there were 15 or 16 of us. I scored a major triumph by having Michael Bolton sit across from me at the end of the table. What an amazing time. Melissa Bugai was sitting with us as we discussed the likely causes of the restaurant's deck lights repeatedly going out. Yes, we tested the theory when Melissa unplugged the rope light running around the railing. The lights all stayed on after that. WHOO-HOO!
The conversation, like any conversation with Michael, took many twists and turns. We talked about language and literacy and music and education and mental discipline and CBC radio shows, and how each touched on testing. What a mind-bendingly enjoyable night.
Wednesday I had the great pleasure of dining with Lynn McKee and Nancy Kelln - and my boss. Best part is, the boss picked up the tab! WHEEEEEEEEEEEEE! Another night of fantastic conversation on testing and wine and great food. Did I mention we talked about testing?
There were so many other great conversations - How can I give a recap of all of them? As it is, there is much to think on.
Wednesday, May 5, 2010
Requirements and Listening
At the QUEST conference in Dallas, there were many presentations, exercises and discussions around testers and requirements. Along with stressing the importance of requirements to project success, a regular theme was getting testers involved early in the project to help get the requirements “right.”
What was not often discussed was how the testers were to actually help get the requirements “right.” The problem, as I see it, is that there is not a clearly defined argument that can explain to me how being a good “tester” automatically makes a person a good “requirements definer.”
There were a couple of points made that people may have missed. One came up in a hallway conversation - unfortunately, I don’t recall who made it. His point was that testers need to do more than simply insist on “testable” requirements. Without being able to bring something to the discussion - without being able to help define the requirements - what purpose does a tester really serve in the discussion?
Nancy Kelln gave a presentation on testing in an Agile environment. It was interesting watching some of the attendees grapple with some of the basic premises found in a variety of Agile methodologies. While talking about stand-ups, she answered the above question very succinctly. She said, in essence, that the role of the tester in an Agile stand-up is to listen.
Simple, no? It’s what all of us are supposed to do anyway, yet we usually find ourselves thinking about other things for at least part of the time.
By listening - by hearing what is being said - the tester can gain insight into some of the reasoning, logic, or problems that are being encountered. A tester who is listening critically, and thinking like a tester, can hear not only what is being said but also what is not being said.
The thing is, most people who do not work in an Agile environment would argue something like, "Well, that's Agile. We don't do Agile." But you don't need to work in an Agile environment to do this. At requirements reviews - or, better yet, requirements gathering and discovery meetings - the same technique can work: listen.
Listen critically; then don't be afraid to ask questions. These questions can sometimes be straightforward. For example: “We’ve talked about regulations changing around Y. Are there any regulations we need to consider for X?”
How many times have you been in a conversation and asked a question because you were looking for insight, and the person you asked it of had an "Ah-HA!" moment because of it? They realized that something was missing and there was an unconsidered possibility or gap.
By asking questions of the experts, the tester can clarify their own thoughts and maybe trigger others to also ask questions. Sometimes, the strength of not knowing things is asking questions and listening carefully to the answers.