I've written on this idea before. Here, in fact. Many other people have written passionately about it as well. Fresh from presenting at STPCon Fall 2011 in Dallas, and now sorting my notes and reviewing my presentation for TesTrek 2011 (http://www.qaitestrek.org/2011/) in Toronto in a couple of weeks, I wanted to take a moment and beat this drum one more time.
When you are at a conference, CONFER with people. Talk with them. Ask questions. Answer questions. Express opinions. Be open to learning. If you disagree with someone, let them know politely - and say why. Maybe you are closer than you realize and are simply stating the same thing in different ways.
One really important point.
When the "official" sessions wind down and the "official" "networking opportunities" wrap up - look around for people from the conference who are just hanging out. Ask if you can join them. Ask what they do, where they do it, what they like about it. You may well pick up really valuable ideas you can take back to the boss.
If you see a group of people from the conference sitting in the hotel bar/lounge/whatever, a quick scan will give you some idea of the conversation(s) going on. If it is vaguely related to software and/or testing, ASK IF YOU CAN JOIN THEM!
I know from my own experience that if I have ANY energy left and no absolutely pressing duties elsewhere, I like to talk with other test professionals and learn. Yeah. I learn a lot just from talking with people. This last conference, I had some fantastic conversations with Doug Hoffman, Fiona Charles, Tony Bruce, Scott Barber, Dawn Haynes, Lanette Creamer, Catherine Powell, Robert Walsh, Dani Almog... the list goes on - those are just the folks who popped into my mind immediately. Testing heavyweights all - and I gained insight, if not actionable information, from each conversation.
So, I invite any TesTrek Symposium attendee: if you see me sitting in a chair in the hallway surfing the web, or in the conference center lounge, please feel free to join me. Really. I like meeting people and sharing ideas, experiences and viewpoints.
I'm there to learn, too. Please help me learn.
Sunday, October 30, 2011
Saturday, October 29, 2011
STPCon Fall 2011 - Part IV
Thursday was perhaps the most relaxing day I had that week in Dallas. I enjoyed a leisurely breakfast with a large number of testers and speakers at the conference - it is easy to relax once all of your speaking commitments have been fulfilled.
Some were still not quite done, however. There was a series of talks called the "Rapid Fire Challenge" - a key idea packed into a five-minute presentation. Dawn Haynes gave an interesting presentation on tester personalities. Lanette Creamer gave a fun presentation on "Tester Tricks" (where she listed Adam Goucher as her "favorite tool"). Mark Tomlinson talked about the risks and costs of false positives in test automation. Scott Barber gave a cool breakdown of ideas for determining what to test and what not to test. He called it FIBLOTS. Cool.
Then, Fiona Charles delivered a stunning keynote on the question of managing Testing or the Testing Process. Ummm - wow. I was tweeting comments from that as fast as I could. It was good. It was really good.
From there, I went to Doug Hoffman's presentation on computer assisted exploratory testing. Overall, I enjoyed it and got some ideas I need to consider. The huge drawback was that there was simply too much information to squeeze into 1 hour and 15 minutes. It would take at least a full day to get a good survey of the ideas - and a couple of days would be better.
From there, I ran into James Pulley - a fellow SQAForums Moderator. We had never met in person and this was a great opportunity.
From there, I spent some time getting my notes in order, sorted out and "filed" so I could make use of them later. The rest of my time there (not much, by that point) I spent chatting with people and having a light lunch with Fiona Charles, Matt Heusser and Yvette Francino. After that, Matt, Fiona and I headed to the airport for our respective flights out.
All in all, this was a remarkable week.
Thursday, October 27, 2011
STPCon Fall 2011 - Part III
Wednesday at STPCon Fall was an interesting day. In the morning I had signed up to participate in the "Speed Geeking Breakfast Bytes" - 8 minute mini-presentations on a topic we wanted to be sure that people could head home with at the end of the conference. All the presenters were in a biiiiiiig room giving their presentations to a table of people all at the same time. We gave our presentations three times before the morning keynote for the day.
The topic I presented was "Integration Testing Lessons from Pulp Fiction." Yeah, kind of a movie theme for me this year - Harry Potter on Tuesday and Pulp Fiction on Wednesday. Fun! The first run-through for me was a bit rough - actually, I did not finish before time was called. There were a couple of interruptions and, frankly, I probably needed another cup of coffee before launching into it. Sorry, folks. The second and third run-throughs went pretty well and everyone had fun. One participant in the third session was giving quotes from the movie at appropriate times!
The rest of the day was dedicated to simply going to sessions and hanging out with people I wanted to talk with. What a fantastic way to spend a conference - not preparing for a presentation or answering questions about one, but simply going to presentations and sitting in the back row. Cool.
I went to Dani Almog's presentation on Automated Test Oracles. There have been presentations before on a similar topic - what made his interesting was how he developed the oracles: "neural networks" developed from the data identified as correct or incorrect. Cool stuff. It requires a huge amount of rigor and control, not to mention structure, but it looked interesting to me how he went about building it.
The next session I went to was by Karen Johnson on Discipline in Testing. Ironically, I got there late. Karen spoke to a really full room on how to keep motivated and moving forward. There were a lot of good suggestions - and she explained how she made use of each, from time-boxing to a form of the "Pomodoro Technique" to setting small rewards, e.g., "Finish this, then go get a nice cup of something."
One important suggestion was to simply change location - literally. Go for a walk. Do something ELSE. Go somewhere else - a coffee shop, or a conference room where you can close the door. Forward the phone to voice mail. Find a park bench (or comparable) and try to clear your head so you can think better. What I thought was cool was how many people attending the session shared their own ideas on what they try to do. It was really a fun session.
This brought us to lunch and the Lunch Keynote by Matt Heusser. Matt's topic was "How to Reduce the Cost of Testing on Monday." Meaning, things you can start with when you get back to the office to be able to focus on testing - not time reporting, not attending meetings, not preparing project status reports and status reports on the status reports and status reports on the status reports on the... yeah, you get the idea.
He talked about taking steps to open up communication - to help people be able to work more effectively and spend more time and energy focused on testing - so they are really testing, not sort-of-testing. It was really interesting.
After this, I headed up to listen to Lanette Creamer present on pairing programmers and non-programmers. Now, she did not mean the "Pair Programming" some of the XP (and other) folks mean. Instead, she meant spending some time working together to get things sorted out - in planning, designing or executing tests - or talking about the application - or... yeah. A cool idea I like to call "Communication." I know it's kind of a weird concept, but it seems to have some potential.
Following this, I caught up with Catherine Powell, a crazy-smart tester, Matt Heusser (who had come down to Earth after the success of his keynote) and a handful of other folks for a quiet chat and a little relaxation. Wonderful people.
We then headed off to the "Open Jam Sessions" - a bit of fun before dinner. Folks split up into groups to play a variety of games and light exercises, and generally had a good time. Lots of fun.
More conversations with a variety of people over dinner then a little writing wrapped up my last full day in Dallas for STPCon - BUT - there was still Thursday to look forward to.
STPCon Fall 2011 - Part II
Tuesday at STPCon in Dallas was an astounding day. Matt Heusser and I were slated to present a session entitled "On Complete Testing" immediately following Rex Black's morning keynote address. As I wanted to prepare the room for our presentation and make sure all our potential examples were queued up, I admit that I ducked out early.
Our Tuesday morning session started with a simple question to the participants: "When the boss comes in and says 'We need this completely tested' or 'We need this to be bug free,' what is really meant? What is complete testing?" We got an answer we expected - "That we have complete coverage in our testing." OK. Coverage of what? "Requirements."
We began discussing that idea, which drew out things like validation metrics, boundaries and equivalences defined within the requirements, and what can be done about undocumented requirements, assumptions and presumptions, expectations that were not communicated, and other problems.
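As a quick illustration of the boundary-and-equivalence idea (a toy example of my own, not one from the session), a rule like "ages 18 through 65 are eligible" partitions the input into three classes, and the interesting tests cluster at the edges of each:

```python
def is_eligible(age):
    """Toy eligibility rule: ages 18 through 65, inclusive."""
    return 18 <= age <= 65

# Equivalence classes: below range, in range, above range.
# Boundary values sit at the edges of each class, where
# off-by-one mistakes (e.g. '<' instead of '<=') actually live.
for age, expected in [(17, False), (18, True), (19, True),
                      (64, True), (65, True), (66, False)]:
    assert is_eligible(age) == expected
```

A fuzzy boundary in the requirement itself - was 65 meant to be included or not? - is exactly the kind of undocumented assumption the discussion kept surfacing.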
We moved on to state diagrams - mapping each potential state within the application and how it can be exercised. I pulled out a way cool example from a system I had worked on a few years ago. The basic functions worked really well. However, by sheer accident, a memory leak in the application was found - simply by letting the application idle over the weekend. This proved a good example of the strange problems that can occur outside of our control, and outside any way we might expect issues to arise.
We moved on to the idea of code coverage as a means toward complete coverage. This led us to the differences between statement coverage and branch coverage. This was an interesting discussion on just what the differences could be, and how we could potentially miss paths even if every branch is tested - we may miss combinations of branches. We agreed that "100 percent" coverage of lines or branches still would not give us complete testing.
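To make that concrete, here is a small sketch of my own (the function and numbers are invented, not from the session): two tests that together cover every branch can still miss the one path combination where a bug lives.

```python
def apply_discount(price, is_member, has_coupon):
    """Toy pricing rule with two independent branches."""
    if is_member:
        price = price * 0.9   # members get 10% off
    if has_coupon:
        price = price - 10    # coupon takes 10 off, after the member discount
    return price

# These two tests achieve 100% statement AND branch coverage:
# across the pair, each 'if' is both taken and skipped.
assert apply_discount(100, True, False) == 90
assert apply_discount(100, False, True) == 90

# But the member-AND-coupon path was never exercised. If the
# requirement said the coupon applies BEFORE the member discount
# (which would give 81), this untested combination hides the bug:
assert apply_discount(100, True, True) == 80
```

Full path coverage grows combinatorially with the number of independent branches, which is one reason "100 percent" of any single metric does not mean "complete."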
We did agree that none of these techniques would give us true complete testing in isolation. If we made use of all of these techniques, we would have a greater likelihood of coming close to "complete" testing.
We continued down through input combination coverage, and the ideas of ordering and filtering, and of memory, state and interrupt problems. All in all, we had a rolling discussion, and the hour and 15 minutes flew past for both Matt and me.
We had an absolute blast!
(Note - everyone who was in the session, we promised to send a transcript of the discussion to all who requested it. SO, those who dropped off their business cards or gave us their email address on the "signup list" - I'm still working on that and I'll get it out as quickly as I can. OK?)
The session wrapped up. We headed out and got a cool beverage, then had some really interesting hallway conversations, and headed to lunch.
The afternoon found me in a series of chats and conversations with people and just loving every minute of it. Around 3:30, I reached into my briefcase, pulled out and put on a tie I brought specifically for my upcoming presentation on test leadership.
My afternoon presentation was on Test Leadership Lessons from Harry Potter. Yes, I know, a geeky-nerdy topic, but that is kinda me.
I walked in and got a very nice introduction from Fiona Charles, who introduced me as "a colleague and friend" - which left me gobsmacked (not a good thing just before speaking). Suffice it to say that the idea of technical leadership, and the example of Harry Potter as a reluctant leader - one who is not appointed and does not seek to lead, but finds himself in that role - has much in common with the ideas of technical leadership expressed so well by Gerald Weinberg.
It was a fun session for me, and it seemed to me that the participants also had fun. I encouraged them to write - for themselves, in blogs, newsletters and, well, anywhere. Learn and share what you learn. Experiment and share your results. Be bold and dare greatly - (kinda like what testers are expected to do when testing, right?)
The session wrapped up and we retired to the "Welcome Reception" on the conference center's patio. This was a great evening with nice appetizers, great conversation, meeting people and generally having a great time. I was even interviewed by Yvette Francino for Search Software Quality! How Cool!
My day wrapped up with a nice relaxing evening with another crowd of testers, sharing stories and having a good time.
I cannot imagine a better way I could have celebrated my 50th birthday.
:-)
Wednesday, October 26, 2011
STPCon 2011 Fall - Part 1
It has been an interesting couple of days for me. I flew from GRR (yes, of ParkCalc fame) to Dallas with Matt Heusser. We talked about our joint presentations, our individual presentations and what we hoped to learn and hear while at STPCon.
We landed safely, after talking for FOUR HOURS! Yeah - on the plane to Detroit, waiting at Detroit during the layover, then on the flight to Dallas. I think the people around us were exceptionally glad that the flights were over. We had a fantastic conversation; the woman next to us said, "Wow. I've never heard two guys get so excited about something so boring." I figure she was in marketing... or maybe upper management.
OK, so before I go on, let me just say that the turkey burger was delicious. I also found an undocumented requirement: "Delicious AND well-done ground turkey." That made for a bad night Sunday, after a highly enjoyable conversation with Matt, Fiona Charles, Rich Hand, Abbie Caracostas and a bunch of other people.
Monday morning I had very little energy, my own fault in retrospect, but we still gave a fairly solid presentation. By afternoon I was closer to "up to speed" and could contribute much more. We learned a lot from doing the workshop in front of living, breathing, thinking people and have already begun making changes for the future.
What we presented, and the exercises we conducted, involved a series of testing ideas, problems and scenarios. We began with Matt talking about the idea of "quick attacks" testing. That is, doing some basic hits against an application even if you don't have much information about it. We then applied a series of exercises around that idea.
Then we introduced the idea of working against stated specifications and expectations, and how insight into those changes our approach to testing the same applications, and others. We then began discussing core ideas around bounds and equivalences in data and how they may impact our testing approach. After lunch, we moved on to a variety of topics.
Matt and I knew there would be far more to talk about than we could possibly fit into a single-day workshop. We created a list of potential topics that could be of interest, presented that list to the class, and allowed them to add their own ideas and vote for the topics of interest to them. Each participant was allowed three votes; we sorted by the number of votes and began working our way down the list.
This was a hugely fun exercise for us and resulted in some interesting discussion among all the participants in the session, as well as between Matt and me. What made an impression on some of the students is that we did not always agree. In fact, there were cases where we made a point of showing where we differed, and how our experience and environments - the context in which we worked - shaped some of those views.
As time wound down and we came to the end of the session, we still had several topics we had not addressed. Matt pointed out that in testing, as in the exercise, we may not have time to test everything on "the list." We then have to work on the items of most interest - just as we had selected the topics of most interest to discuss and work through with the students.
Monday night we settled down for a light supper of appetizers and various beverages with a host of intelligent people. The ideas and excellent conversation flowed, although I decided to call it a day and retire fairly early to prepare for Tuesday's sessions and get some rest.
Friday, October 14, 2011
No Black Swans Or Always Expect the Unexpected
So, I expect many, if not most, reading this have heard of Taleb's Black Swan theory. He put it forward for those events that people write off as being so extreme, or so improbable, that "no one" could predict them. Many folks far more learned than I have discussed this many times over - not just around software events, but around disasters both natural and, well, not so natural.
Fact is, I have seen so many things in software testing that other folks would write off as "improbable" or "unrealistic" or simply snort in derision over. There was the developer who once tried to say "No user would ever run this purge process with nothing to purge." Really? Never? They'd always know better because, well, they'd know never to do that? How would they know?
I can also think of the times I walked into the same trap, unwittingly. I learned. I learned to be aware that I cannot anticipate everything. Now, sometimes that seems odd. Then again, when I think about it, I ran into the same kind of problems other folks had - folks for whom a test result, or worse, an actual event in production or in the wider world, came as a complete surprise. That problem was, and sometimes still is, perception.
My way of thinking, approaching problems or questions, is sometimes self-limiting. The fact is, I suspect it is the same for most people. What I believe, or maybe hope, is that my awareness of this can help me work around it and be open to multiple possibilities.
Hmmm - that sounds kinda wishy-washy.
What I mean is that I try to be open to the possibility that I missed something. Usually when I do miss something, it's because of my own perceptions - my way of approaching a question or scenario. Broadly, my frames.
These models of thought can be useful. But if we are not aware of their potential limitations, we will find ourselves in the "No user would ever do X" camp.
Now then. Something REALLY unexpected?
My lady-wife keeps a large garden. We also capture rain water in a couple of large barrels to water that garden. Sometimes, when there is a lot of rain, we will line up some buckets and catch extra water from the run-off of the car-port roof.
An interesting thing happened last week. The lady-wife was trimming some plants. One of the branches had some nice-looking flowers on it. She decided to put it in a rain-filled bucket until she could bring the flowers into the house. She looked in the first bucket in line and saw a small fish.
A FISH!
Just a small little guppy-looking thing. No, we did not put it there. We have no idea how it got there, although we've bounced around some fun theories. Have we come up with a model for how it got there? Sure. Several. We don't know which, if any, is correct.
So a fish in the rain bucket is something I definitely did not expect.
Maybe I should have.
Monday, October 10, 2011
Of Bugs and Weeds or Why Tugging Gently May Reveal More Than Pulling Hard
We have a fair sized garden for where we live. To be fair, my wife has the garden. I'm the laborer who makes some of the bigger chores happen.
One chore that we get to do twice a year (mid to late spring and early to mid fall) is pulling Virginia Creeper vines out of the lilacs and mock oranges, off the fence, and generally out of everywhere we can pull it from. Now, it's a pretty enough plant. However, like most vines, it tends not to "stay put" and grows pretty aggressively.
It can, and will, choke out other plants - it has done so to a couple of ours and to several of the neighbor's much-loved mock orange trees. It looks a bit like poison ivy, which actually works pretty well as a deterrent, keeping unscheduled visits to the garden and yard from youngsters (and oldsters) in the area to a minimum - they don't realize that vine is NOT poison ivy.
In the fall, the leaves turn a striking red - an astoundingly bright color.
Did I mention that it grows REALLY fast?
While I pulled a huge amount from the one area I worked in, I know there is more there. I focused on the big mature stuff that would be sending out more runners in the spring. Any smaller vines I came across I also pulled but I did not go looking for them. (Kind of like looking for defects that really impact a system vs those that should be fixed but don't really impact anyone.)
Like many people dealing with this plant, I started out, many years ago, pulling hard and aggressively. I was going to show it who was the boss. I was going to WIN! Ummm, not so much. You see, vines tend to break off at resistance points. So if a couple of tendrils have looped themselves around a wire in a fence or a branch of another bush, the "stem" will break if you pull hard - the tendrils will hold the rest in place.
What I learned, to the great amusement of my lady-wife as she watched me do this, is that I can identify a large vine, gently lift it, and apply even pressure to it. That breaks the tendrils off, so I'll have a much larger section with little or no resistance to pulling. I'll then look up to the top of the bushes (the lilacs along the north side are quite tall - over 10 feet) and watch what moves when I tug on the vine. That way, if the vine DOES break, I'll know generally where it broke off.
I've also found that if I apply a gentle, consistent pressure, I'll get much more of the vine off per attempt than if I give it a good yank. Like all heuristics, that one is fallible. Sometimes it works, sometimes it doesn't. Much of the time, though, it works.
So as I was pulling vines this last week, I got to thinking about software defects. If I dive in aggressively looking for HUGE problems, I tend to find some. If I use a more gentle, subtle approach, I may find some of the same ones I found with the aggressive techniques. I also find others that I perhaps may not have found.
I'll experiment some more tonight when I get home. I need to transplant some of the smaller lilac bushes. It should be much easier now that so much of the Virginia Creeper has been pulled out of that portion of the garden.
Sunday, October 9, 2011
An Exercise in Task Prioritization Or Where Has Pete Been?
I find it astounding that the last blog entry I posted was written on August 24. That seems a long time for me to go between postings. That is not to say I have not thought about writing some thoughts down. In fact, I have a fairly lengthy list of post-it notes stuck on my bulletin board with "This would be a good topic for a blog entry." As of today, it's quite a lengthy list.
The simple fact is, I fell into a common trap for software people - I'd see a side project and think "I can do that in my spare time, it can't take too long." Or, I'd say "Here's an opportunity for me to do X. Those don't come along too often. I can do that."
Pretty soon, I had more "side projects" than time to do them in.
On top of that, there was a "side project" at the day-job that needed to be addressed. Not a big deal, just a minor little thing of 40 to 60 hours and a couple months to do it. Of course, it was not on the project schedule because it was pretty small and could be done in the middle of other things. Until it was scheduled to be shipped to a customer. THEN the priority ramped up BIG time. (No one has ever seen that before, right?)
So, in the meantime...
- I wrapped up slide decks for two presentations at two conferences;
- Wrote up supporting articles on both;
- Wrote an article taking a contrarian view toward the answer to a question asked;
- Wrote up four short essays answering other questions;
- Wrote up notes from CAST2011 and filed them neatly for later use;
- Took the lady-wife on an extended weekend at a music festival that had no electricity (and allowed no generators) in the camping area - and had a fantastic time;
- Did the usual (and expected) "end of summer" family stuff;
- Got caught up on the day-job's projects (well, relatively);
- Slept in yesterday.
The result is I missed a peer conference I had intended to participate in. I also did NOT submit any proposals in the last six weeks to speak at conferences coming up next year (yeah, there were a bunch of deadlines that I waved to as they whooshed by).
The good news, and the thing I'm quite pleased about, is a resumption of workshops on drumming (pipe band drumming, to be particular). I agreed to teach a bagpipe band's fledgling drum corps this year. I had done a series of workshops, fairly intense 4- and 5-hour sessions, starting with "holding the drumstick" and ending with "playing as an ensemble." They liked it so much that they asked me to repeat the lessons for their novice/newly joined drummers and pick up with more advanced material for last year's students. Cool.
The day job has had several "wins" from a business view, a software view, a testing view and personally. Things are far from perfect, but there looks to be an interesting time ahead.
So, expect a flurry of blog posts as I try to work my way through the list.
What did I learn? Hmmm - the jury is still out on that one. Off the cuff, I'd say I should have learned not to take on more than I can handle. What I may have learned instead is that sometimes the stuff we agree to do had better be fun, because we may not have a chance to do other stuff that is fun.