A few weeks ago, in this blog post, I described the discussion from a LeanCoffee format meeting of GR Testers on what makes a successful software project. What I did not describe are those topics that did not garner enough votes to be discussed, or those which we ran out of time to discuss that evening.
I find these topics interesting and perhaps a bit revealing. What would have been discussed on these factors contributing to the success of a project? I suspect that in the course of conversation some would have questioned the assertion that some of these would actually contribute to the success of a project.
The question of what makes a project a success is an interesting one. We have all heard some level of discussion around this. I thought the ideas expressed in the meeting were quite good, for the most part. Honest Communication was a core theme that seemed to bubble up even in the midst of discussing other ideas.
And so, what did not make the discussion?
In no particular order:
A project is a success whenever a major stakeholder says it is;
OK, well, this basic idea came up in a couple of ways and a couple of different perspectives. Does this really contribute to a success? Is it part of a greater measure? I'm not sure.
Setting Expectations;
Isn't this part of the open communication? You know, people say what they want and others say what they can provide and people keep talking until they reach an agreement. Then they need to remind themselves and each other what they agreed and committed to.
Communication with end users involved;
This is tied up with the expectations thing, right? Unless the "end users" are not actually employees of the company. There are other ways, of course, for example contacting a user group or some other affiliation of folks who use the software.
Check doesn't bounce;
Trust;
This is an interesting one. Trust is nearly impossible to achieve in an environment where folks view others, other groups, teams, business experts, developers, testers, PMs, with suspicion. The idea of "what are they up to?" will undermine the chances of trust growing and blooming. When the atmosphere is confrontational, there is almost no chance of trust developing. Honest communication and delivering difficult messages, without drama and without guile can help build trust. Having said that, "Trust" can contribute to a project's success. However, it is often reflected in other things.
Happy with Outcome/Code/Performance;
Isn't our goal to make sure people are happy with the outcome? The great question is "Which People?" Hmmmm. This is certainly a consideration - after all, if no one is happy with the results, by some measure, is there any chance of the project being called a success? Probably not.
Corporate culture;
OK - See Trust and Stakeholders and a bunch of other ideas. I can see how the culture of the organization can help make it easier for projects to succeed or make it nearly impossible. I've said before that if people are trying to rein in chaos, that is not going to happen if a big-enough boss does not publicly, consistently support them. Then again, maybe people like that.
It works - all planned functionality functions;
I smiled at this one. "It works, by some model." That was what I first thought on this. Then again, it is entirely possible that delivering all the planned functionality is adequate. I then wonder whose definition of functionality counted. What does "function" mean? Who decides what functionality gets planned? If the one piece a given department really needs does not make the "planned" list, will they agree that the project was a success?
Project meeting/exceeding customer expectations;
The project meeting or exceeding customer expectations is certainly a consideration around the success of the project. Does it have a true bearing on making a project a success? Not sure.
Team members only on this project;
This can really help. Of similar importance is if people rely on each other. This also touches on many of the ideas that were discussed in the LeanWings session. Of course, this is seriously tied to the corporate culture thing. If the rest of the organization isn't on board, then this one probably isn't going to happen, right?
Collaboration;
That ties to the one above. It also ties to the question of corporate culture and a pile of the items discussed in the meeting. Real collaboration doesn't happen without open, honest communication. For that you kind of need trust, right?
Establishing KPIs and reviewing them after project completion;
Ya know, 20 years ago I might have been all over this in agreement. Now, part of me kinda wishes this one had made the cut to see what was meant. Do KPIs really ensure success? Or are they something else entirely?
P1 issues addressed prior to prod deployment;
P1? As in Priority 1? As in "no show stoppers"? As in "all the really huge big problems we found were fixed"? Isn't this part of making sure stuff behaves? Again, I'm not convinced this is part of making a successful project.
Requirements met and meeting business users and owners expectations;
Now, some folks may say "Yeah, that makes a successful project, right?" Well, maybe. But how one does that is really the question.
No hot fixes or major issues after deployment;
OK, so this kind of ties into the P1 Issues thing, right? Stuff goes into production and it works, right? Or maybe, it works by some measure. Or maybe things don't break. Or, something. Right?
===
Friday, July 26, 2013
Sunday, July 21, 2013
Expertise, Skills and the Whole Team
I am not a "major name" in software testing, or Agile or, software anything. I'm OK with that. I would rather focus on things local to me and get people to see there may be a better way. Of course, without being a "major name" that, too, presents problems.
This may cause me some problems among my colleagues in the Agile community. It may ensure I don't speak again at another Agile conference. Then again, it may serve for someone somewhere to consider some of the truisms that get bandied about when it comes to Agile development.
An Open Letter to Those Espousing a "Whole Team" Approach
Dear Respected Colleagues,
Over the last several years, I have heard and read many instances and places where people are citing this authority or that authority over the need or lack of need for testing specialists in an Agile Team. Many denounce the concept as being counter to the "whole team" approach (since everyone is responsible for "quality"), others cite examples of how their projects did just fine without people with determined, specific roles and responsibilities.
Everyone can help with everything. Everyone can contribute to everything. Everyone can help make anything better.
Yup. Agreed. When there is a shared vision and understanding of the point of the application, the reason why the project is being undertaken, people working together can do wonderful and amazing things. It will be beautiful.
As long as we get it right.
"Right" in this context, is not meant to be anything other than whatever measures we choose to determine success are met or exceeded. Of course, if we have rubbish measures for success then our "successful project" will still be rubbish.
This works fine as long as the context we are working in allows for people to adapt their views to accommodate the needs of the project and the organization at large. For larger companies, where software is not their primary concern, this can be problematic.
When there are large systems the new product must integrate with, it is probably a good idea to actually get information from the experts on those systems that our new system will need to work with. We must realize that some skill sets may not be contained within the team.
Enough Chicken and Pig
I can hear the roar already - "But that's the whole chicken and pig thing, right?" Wrong. There are people who may be able to assist us in our efforts. We must recognize they have other priorities. We must recognize that for them, we are just another request for their expertise.
We may want to do all of X in sprint N. And if we need the help of some specialist outside of the team, we may not get it when we want it. Someone needs to contact these external (to the team) experts and figure out their schedule needs, the kind of lead times they typically need for service requests and what kind of information you will need to provide, and when.
I know it is cool to run as clean as possible - as little overhead, plan then work/code right now. The problem is, if we need people to wield their magic sauce, right now may have had to be planned a month ago. It does no good to rant and rave if they are running at capacity and can not get to our project when we want them to if we have not done our homework.
A lack of planning on our part is not their problem - it is ours. And it has nothing to do with who is all in or not.
Essential Skills
The obvious solution is to engage said experts from the outset of the project, no? Get them or their boss or some other proxy the information you have for what they will need to do as it is developed, no? We may get one piece in Sprint 1, possibly as a note that when you get to Story M there will be help needed from Group Q on some feature. In Sprint 3 we may have a little more information on this. Make sure it is recorded, possibly passed on, and then made available when Group Q can do their bit.
Simple, no? It is part of recognizing that not everyone is skilled in some technique needed to ensure the project is a success. Pretty basic.
It also recognizes that some people have skills that are anything other than ordinary. It recognizes that some skills are not (yet) considered commodities to be picked up with an hour or two of training, or things that "anyone" can do.
Why do we presume that others' skills are not that way as well? Sure, we can do some things - we have some shared skills on some level. Frankly, I would not trust any code I wrote in Java or JavaScript to run correctly in production. What little skill I once had is now quite rusty. What about the business owner or business representative? Can they write production quality code?
Some may be able to, others likely consider it magic.
Recently I encountered someone who asked why I was spending so much time on one page in their really simple app when, to them, things looked "fine." My response was to show her the list of odd behaviors I had encountered simply by flipping back and forth on the elements on the page and leaving and returning to the page. She stopped and blinked.
Wow. I never would have considered looking at any of that.
That, my dear colleagues, is why you need a person skilled in setting aside expectations of how the code behaves, expectations of how the person using the program behaves, and confidence in the ability of the team to match code development to business expectations and needs - someone who looks at what is.
It strikes me that an awful lot of people read an article or a bit of a book or hear a snippet of a conversation and figure that anyone can do this. It is challenging. It takes courage of convictions to not give in to political or other social pressure. It takes perseverance when everyone around you claims something is easy.
I call people who have those skills "testers".
Yes, everyone is responsible for quality of the software. The whole team.
It takes specialized skills people may or may not have in varying degrees to make that software more than an idea. Among those skills are those typically found in a person who has selected software testing as a profession.
Sunday, July 14, 2013
Successful Software Projects: LeanWings GR Testers Style
This last week featured, for me, the monthly meeting of the GR Testers Meetup Group. We are a bunch of folks who get together and talk, present ideas, share ideas, ask questions, eat pizza, eat stuff that isn't pizza and things of that ilk. We are not terribly formal. We rarely have a topic more than one meeting in advance.
It is funny. I was asked once "How do you come up with ideas for meeting topics?" One of the ways we find topics is in the course of discussion of another topic. That is where this month's topic came from.
A couple of months ago, in the course of discussing bugs or some such, one of the participants asked a question:
What Makes A Software Project Successful?
Awesome question. That became the topic for July. It would have been the topic for June, but we had already lined up an international jet-setting speaker presenting (informally) some ideas on test automation. So, this landed in July.
Talking with my colleague Matt Heusser on the topic, he said "Man, that is a huge thing. There are so many things that go into that. How can we make that work in such a short time?" And the answer presented itself - LeanCoffee: Time-boxed discussion around ideas proposed and voted on by the participants.
Perfect.
Since it was an evening meeting, and coffee seemed inappropriate, we had BBQ wings, chips and soft drinks, a room with whiteboards so we could draw, and a door we could close so we could be boisterous.
Boisterous? Testers? Really? Yup. Boisterous.
I was minding the time and trying to keep notes and capture ideas. The attempt to write up a post for the group became challenging. It also dawned on me that the ideas presented by this mix of testers, developers, designers, PM types and even a recruiter type (who was a developer before turning into a recruiter) made an interesting collection - particularly as they were discussed and debated in very short increments of time.
The summary below is in the sequence of the discussion. The topics are in descending order by the number of votes each topic received.
The Topics:
1. Define Success
Right. That was the point of the discussion, right? But in this case, after a fair amount of round-about discussion around what was meant by success, we landed on an idea started by one participant, honed a bit by a couple of others and the result was: Establish a shared vision of what success looks like for this project.
The last bit is really important. We can't define what success looks like for every project we will encounter. Many of the folks participating don't work for software development companies per se; they work for companies that do other stuff and need software to do that or do it better.
We can't promise we will "please the customers" if we can't communicate with them, directly or indirectly. We can't promise anything around bugs or adherence to requirements or any of the oft-stated measures of what good stuff looks like.
However, as testers we can help shape the shared vision of success with the project team and stakeholders. We can help form and unify that vision through our dedication to service to our craft and the organizations we are working with.
2. Communication/Honest Communication
Two people submitted the same basic idea. A central tenet of anything that touches on getting people to actually work together is clear communication. In that, being honest within the communication cycle is a challenge.
The farther up the food chain a story goes, the less accurate/more embellished it becomes. Isn't it amazing how some test progress reports morph from "We have some concerns with problems found in function X and function Z. Because of these we have not been able to exercise function L or function Y at all." and magically turn into "Things are going great! There are just two more functions that need testing."
Being willing, and having the courage, to deliver difficult messages is crucial to success in a project. Likewise, people being willing and open to receive difficult messages is essential. We must be open to delivering and receiving honest communication.
To help make that happen, we must be certain of the clarity of words and thought used within the communication cycle. It is easy to bury meaning with techno-babble or intentional obfuscation if we choose to. Don't do this. Call out others when they do. If you are not certain what something means, ask that it be explained. Even if you are certain, asking that something be explained might be a good idea.
3. Project Vision That is Worth Working For
People tend to do better work if they can identify with the project at some level other than some VP wants a change made. Projects where the goal, the vision, is clearly laid out along with the impact and benefits that are hoped will be achieved tend to get stronger affiliation from people working on them.
When no one on the project team has any idea where the request came from, or why, that is a really bad sign. In contrast, if it is clear why the request was made and what the impact will be, these tend to get more emotional investment from the team. That tends to, though not always, result in people being willing to do better work.
There must be clear identification why we are doing this project and why it is important to the business and customers. The scope and context must work together.
4. All Projects Are Failures
This was submitted by a developer/designer type. His point was, simply, if there is more than one team or group using a piece of software, it is entirely possible that someone may not like the changes resulting from the project.
That is amazingly true, particularly in the area of bug-fixes. One group hates the way something works. One group doesn't like it, but has found a way to make use of it. By fixing the bug, the first group may be happy, but the second is most likely going to be unhappy.
And then there is the problem with political capital in software projects. People want enhancements made, submit proposals for them - and then don't invest the time, energy, whatever, to make sure that what is being developed is what they actually want or need. If the requester/product specialist folks insist that enhancement K be included no matter what, and then "can't waste time in meetings talking about stuff they have already explained," that is an almost certain sign of pending failure.
Things will not go well for this - no matter how well the team pulls off the crazy-nuts request, no matter how close they get to hitting the expressed need - something is going to be missed. Mostly because the team doesn't have a clue that it is expected. And they don't know because "it should be obvious" according to the folks who don't have time to waste in meetings.
Are people overly critical? Sometimes. Well, maybe not. Well, maybe. It kind of depends. And in that exchange the seeds of failure are sown.
5. Within Time / Within Budget
Ya knew this or something like it would appear, right? While some manager types will define success in those terms, can testers? Can the project team? Really?
Who defines the timeline? Well, most folks would say that if you plan out the project needs and what can be done when, that will give you a timeline. Unless there are hard-dates that must be met for legal, regulatory or other reasons. Or unless someone promised the project would be delivered by a given date. Or unless... right, you get the idea.
Who defines the budget? Most teams can make time estimates, which is a form of budget. But who decides if extra equipment can be purchased? Who decides if we can bring in more people to the project to help? What about bringing in experts in the area to shore up the staff/project team so they can learn from working with the experts? Can the project team make these decisions? How many times have our well-considered estimates been rejected as "too high" and slashed by a third, or more? Can we really control that?
(I think my favorite line from that part of the discussion was "If the software doesn't have to work, I can get it done really fast.")
What the project team may be able to do is schedule the tasks or features to be worked on so the most critical features can be finished by the delivery date. In some instances, this may mean putting the most critical features at the end, particularly if they involve changes that would require a process audit, e.g., a PCI compliance audit.
Awareness of these concerns is crucial to having any form of success in a project.
6. Buy-In From Management
Sometimes the view from upper or senior management may not be in alignment with what the project participants might call a success. Simply put, if management folks call it a success, officially, it is.
Having said that, managers (line and upper managers both) can smooth the waters and eliminate road blocks. This can take the pressure off the staff/project team and allow them to focus on getting the job done.
However, managers can't do these things without the project team communicating clearly what is going on and the actions being considered. Translated - the communication must be clear and bi-directional.
7. Software Works As Intended / Fit for Use
The last topic we could fit in for the night to discuss. This boiled down to a couple of very salient points:
* What is happening when we use the software?
* What did we expect to happen when we use the software?
Pretty simple, eh? Of course, the details are what make this a challenge.
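Those two questions are really the skeleton of any check we run: compare what actually happens against what we expected to happen. A toy sketch in Python (the `apply_discount` function and its numbers are purely hypothetical, just something to stand in for "the software"):

```python
# Hypothetical function under test: discount a price by a percentage.
def apply_discount(price, percent):
    return round(price * (1 - percent / 100), 2)

# What did we expect to happen when we use the software?
expected = 90.00

# What is happening when we use the software?
actual = apply_discount(100.00, 10)

# "Works as intended" boils down to these two values agreeing.
assert actual == expected, f"expected {expected}, got {actual}"
print("behavior matches expectation:", actual)
```

The hard part, of course, is everything the sketch hides: deciding whose expectation counts, where `expected` comes from, and noticing the behaviors nobody wrote an expectation for in the first place.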
Conclusion
With that, time was up. We picked up the room, put away the stuff that needed to be put away and headed home.
Observation - notice how many of those actually focus on "communication"? Yeah. I kinda thought that in the course of the discussion.
It is funny. I was asked once "How do you come up with ideas for meeting topics?" One of the ways we find topics is in the course of discussion of another topic. That is where this month's topic came from.
A couple of months ago, in the course of discussing bugs or some such, one of the participants asked a question:
What Makes A Software Project Successful?
Awesome question. That became the topic for July. It would have been the topic for June, but we had already lined up an international jet-setting speaker presenting (informally) some ideas on test automation. So, this landed in July.
Talking with my colleague Matt Heusser on the topic, he said "Man, that is a huge thing. There are so many things that go into that. How can we make that work in such a short time?" And the answer presented itself - LeanCoffee: Time-boxed discussion around ideas proposed and voted on by the participants.
Perfect.
Since it was an evening meeting, and coffee seemed inappropriate, we had BBQ wings chips and soft drinks and a room with whiteboards so we could draw and a door we could close so we could be boisterous.
Boisterous? Testers? Really? Yup. Boisterous.
I was minding the time and trying to keep notes and capture ideas. The attempt to write up a post for the group became challenging. It also dawned on me that the ideas presented from this mix of testers, developers, designers, PM types and even a recruiter type (who was a developer before turning into a recruiter) was an interesting collection of ideas - particularly as they were discussed and debated in very short increments of time.
The summary below is in the sequence of the discussion. The topics are in descending order by the number of votes each topic received.
The Topics:
1. Define Success
Right. That was the point of the discussion, right? But in this case, after a fair amount of round-about discussion around what was meant by success, we landed on an idea started by one participant, honed a bit by a couple of others and the result was: Establish a shared vision of what success looks like for this project.
The last bit is really important. We can't define what success looks like for every project we will encounter. Many of the folks participating don't work for software development companies per se, the work for companies that do other stuff and need software to do that or do it better.
We can't promise we will "please the customers" if we can't communicate with them, directly or indirectly. We can't promise anything around bugs or adherence to requirements or any of the oft-stated measures of what good stuff looks like.
However, as testers we can help shape the shared vision of success with the project team and stakeholders. We can help form an unify that vision through our dedication to service to our craft and the organizations we are working with.
2. Communication/Honest Communication
Two people submitted the same basic idea. A central tenet of anything that touches on getting people to actually work together is clear communication. In that, being honest within the communication cycle is a challenge.
The farther up the food chain a story goes, the less accurate/more embellished it becomes. Isn't it amazing how some test progress reports morph from "We have some concerns with problems found in function X and function Z. Because of these we have not been able to exercise function L or function Y at all." and magically turn into "Things are going great! There are just two more functions that need testing."
Being willing, and having the courage to deliver difficult messages crucial to success in a project. Likewise, people being willing and open to receive difficult messages are essential. We must be open to delivering and receiving honest communication.
To help make that happen, we must be certain of the clarity of words and thought used within the communication cycle. It is easy to bury meaning if we want to with techno-babble or intentionally obfuscating if we choose to. Don't do this. Call out others when they do. If you are not certain what something means, ask that it be explained. Even if you are certain, asking that something be explained might be a good idea.
3. Project Vision That is Worth Working For
People tend to do better work if they can identify with the project at some level other than some VP wants a change made. Projects where the goal, the vision, is clearly laid out along with the impact and benefits that are hoped will be achieved tend to get stronger affiliation from people working on them
When no one on the project team has any idea where the request came from, or why, that is a really bad sign. In contrast, if it is clear why the request was made and what the impact will be, these tend to get more emotional investment from the team. That tends to, though not always, result in people being willing to do better work.
There must be a clear statement of why we are doing this project and why it is important to the business and customers. The scope and context must work together.
4. All Projects Are Failures
This was submitted by a developer/designer type. His point was, simply, if there is more than one team or group using a piece of software, it is entirely possible that someone may not like the changes resulting from the project.
That is amazingly true, particularly in the area of bug-fixes. One group hates the way something works. Another group doesn't like it, but has found a way to make use of it. By fixing the bug, the first group may be made happy, but the second is most likely going to be unhappy.
And then there is the problem of political capital in software projects. People want enhancements made, submit proposals for them - and then don't invest the time, energy, or whatever else it takes to make sure that what is being developed is what they actually want or need. If the requesters or product specialists insist that enhancement K be included no matter what, and then "can't waste time in meetings talking about stuff they have already explained," that is an almost certain sign of pending failure.
Things will not go well for this - no matter how well the team pulls off the crazy-nuts request, no matter how close they get to hitting the expressed need - something is going to be missed. Mostly because the team doesn't have a clue that it is expected. And they don't know because "it should be obvious" according to the folks who don't have time to waste in meetings.
Are people overly critical? Sometimes. Well, maybe not. Well, maybe. It kind of depends. And in that exchange the seeds of failure are sown.
5. Within Time / Within Budget
Ya knew this or something like it would appear, right? While some manager types will define success in those terms, can testers? Can the project team? Really?
Who defines the timeline? Well, most folks would say that if you plan out the project needs and what can be done when, that will give you a timeline. Unless there are hard-dates that must be met for legal, regulatory or other reasons. Or unless someone promised the project would be delivered by a given date. Or unless... right, you get the idea.
Who defines the budget? Most teams can make time estimates, which is a form of budget. But who decides if extra equipment can be purchased? Who decides if we can bring in more people to the project to help? What about bringing in experts in the area to shore up the staff/project team so they can learn from working with the experts? Can the project team make these decisions? How many times have our well-considered estimates been rejected as "too high" and slashed by a third, or more? Can we really control that?
(I think my favorite line from that part of the discussion was "If the software doesn't have to work, I can get it done really fast.")
What the project team may be able to do is schedule the tasks or features to be worked on so the most critical features can be finished by the delivery date. In some instances, this may mean putting the most critical features at the end, particularly if they involve changes that would require a process audit, e.g., a PCI compliance audit.
Awareness of these concerns is crucial to having any form of success in a project.
6. Buy-In From Management
Sometimes the view from upper or senior management may not be in alignment with what the project participants might call a success. Simply put, if management folks call it a success, officially, it is.
Having said that, managers (both line and upper managers) can smooth the waters and eliminate roadblocks. This can take the pressure off the staff and project team and allow them to focus on getting the job done.
However, managers can't do these things without the project team communicating clearly what is going on and the actions being considered. Translated - the communication must be clear and bi-directional.
7. Software Works As Intended / Fit for Use
This was the last topic we could fit in for the night. It boiled down to a couple of very salient points:
* What is happening when we use the software?
* What did we expect to happen when we used the software?
Pretty simple, eh? Of course, the details are what make this a challenge.
Conclusion
With that, time was up. We picked up the room, put away the stuff that needed to be put away and headed home.
Observation - notice how many of those actually center around "communication"? Yeah. I kinda thought so in the course of the discussion.
Thursday, July 4, 2013
Conferences & Thinking & Learning & Crazy-Smart People
Lately, I've been engaged in some one-on-one training/coaching/mentoring/how-can-I-do-this-better activities at my client. Of course, this falls on top of project work. It is nothing so formal nor so grand as what some folks do. Often times it consists of a cup of coffee or tea, sitting at a table outside - there are some nice nooks and crannies that someone thought about and put tables and chairs in for quiet conversations. One day, a particularly warm day, it was a nice bit of gelato instead of coffee.
At the end of one of these sessions that featured me saying things like "That is an interesting statement, what do you mean by that?" I gave a bit of homework to "the other party." The original question was "am I worse than the average of the testers and test leads you know?" (There was a bit of self-doubt that day.) We talked about her question and different aspects of people's strengths and weaknesses. That was when I gave her the homework.
I told her to consider the skills needed as a tester who designs and executes tests. Then consider her own skills and how her skills compared. I pointed her to this blog post. She looked at me and blinked and said "You were fired? Really?"
I then gave her the "homework" - the same exercise I did for myself with skills I'm good at, skills I need to improve and stuff I don't know anything about and want to learn.
That Sounds Really Hard!
I smiled at her response. If you are honest with yourself, it is not easy. Not in the least. She was astounded that I continue to do this on a periodic basis. "But you go and speak at conferences and do workshops and stuff. You're an expert!"
Hardly an expert. I know some things. Some of those things I know really well. Other things I desperately need to improve on. That is part of why I go to conferences. Agreeing to speak at conferences simply makes the face-to-face conversations a little less expensive.
Learning is not something I can "schedule" easily. Learning opportunities are everywhere. Making time to learn what I need to learn, or sometimes what I want to learn is the challenge. If I have the opportunity to talk with people, even for a few minutes, I often find I can take something away that can help direct my next stage of learning in a given area.
She seemed confused. "You mean, you go and talk with people the way I am talking with you? Really?" Absolutely. I need guidance. I need a nudge sometimes. I need to check my thought processes and see if what is going through my mind is accurate or not. I need the inspiration I get from hearing people talk on topics they are absolutely comfortable with.
You mean that they are passionate about?
Not exactly. People can be passionate about things and actually know very little. They can be moved to passion by something, yet have incorrect factual information as the basis for that passion.
I am looking for enlightenment and inspiration from people who have done good, solid work - people able to set aside the rhetoric, the marketing fluff, and the stuff that will get them the next gajillion-dollar consulting contract, and simply present information and ideas.
I don't have to agree with them to learn from them. I can disagree on the conclusions. I can disagree with them on interpretation. If they have solid evidence, fact (not truth - truth is two doors down the hall) behind their views, I often can learn something from them.
When we both have finished our "work" for the day, I can then seek them out, speak with them and hopefully engage in a learning session.
This astounded my colleague - "You go there to learn?"
That is precisely why I go to conferences and meetups (even ones I help put together). I go and participate so I can learn. Plain and simple.
Conferences
There are two conferences I will be participating in later this year. Other opportunities presented themselves, but my schedule would not allow me to participate in two of them. My submissions for a third were not accepted. No hard feelings there. I will certainly submit (better) proposals again.
The ones coming up are in August and October. They are nicely spaced, from my view, for getting client work done (keeps them happy and generating checks) and not so far apart as to feel there are ages between them.
In August, I will be in Madison, Wisconsin at CAST - the Conference for the Association for Software Testing. I'll be doing an "unofficial add-on" session Monday evening with Matt Heusser. We limited the participation in this simply because it gets unmanageable if there are too many people. We have a cap at 24 - and have only a few seats left (last I knew). I'm kind of excited. The Saturday before CAST, Matt is running TestRetreat. That looks to be really cool.
In October, I'll be in Potsdam, Germany at Agile Testing Days. There I'll be teaming up with Matt, again, on a full-day tutorial on Exploratory Testing. I'll also be running a workshop. Matt will be giving a keynote.
I've been to CAST before - missed last year because of day-job commitments. Last year was my first experience with Agile Testing Days. That was also my first experience with a testing conference in Europe.
If you are in North America, I know conferences in Europe are expensive even to get to - but the experience is astounding. Way smart, articulate people. Excellent conversation. Wonderful thinking.
So, yeah. I go to conferences to share ideas. Mostly, I go so I can learn from way crazy-smart people. At both of those conferences, I find it very easy to not be the smartest person in the room. That is what drives learning for me.