Any project managers out there, upon encountering this blog post, are probably looking at the title and saying, "Yeah! You better believe it!" In many software development organizations (perhaps most of them), it officially falls to those in project management to worry about risk. In reality, though, risk identification and mitigation are two concepts that every one of the project participants should care deeply about.
To draw on a current real-world example, look at the state of the financial industry in America right now. No one in their right mind would describe it as anything short of disastrous, however much people may disagree on the exact root cause. One of the main contributors to the meltdown, we've been told many times, was unbridled optimism. Whether it was homeowners taking out mortgages they couldn't afford (but believed they could), real estate speculators assuming that housing prices could only continue to go up, credit businesses refusing to consider the consequences of a market downturn, or the public in general failing to set aside much in the way of "rainy day" savings, the result was the same: once the first domino fell, all the rest were lined up in a perfect, catastrophic pattern behind it. In other words, a lot of folks mistook a variety of "best case outcomes" for virtual certainties.
In the world of software, we face a very similar problem. When we create a plan - whether it's a traditional Microsoft Project plan for a Waterfall environment or an Iteration plan for an Agile team - I've too often seen what I can only describe as that selfsame unbridled optimism. In one way, that's a commendable attribute: after all, it's indicative of a "can do" attitude that's much preferable to a bunch of naysayers who won't even try for fear of failing. But it still has to be tempered with reality. At the very least, any group involved in this sort of planning activity has to spend some time discussing what might reasonably go wrong, and what the impact of each of those events would be.
That's the identification part of the equation, and I'd say (from my own experience) that programmers in general tend to grade somewhere around a C- at it. You'll get some who are really good at it, and others who apparently can't see anything at all in front of them except the Yellow Brick Road that leads directly to Oz. Team members with a testing background or responsibility are usually better at this, perhaps because their stock-in-trade is stuff not working the way it's supposed to. I'd say the average test-inclined person scores maybe a B- at risk identification, although, again, you'll certainly get all sorts of results. Some testers, for example, only think in terms of bugs in the code, rather than considering "bigger picture" hazards like environmental issues, requirements ambiguity or dependency failures.
When you bring that programming perspective together with the testing POV - either in a meeting or, as Agile tends to do, in one person or a close-knit team - I think the discussions often provide some very good risk identification. The two views feed off each other, and pretty soon you'll hear some very creative ideas about "what could go wrong." This is more of a refreshingly effective B+ effort, if you can make it happen. There can almost be a gallows humour aspect to those talks that may send the unseasoned project manager or stakeholder running for the Pepto-Bismol! (So, yeah, it can be fun!)
For risk mitigation, though, I think the software world in general really lets our PM friends down. I'd even go so far as to say that we probably earn a solid F in this particular area (or perhaps it's simply an "Incomplete"). In part, the poor performance here stems from the fact that mitigation is often as dependent on the corporate culture as it is on anything else. As an example, consider the classic case of an external dependency not being met. In some companies, such an event would be regarded as reason enough (some might say "excuse") for the greater initiative to fail. There might therefore be little stomach for planning any mitigating actions for that possibility, since the blame would be laid squarely at the feet of the non-deliverer. End of story. But in other environments, there can exist expectations - spoken or otherwise - that those involved in the local project will somehow work around unfulfilled external deliverables. Those two scenarios are very different, and it behooves everyone on the project to know which type they operate within. Those often-unspoken expectations are a big part of what makes the whole "what do we do if this risk becomes reality?" question so hard to answer.
Another challenge to good risk mitigation is that old favourite of mine: empowerment. If a company has a consistent and well-understood empowerment model, then it's easier for a group of programmers/testers/product reps to tackle what to do about each potential hazard. In a top-down model, you may be able to simply kick the problems upward and comfort yourself in the knowledge that your job is to identify the risks and then escalate them. Again: end of story. Where teams are more empowered than that, they'll probably look for creative solutions that may involve spending money (buying additional hardware, setting up network redundancies, increasing bandwidth) or moving people around (getting temporary expertise from elsewhere, increasing headcount on the project), with some reasonable assurance that the time spent considering those options won't be wasted. Where the empowerment model is muddy - where, say, one model gets the lip service while another operates in practice - the job of risk mitigation becomes that much more difficult.
As someone who worries a lot about risk - maybe too much, according to my wife! - I can't help but wish that it got more consideration in the minds of others. When things are going well, there's a natural tendency to expect that delightful state to continue forever. But it's our ability to imagine just what might come along and turn all of our plans to mush that gives us an advantage... if we remember to use it! Otherwise, we end up with situations like the cratering American financial sector.