Wednesday, June 17, 2009

The Software Development Cycle

With my recent contract now complete, I thought I'd post a few non-specific details about what the work involved. While it wasn't completely germane to Agile, you'll see a few places where there was at least some overlap.

There were basically three parts to complete as part of the contract:
  • review the organization's current software development process
  • compare that picture to what a more standard development process would look like
  • provide recommendations on changes or enhancements that could be implemented to make things even better
Initially there had been some thought of a fourth part as well: implementing some of the recommendations alongside the development team. However, virtually none of that came to be, through a combination of factors, the most relevant being that the developers were in the middle of a high-priority project that couldn't be delayed.

In order to get a clear idea of how development happened in this company, I read over a fair number of documents relating either to standards or to recent projects within the organization. From that exercise, along with a few strategically-timed interviews with personnel there, I was able to figure out what types of artifacts were produced, as well as which practices were followed. I'd describe what I saw as a fairly lightweight model, which is what I'd expected to find, given the group's age (fairly young), size (6 to 8 developers) and position within the company (an Information Technology group in an industry that's far removed from being an IT provider). In some areas, it was really only rigour that was missing; in others, there was simply a lack of experience with "best practices for large-scale software development", let's say. And that's the sort of thing you bring a consultant in for, anyway.

For the comparison aspect, my boss suggested a Gap Analysis in the form of a spreadsheet, which I thought was quite appropriate. The U.S. arm of the company had previously brought in-house a very comprehensive System Development Life Cycle model for the American IT groups (which are considerably larger than the Canadian one located here), and I was happy to use that as my baseline for the comparison. The model was thorough (and very gated, as befits a Waterfall approach), so it provided an excellent "ideal" to measure their setup against. It included roughly 150 artifacts or activities, and so I was able to evaluate my client's methodology in terms of which of those items they included and, in a few cases at least, how their versions measured up to the baseline.
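To give a rough idea of the mechanics, here's a minimal sketch of how a comparison like that could be tallied, assuming the spreadsheet were exported to CSV. The "artifact" and "status" column names, and the status values themselves, are hypothetical for illustration; the actual deliverable was just a spreadsheet maintained by hand.

# Sketch: tally a Gap Analysis exported to CSV (column names assumed)
import csv
from collections import Counter

def summarize_gaps(path):
    """Count baseline SDLC items by status, and collect the missing ones."""
    counts = Counter()
    missing = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            status = row["status"].strip().lower()
            counts[status] += 1
            if status == "missing":
                missing.append(row["artifact"])
    return counts, missing

counts, missing = summarize_gaps("gap_analysis.csv")
print(f"{counts['present']} present, {counts['partial']} partial, "
      f"{counts['missing']} missing out of {sum(counts.values())} items")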

Once the Gap Analysis was complete, I identified what I believed to be the highest-priority gaps; this ended up being just over 20 items. For each, I then wrote up a Recommendation in the following format:

Recommendation: xxx
Benefits: xxx
Approach: xxx
Effectiveness Metrics: xxx

In other words, I provided the what, the why, the how, and some notion of how to measure the results to make sure that they weren't wasting their time making the change.
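To give a flavour of it, here's a hypothetical entry (paraphrased rather than lifted from the actual deliverable), built around the release planning theme that comes up again below:

Recommendation: Adopt a standard Release Plan template for every project
Benefits: Fewer forgotten steps at release time, and clearer hand-offs to the business area
Approach: Start each project from the provided template, trimming sections that don't apply
Effectiveness Metrics: Count of release-day surprises or rework items, tracked per release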

That Gap Analysis spreadsheet, which included the 20-some Recommendations, was the primary deliverable from my contract. However, in talks with my boss we decided that a couple of other concrete artifacts would be desirable as well. The first was a skeleton Microsoft Project plan that included all of the steps from the "ideal" SDLC model, customized somewhat to their environment, so that none of those steps would be overlooked in future projects. Also, because I considered a thorough and comprehensive Release Plan to be a prime example of where some additional rigour would help them out, I provided a template for that, too.

Those three documents were reviewed with most of the development staff on my last day there, and were extremely well-received. I had tried to incorporate as much feedback from the staff as I could into the content of my deliverables, and I think they appreciated that. One example was the recommendation that features be planned so they could be completed more serially than in the past, allowing at least some of them to be tested before being passed on to the business area (which, in that environment, acts as "QA"). That, and some of my unit testing suggestions, definitely benefited from my two years as Agile Manager!

All in all, it was a good learning opportunity for me, and I think the client received good value from my 22 years of IT experience. If the timing had been different, such that the people there who needed to play a bigger part had had more time available, it could have gone better and more immediate gains might have been realized. But as the poker enthusiasts say: you have to play the hand you're dealt, and that's what I did.
