Friday, December 16, 2005

Agile Rescues CMM?

I finished up my nine-month Regulatory Credit Risk gig last week. We worked about 40 days straight through the UAT phase, including some late-night Thanksgiving runs, but it was well worth it. This multi-million-dollar project went into production with nearly all of the scope and functionality promised a year earlier.

The last 2 months of the project were critical, and a number of important practices made all the difference in the world. Ironically, don't these smell like the best bits of Scrum (with CMM marathons instead of Sprints) and XP?

1) A daily defect review meeting with the Business and Technology teams, including all management on both sides of the fence. This meeting was held every day, without fail, rain or shine. The content for the meeting was also exceptionally well managed, using Mercury Test Director as the foundation with lots of Excel pivot table manipulation on top.

2) Blurred distinctions between traditional CMM roles and responsibilities. Business analysts, developers, project managers: everyone got their hands dirty, and the fresh perspectives and frequent, candid discussion kept the forest visible above the trees.

3) When it became clear that 6-9 months of requirements analysis and design documentation didn't cover all the nitty-gritty details, and in some cases was horribly out of date, business analysts paired with developers to drive the code to completion. You know something is working when your Business's Managing Director starts being able to spot J2EE configuration disparities between testing environments.

Overall, this large project (40+ people) was successful because of very strong project management practices and business-led PM of the technology deliverables. One of the most interesting practices was the superb technology dependency management, which helped control the implementation across a very distributed, component-based credit risk architecture.

Even though 2005 was successful, everyone in the post-mortem agreed that what we accomplished would likely not be repeatable.

How to improve for 2006?

Some key practices could help put this program on a path for repeatable success in 2006 and beyond:

1) Have the near-shore development team institute functional unit testing of the credit risk rules. Hundreds of defects were identified late in the integration testing cycle, and only limited regression testing was in place when I left.
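To make that concrete, here's a minimal sketch of what a functional unit test of a credit risk rule might look like in Java. The rule itself is a toy: the `riskWeight` method, its rating buckets, and the weights are my own illustrative assumptions, not the project's actual rules.

```java
// Hypothetical example: a toy credit risk rule plus functional unit tests for it.
// The class, method, and risk weights below are illustrative, not from the project.
public class RiskWeightRuleTest {

    // A toy Basel-style rule: map an obligor rating to a risk weight.
    static double riskWeight(String rating) {
        switch (rating) {
            case "AAA":
            case "AA":  return 0.20;
            case "A":   return 0.50;
            case "BBB": return 1.00;
            default:    return 1.50;  // sub-investment-grade or unrated
        }
    }

    // Functional unit tests: assert the rule's output for known inputs,
    // so regressions surface long before the integration testing cycle.
    public static void main(String[] args) {
        check(riskWeight("AAA") == 0.20, "AAA should carry a 20% weight");
        check(riskWeight("BBB") == 1.00, "BBB should carry a 100% weight");
        check(riskWeight("CCC") == 1.50, "CCC should fall through to 150%");
        System.out.println("All risk-weight rule tests passed.");
    }

    static void check(boolean condition, String message) {
        if (!condition) throw new AssertionError(message);
    }
}
```

Run a suite like this on every build and those hundreds of late integration defects start getting caught the day a rule changes, not during UAT.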

2) Hold a daily stand-up during all phases of the CMM Level 2 lifecycle, particularly during requirements analysis and design, when the "sign-off" trap is all too comforting and risky.

3) Most middle-office risk projects don't own any of the data they use, and they often have to get it from multiple suppliers; sometimes hundreds of suppliers. This project needs to promote data integration to a full-fledged development effort and treat it with the same care as the Java programming effort, including all requisite artifacts, unit testing tools, and so on.