Tuesday, October 07, 2008

Fact checking of an Agile development exercise

Recently I have been working with a development team on various sub-projects of a bigger project. New features were being introduced in the form of new sub-projects, and the team was eager to adopt Agile development. So we decided to try it on one of those sub-projects to get the team familiar with the concept and, hopefully, apply it to all future development projects.
I’ll briefly describe the conditions under which the Agile sub-project and a similar non-Agile sub-project were carried out, list their outcomes, and leave the judgment to you.

Project A:
  • Scope is small.
  • Requirements are captured in the form of a detailed written specification, provided by the Product Management team (it took them one week to finish). This was the usual practice across the entire company; that is, no development starts without a written spec.
  • The development team goes through the document (which is at least 50 pages) to find the areas that are not very clear and clarifies them with the PM team. They also estimate the development time for the features requested in the sub-project and negotiate the feature set with the stakeholders (a lot of buffer is built into the estimates to mitigate risk).
  • The development team shares the requirements with the Q.A team and starts development. Whenever a major feature is developed (iteratively), the development team releases the unit-tested code to the Q.A team. This continues until the sub-project is complete. No change requests are accepted while development is ongoing.
Project B:
  • Scope is small.
  • Requirements are captured in the form of sketches and domain models along with a list of identified functions (the preliminary copy was ready in three days). The development and Q.A teams then have a couple of meetings with the Product Management team to make sure they understand the models and the priority of the features. They then start designing the overall architecture with the help of the Architecture team, identifying reusable components and developing the required base code, while Product Management works on the details of some of the high-priority features. The Q.A team develops the test plan and required test cases. These three teams constantly update each other throughout this process.
  • Product managers evaluate the prototype and come up with a list of adjustments. Meanwhile, the Dev and Q.A teams study the detailed document of the high-priority features provided by the PM team to come up with more accurate estimates for those features.
  • The teams discuss the changes, the detailed document, and the estimates.
  • The Dev team continues development to solidify the architecture and frequently releases developed, unit-tested code to the Q.A team to integrate and test. When all the bugs in a build are resolved, it is released to the product managers for evaluation and assessment.
  • Product managers can ask for changes until the architecture is frozen, which usually happens after the second release. In return, the Dev team has the opportunity to negotiate features.
Outcome of Project A:
  • Some features did not make the deadline, and some were not accepted by the product managers because the Dev team had misunderstood them.
Outcomes of Project B:
  • Estimates were ready earlier and were more accurate, too, since they were based on actual development work the Dev team had already done.
  • More angles of the requirements were clear to the Dev and PM teams because the Q.A team was present in all the discussions from the beginning. Amazingly, in some cases Q.A team members knew more about the requirements than even the PMs. That’s mainly because they had been testing other sub-projects and were just as familiar with the overall requirements. Besides, they always ask the best questions; I believe that’s because they think about test cases and scenarios in advance.
  • The PM team was very happy with the outcome, since all the important, high-priority features that could be finished in the given time and were agreed upon were present and fully tested. They were also happy that clients could start adopting the new features earlier.
I don’t think I need to explain which project was more successful. 

Before closing this post, I'd like to highlight a few of the pitfalls of Agile development that I see or hear about frequently.
  • Pure modeling and sketching is not always a suitable way of gathering requirements. If you have distributed teams, challenging stakeholders, or very complex requirements, you might want to spend a bit more time digging and documenting.
  • Using TDD (Test-Driven Development) doesn’t mean you don’t need a Q.A team or downstream black-box testing, no matter how many unit tests the developers write or how much white-box testing they run.
  • Embracing change doesn’t mean that stakeholders can come up with change requests after every release. Change can be accepted and handled before the architecture is stable. So it’s the development team’s responsibility to release code as frequently as possible in the form of an executable application, and it’s the stakeholders’ responsibility to assess the released application and come up with change requests as early as possible. That means if you have a stakeholder who has no idea what he wants (and believe me, there are such stakeholders), you need to spend more time building prototypes and exploring requirements before you stabilize the architecture.
  • The project chosen for the team’s Agile exercise was relatively small. That’s because it was their first time doing iterative development in an Agile manner, and I wanted to mitigate the risks a failed project would impose. Also, they had an Agile practitioner on the team: me! Don’t do this without a coach.
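To make the TDD pitfall above concrete, here is a minimal sketch (the function and its name are hypothetical, not from the project described) of what developer-written white-box unit tests look like, and why they don’t replace black-box Q.A testing:

```python
import unittest

# Hypothetical example function, invented for illustration:
# parses a positive integer quantity from user input.
def parse_quantity(text):
    value = int(text.strip())
    if value <= 0:
        raise ValueError("quantity must be positive")
    return value

class ParseQuantityWhiteBoxTests(unittest.TestCase):
    # White-box unit tests, TDD-style: they exercise the code paths
    # the developer already knows about.
    def test_plain_number(self):
        self.assertEqual(parse_quantity("3"), 3)

    def test_strips_whitespace(self):
        self.assertEqual(parse_quantity(" 12 "), 12)

    def test_rejects_zero(self):
        with self.assertRaises(ValueError):
            parse_quantity("0")

# Run with: python -m unittest
# A black-box Q.A tester, working only from the requirements, would also
# try inputs the developer never thought of (e.g. "1,000") -- which is
# exactly why passing unit tests don't make downstream testing redundant.
```

All three tests pass, yet the function still crashes on perfectly reasonable user input like "1,000"; that gap between developer assumptions and the real spec is what a Q.A team catches.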