The challenge of running an agile project with an annual funding model
In my former life as an employee at Lonely Planet, my last role was Project Manager on one of the major projects running in Product/IT that year. I understand the funding process has changed a lot since then, and I like to think my colleagues and I helped contribute to that with our actions.
Reporting
As part of our reporting responsibilities we were required to produce a breakdown of project spend, feature completeness and time remaining. All standard stuff, aimed at retaining tight governance over a fairly large project with a substantial budget.
There were, however, a number of factors that made this a rather painful process.
1) The project had received its funding from a number of sources, both print and digital, as well as external clients. From memory, there were at least 6-8 buckets of money contributing to the overall project budget, and each of them expected something for their dollars. This was (and is) a fairly standard way to fund projects in organizations and wasn't unique to LP.
2) My report had to cover the completion, spend and budget remaining for each of the areas that had provided funding, even when it wasn't 100% clear what they had provided funding for.
3) The governance model in place expected a traditional project model to be adhered to, with money spent towards an end date when the code was deployed to production.
Originally, at each reporting period to the Project Committee, I was required, with my colleagues' help, to produce a chart that looked something like this:
From memory, the chart showed:
- % complete of 'originally scoped' functionality per business unit providing the funding,
- % spend of the money they provided,
- and a third line in RED (of course) to highlight when we had gone 'over' budget.
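To make the shape of that report concrete, here's a minimal sketch of how such a per-funding-bucket tally might be computed. The bucket names and figures are purely illustrative, not from the original project.

```python
# A minimal sketch of a per-funding-source status report.
# Bucket names and figures are hypothetical, not from the original project.

funding_buckets = {
    # bucket: (scoped_features, completed_features, budget, spend)
    "Print":             (10, 6, 120_000, 95_000),
    "Digital":           (8, 7, 200_000, 230_000),   # over its own budget -> RED
    "External client A": (5, 2, 80_000, 40_000),
}

for name, (scoped, done, budget, spend) in funding_buckets.items():
    pct_complete = 100 * done / scoped
    pct_spend = 100 * spend / budget
    flag = "RED" if spend > budget else "OK"
    print(f"{name:<18} {pct_complete:5.1f}% complete  "
          f"{pct_spend:6.1f}% of budget spent  {flag}")
```

Each funding source got its own row, so any single bucket running hot showed up as RED even when the project as a whole was under budget.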
Growing Watermelons
Being an agile project, we aimed to deploy as often as possible. It took a while, but once we got our app into production, we were sometimes deploying 2-3 times a week. As you would hope, some of the functionality we delivered became popular, and we focused on it, dropping anything that proved to be of little or no use. As a result, our charts soon looked like the one above: we were focusing on new, emerging and popular requirements rather than the items we were originally scoped and funded to work on. If we stuck to reporting against the original funding model, our charts looked seriously ugly, with some areas spending way over their original allotted amount even though overall spend was well below the total budget.
This led to quite a number of meetings with the Project Steering Group where we were roasted for not sticking to the original project goals, for not controlling spending in certain areas, and for failing to deliver in others. This is exactly the sort of treatment that leads PMs to run their projects as Watermelons: green on the outside and red in the middle, not highlighting major problems until the last possible moment.
A different approach
However, after consultation with my project buddies, we decided on a different tactic. First, after another week of being harangued for being RED (because our spend was too high in some areas), we asked the PMO to consider a different funding approach. Basically:
Fund us in weekly or fortnightly increments. At the end of each period we'd hold a demo, talk through what we'd done and the decisions we'd made, and show the progress, usage and so on. We suggested that if the PMO decided we were actually going off track, or spending too much money, they could cancel the project at that point rather than wait for the end. We argued this would actually save money in the long run by failing as early as possible.
We could also show ACTUAL value, because we were deploying code to production and clients were already using it. This is significantly different to EARNED value, the measure used in more traditional (read: Waterfall) project costings, which credits a project for planned work completed against the original budget, regardless of whether anyone is using the result.
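To illustrate the difference with purely hypothetical figures: earned value is derived from the plan (budget multiplied by the proportion of originally scoped work completed), while what we were showing was real spend alongside features already live and being used.

```python
# A rough, illustrative contrast between earned value (plan-based) and the
# "actual value" argument. All figures are hypothetical.

budget_at_completion = 600_000      # total approved budget (hypothetical)
planned_scope_complete = 0.40       # 40% of originally scoped features done

# Earned value credits the project for planned work completed,
# whether or not anyone uses the result.
earned_value = budget_at_completion * planned_scope_complete   # 240,000

# What we reported instead: features live in production and their usage
# (feature names and user counts are made up for the example).
deployed_features = {"feature A": 1_200, "feature B": 4_500}   # weekly active users

print(f"Earned value (plan-based): ${earned_value:,.0f}")
for feature, users in deployed_features.items():
    print(f"Live in production: {feature} - {users:,} weekly active users")
```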
As you might well imagine, this wasn't a popular suggestion; making these sorts of decisions was outside the comfort zone of that meeting. However, it did achieve some surprising results. The Project Committee agreed that our decisions up to that point had been made with sound judgement, that the project was already delivering value, and that it made no sense to keep reporting our spend the way we had been.
From that point on we no longer reported % of overall spend against original scope, just total spend on each business area's functionality, as the chart below shows (this one is actually reproduced from the original report). In the final few reports we dropped even this chart; instead, we demoed the working software and showed usage figures.