Projects have been running over-budget and late since the Pharaohs built their tombs. Good practice has since been published and qualifications established, so why do projects and programmes still go wrong? Whilst most projects over-run slightly, a significant minority go catastrophically wrong, even sinking their host companies. What can you do to make sure you are not among them? This article reviews recent research and guidance in the light of experience.
Why do it?
In any major change, it is well worth taking real care at the outset to make sure that the reasoning is clear, that it is shared and that it is never forgotten. It is common for effort to be expended in drafting a beautiful business case to gain funding, only to see it locked away, never to be used. Beware of unidentified and unquantified savings, particularly those that have no clear contributory link to operational improvements.
A group within the University of Oxford has been tracking UK project management for more than a decade. Early surveys [1] reported just 16% of projects hitting all their targets and an average 18% overrun on budgets. Only 24% of practitioners reported paying great attention to established methods and practice. More recent work [2] found that by 2011 the average over-run had risen to 27%. Informed by an alliance with Nassim Nicholas Taleb of Black Swan fame, it concentrated on the minority of projects that ran more than 200% over original projections. The data bank has also been analysed by McKinsey [3]. This has produced some useful insights to inform current practice and highlight warning signs.
There is also a growing body of work on the way in which people make decisions [4,5]. This is important because projects and programmes are governed (I hope yours are!). Understanding how the people of the governance board go about making a decision as part of the project / programme is an important part of assuring effective control, complementing good process and organisation.
When the causes of deviation from plan were analysed, the following contributions were found:
13% – Missing focus. Unclear objectives, poor requirements, lack of business focus.
11% – Execution issues. Unrealistic schedule. Reactive planning.
9% – Content issues. Shifting requirements, technical complexity.
6% – Skill issues. Unaligned team. Lack of skills, experience and resource.
Just 6% of the deviation remained unexplained or “other”. The same issues are being seen again and again.
These factors can simply and easily be tested for through standardised project and programme appraisal techniques. If your project / programme is significantly off-track or headed that way, it is likely that a dispassionate review of the current state will identify contributing factors and allow the rapid identification of corrective action. Taking that action will have a cost. Not taking it will also have a cost, which is likely to be higher. The governance board must decide which to pay. Avoiding all cost is rarely an option.
Managing strategy and stakeholders
Many IT people have at best a grudging relationship with the business. It is quite common to hear dark mutterings from technologists about the business “not getting it” or equivalent. I recently heard an architect complain about managers interfering with the purity of his work. Equally, the business often sees IT as being in an expensive world of its own, drawing architectures full of boxes with no connection to their concerns, priorities or needs. It is easy for rudeness to escalate where people do not communicate effectively. An essential purpose of a governance board is to provide a forum for senior staff to decide, set direction, allocate resources and hold those responsible to account. Does the project reflect complete understanding of what is needed to drive the business forward, and alignment to the governance board’s concept of strategy?
Some of these boards can include some big personalities. Challenging behaviours can also arise from without. The interests, concerns, whimsy and frustration of people must be managed just as much as the tasks of delivery. The tools and techniques of stakeholder management can help here, best developed within sales and account management where such concerns are second nature.
Pope Urban VIII found Galileo Galilei’s ideas about the sun being the centre of our solar system inconvenient. He simply denied them and had Galileo tried by the Inquisition and confined under house arrest. I have seen senior staff take a similar approach with project managers and their hopes for delivery. The result is usually comparable: the massive celestial bodies continue on their ordained path, and the deadlines are blown. Wishing the sun to orbit the earth without working out the means to make it so has little effect, other than on the blood-pressure of the innocent.
The wise make sure that the strategic contribution and scope of the project are sensible, and that these views are shared between the governance board, the organisation as a whole and the delivery team. It is horribly common to find delusion set in [4]. Some time ago, I was engaged by a CIO to review IT’s ability to support a corporate change programme. This involved the construction of new channels for customer engagement, with cost being cut by moving many transactions from face-to-face to either telephone or the web. The big-name consultancy delivering the change had stated the assumption that its programme would be delivered without any change to IT. We suggested otherwise, as there was then no web channel and the telephony facility was small and primitive. This appeared to fail the “sensible” test. Correcting it was an inconvenient truth that caused some budgetary difficulties. There are times when an effective governance board has to ask some searching questions if it is to avoid deluding itself. The answers must come from someone who is prepared to speak honestly and draw on good data. Fortunately, I avoided jail.
Issues commonly arise in the areas of budget, resource and schedule setting. The research [2] and experience suggest the use of comparative projects to ground these in reality. How long did the comparator projects take, how much resource did they consume, and what did they cost? What circumstances lead us to think that this project should do better or worse than them, and hence what percentage adjustment should be applied to the comparative estimate? This brings some rationality to the decision.
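The comparative adjustment described above can be sketched in a few lines. This is an illustrative sketch only: the `reference_class_estimate` helper, the comparator durations and the 10% adjustment are my own assumptions for the example, not figures from the research.

```python
# A minimal sketch of comparative ("reference class") estimating:
# anchor on comparable past projects, then apply an explicit,
# argued percentage adjustment. All figures below are illustrative.

from statistics import median

def reference_class_estimate(comparators, adjustment_pct):
    """Base the estimate on the median of comparable past projects,
    then adjust by an explicitly justified percentage."""
    base = median(comparators)
    return base * (1 + adjustment_pct / 100)

# Durations (months) of three broadly comparable past projects.
past_durations = [14, 18, 22]

# Suppose we judge this project 10% simpler than the comparators.
estimate = reference_class_estimate(past_durations, -10)
print(round(estimate, 1))  # median 18 months, minus 10%: 16.2
```

Anchoring on the median of real comparators, and forcing the adjustment to be a stated percentage that must be argued for, is what guards against the optimistic from-scratch estimates the research warns about.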
Mastery of Technology and the Business
Those seeking to deliver change must have a strong understanding of the technologies available, how to make them work and what the business really needs. A skilled practitioner will see beyond requirements stated as “I want you to deliver technology x for me to deliver function y so that I can do z for my customers” and express them in terms of business outcomes, project outputs, functions and technical components. The disciplines of benefits realisation and Enterprise Architecture have much to add here, and programme management brings a good delivery framework. Too often, projects and programmes overlook the benefits to be delivered. Benefits may not be realised until long after the project has been delivered, but this does not mean they should be ignored. Mastery brings efficiency through the ability to optimise the solution, as opposed to just doing something that works.
Long ago, I noted the curious tendency of experts to equate the scope of their expertise with the extent of the issue. So a process expert would see a process to be improved. A technical architect would see databases, applications and IT infrastructure. A project manager would see tasks to be coordinated. All are right, and all are wrong. Mastery is a high art because it draws on the expertise of all to assemble the total solution. There must be one “master brain” that understands the end-to-end solution and draws it and the supporting experts together. Without that, you have a bag of bits. Only by seeing the totality can the whole be optimised in the knowledge of the interactions. Experience suggests that the return on investment from such expertise and skill is extraordinarily high. Running with the “B” or “C” team may be an expensive option indeed: such teams do not realise what the data from early delivery is telling them.
Excel at Core Project Management Practices
A programme director I met recently estimated that 70% of his delivery costs were associated with gateway process compliance. This risk-averse organisation had made mistakes in the past, and had put control upon control until their delivery arteries were choked. Nothing got through. But they made fewer mistakes, at huge cost to the business. The smart application of controls applies them sparingly but with rigour. The programme director would happily have set a match to all controls, which would not have been tolerated by his operations colleagues, charged with providing continuity of service. His extreme language did at least prompt a review of current practice.
Another organisation made a large initial change (in a security-critical environment) under careful control, and followed it up with the sustained application of lean six-sigma techniques to deliver process simplification, driving improvements in performance and quality whilst simultaneously cutting costs. This has been delivering spectacular results in the three years since the initial change. The team started with the essential core processes, progressively moving into others such as change management to spread the influence and efficient control. The real power of this is that the whole team, from top to bottom, has grown together in mastery of the techniques. First results can usefully be delivered quickly; mastery takes longer. Shared views and aligned incentives are important enablers, and the people must be appropriately equipped with skills, experience, robust methods and, most critically, attitude.
The research’s recommendation [3] concentrates specifically on clarity about what quality looks like. Anyone who has run a project under PRINCE2 will swear heartily at that method’s bureaucracy, but they would have to admit that the Product Description’s role in clearly defining what acceptable quality looks like has value. The art is to set these criteria such that they retain the essential controls and dispense with the rest. Critical practices may also be found in planning, tracking, risk management and change control.
Fail quickly and cheaply
Large projects are far more likely to fail than are small ones. Part of this is doubtless due to the risks associated with complexity. Cutting a large task into many small ones does not necessarily eliminate complexity so much as move it into the task of integrating the many. However, there is wisdom in proceeding in controlled steps, and the results of research bear this out. Agility is all the rage, and has the bonus of delivering a stream of benefit (at least when done properly). The approach is one of trying something, appraising the results and being prepared either to reinforce it or to cut and try again, depending on results. This means that although not all trials will produce perfect results first time, the risk of delusion is minimised. Rather than charging ahead with massive momentum regardless of results, this approach feels its way, and people are encouraged to experiment. It depends on effective decision-making and good data if it is to work.
Some years ago, I worked with an examinations board that wanted to get away from moving paper scripts around the world. Multiple marking was a nightmare of logistics. My team supported them in testing on-line marking. Before the board went any further, they had to know that marking in a different way would not introduce a bias, as the reliability of marking was critical to their reputation and market success. Their research department designed a rigorous and statistically robust test, which we built and ran. Fortunately, the results were clear and positive. The organisation had effectively addressed the risk at known and managed cost. It also took care to learn, and uncovered substantial unexpected benefit in the process.
Projects and programmes are undertaken to deliver change. This involves risks, which can be managed but not altogether eliminated. Good methods (MSP, M_o_R, PMI, PRINCE2 and the like) are necessary and frequently under-used. Careful design of the environment can significantly improve the chances of success. The warning signs of deviation from plan can be detected and their causes diagnosed by simple, dispassionate checks, as long as the governance board is attentive to the reports. What then remains is the decision: do you want to correct course, or abide by the results of chance?
[1] Chris Sauer and Christine Cuthbertson, The State of IT Project Management in the UK 2002–2003, Templeton College, University of Oxford.
[2] Bent Flyvbjerg and Alexander Budzier, “Why Your IT Project May Be Riskier Than You Think”, Harvard Business Review, September 2011.
[3] Michael Bloch, Sven Blumberg and Jürgen Laartz, “Delivering Large-Scale IT Projects on Time, on Budget, and on Value”, McKinsey Quarterly, October 2012.
[4] Dan Lovallo and Daniel Kahneman, “Delusions of Success: How Optimism Undermines Executives’ Decisions”, Harvard Business Review, July 2003.
[5] Andrew Campbell, Jo Whitehead and Sydney Finkelstein, “Why Good Leaders Make Bad Decisions”, Harvard Business Review, February 2009.
This article was first published in Outsource Magazine 2013 August 22 and is reproduced with permission.