Posted on October 10, 2017 by Sandy Gordon
Major programmes are expensive, high profile and carry great uncertainties and risks. For most government bodies, undertaking a major programme will involve doing something new, with relatively little organisational experience. Many fall short of their objectives, in terms of cost and/or outcomes. So it’s not surprising that they are the focus of many NAO reports – about 100 since 2010. Our new Framework to review programmes shows the questions we typically ask, and brings together many of our recent findings. We hope it will show what we are looking for and what we expect to see when we examine major programmes.
Our reports illustrate how risky programmes can be. If they’re innovative, those risks are higher, and it’s harder to learn from past experience. If they’re complex, they are affected by many unpredictable factors. And the scale of the challenge is increasing as government bodies support major new infrastructure projects, introduce new technology and reorganise to make the most of scarce resources, while also implementing the changes necessary as a result of leaving the European Union.
We tend to look at the biggest and riskiest programmes at key points during their life, or when a particular event prompts a review. Our examinations may seek to address different questions depending on the context, but our Framework to review programmes sets out 18 key questions that are likely to be important, with links to examples from our past work and to further more detailed guidance on specific issues.
Learning from successes and failures
Our framework of questions is based on lessons we’ve learnt from our previous reports. These are just some of the examples highlighted in our Framework.
Managing interdependencies: In our review, Modernising the Great Western railway, we found that Network Rail had not worked out the minimum feasible schedule for the work, including dependencies between key stages. This led us to focus our questions more on how interdependencies are managed within programmes.
Learning from early experience: Some of our examples highlight progress made in projects after we identified problems at the early stages. For instance our report on Progress with preparations for High Speed 2 said the Department for Transport had taken steps to address weaknesses in the business case we reported on in 2013, and had learned from High Speed 1, where the benefits had not materialised as expected, as we reported in The completion and sale of HS1.
Sharing lessons across a portfolio: In our report Welfare reform – lessons learned, we found that the Department for Work and Pensions (DWP) implemented around 30 distinct programmes over five years with few operational problems. This was largely because DWP learnt from early failings such as: sticking too rigidly to fixed deadlines; thinking too late about the management information and the leading indicators it needed to understand progress and performance; and relying too much on dealing with difficulties as they emerged rather than anticipating what might go wrong.
Sharing lessons more widely
From our review of DWP’s transformation at this huge scale we produced this Briefing: Lessons for major service transformation (pdf – 116KB), which details 11 lessons:
- Transformation programmes raise the greatest risks of failure
- Set realistic goals and be honest about what really matters
- Policy development must take account of implementation
- Don’t be tempted to score benefits early
- Do identify tangible short-term gains
- Recognise the (senior) organisational cost of transformation
- Don’t underestimate what you can learn from engagement
- Recognise the value of learning and market development
- Do anticipate the need to make changes in live running
- Recognise the opportunities and limits of technology
- Set out clear decision-making and challenge
Lessons from successful programmes: Our review of the London 2012 Olympics found that the successful delivery of the Games owed much to: improvements made to portfolio management in 2009; the governance and oversight structure; and a risk management approach that included contingency planning and intensive scenario testing.
Focusing on the key issues
To deal with the range and complexity of issues we might look at, the Framework allows us to go into plenty of detail if needed: it contains 107 detailed sub-questions and also points to other resources that help with more specific aspects of programmes, such as contract management or payment-by-results schemes.
Some of these tools are also mentioned in our previous post, The challenges of major projects, which outlines five key areas in which poor performance is commonly found. As that post notes, further useful NAO publications can be found on our Managing major projects web-page.
We have found it useful always to consider the four key areas shown in this diagram, which are good questions to ask about any project or programme.
The Framework is just a starting point, and not intended to be a checklist. We ask the key questions first, then use the detailed questions and more specific tools to probe deeper into areas of particular interest for each programme.
We expect to update the Framework and refresh it with new examples as we learn from ongoing and future work. So we strongly encourage your feedback, and invite you to comment on this post or contact us.
About the author: Sandy Gordon is an Audit Manager with over 25 years’ experience in conducting value-for-money studies, specialising in examining the implementation of new services and major change programmes across government. Sandy currently leads the NAO’s Project Delivery team and is working to improve the NAO’s approach to examining programmes, including by supporting NAO teams’ use of this Framework to review programmes.