
    A systematic look at major programmes

Posted on October 10, 2017 by Sandy Gordon

Major programmes are expensive, high profile and carry great uncertainties and risks. For most government bodies, undertaking a major programme will involve doing something new, with relatively little organisational experience. Many fall short of their objectives, in terms of cost and/or outcomes. So it’s not surprising that they are the focus of many NAO reports – about 100 since 2010. Our new Framework to review programmes shows the questions we typically ask, and brings together many of our recent findings. We hope it will show what we are looking for and what we expect to see when we examine major programmes.

Our reports illustrate how risky programmes can be. If they’re innovative, those risks are higher, and it’s harder to learn from past experience. If they’re complex, they are affected by many unpredictable factors. And the scale of the challenge is increasing as government bodies support major new infrastructure projects, introduce new technology, reorganise to make the most of scarce resources, and implement the changes necessary as a result of leaving the European Union.

We tend to look at the biggest and riskiest programmes at key points during their life, or when a particular event prompts a review. Our examinations may seek to address different questions depending on the context, but our Framework to review programmes sets out 18 key questions that are likely to be important, with links to examples from our past work and to further, more detailed guidance on specific issues.

    Learning from successes and failures

    Our framework of questions is based on lessons we’ve learnt from our previous reports. These are just some of the examples highlighted in our Framework.
     
     
Managing interdependencies: In our review, Modernising the Great Western railway, we found that Network Rail had not worked out the minimum feasible schedule for the work, including dependencies between key stages. This led us to focus our questions more on how interdependencies are managed within programmes.
     
     
Learning from early experience: Some of our examples highlight progress made in projects after we identified problems at the early stages. For instance, our report on Progress with preparations for High Speed 2 said the Department for Transport had taken steps to address weaknesses in the business case we reported on in 2013, and had learned from High Speed 1, where the benefits had not materialised as expected, as we reported in The completion and sale of HS1.
     
     
Sharing lessons across a portfolio: In our report Welfare reform – lessons learned, we found that the Department for Work and Pensions (DWP) implemented around 30 distinct programmes over five years with few operational problems. This was largely because DWP learnt from early failings such as: sticking too rigidly to fixed deadlines; thinking too late about the management information and the leading indicators it needed to understand progress and performance; and relying too much on dealing with difficulties as they emerged rather than anticipating what might go wrong.

    Sharing lessons more widely

From our review of DWP’s transformation on this huge scale, we produced the briefing Lessons for major service transformation (pdf – 116KB), which details 11 lessons:

    1. Transformation programmes raise the greatest risks of failure
    2. Set realistic goals and be honest about what really matters
    3. Policy development must take account of implementation
    4. Don’t be tempted to score benefits early
    5. Do identify tangible short-term gains
    6. Recognise the (senior) organisational cost of transformation
    7. Don’t underestimate what you can learn from engagement
    8. Recognise the value of learning and market development
    9. Do anticipate the need to make changes in live running
    10. Recognise the opportunities and limits of technology
    11. Set out clear decision-making and challenge

Lessons from successful programmes: Our review of the London 2012 Olympics found that the successful delivery of the Games owed much to: improvements made to portfolio management in 2009; the governance and oversight structure; and a risk management approach that included contingency planning and intensive scenario testing.
     

    Focusing on the key issues

    To deal with the range and complexity of issues we might look at, the Framework allows us to go into plenty of detail if needed: it contains 107 detailed sub-questions and also points to other resources that help with more specific aspects of programmes, such as contract management or payment-by-results schemes.

Some of these tools are also mentioned in our previous post, The challenges of major projects, which outlines five key areas in which poor performance is commonly found. As that post notes, further useful NAO publications can be found on our Managing major projects web page.

We have found it useful always to consider the four key areas shown in this diagram, which prompt good questions to ask about any project or programme.

What next?

    The Framework is just a starting point, and not intended to be a checklist. We ask the key questions first, then use the detailed questions and more specific tools to probe deeper into areas of particular interest for each programme.

    We expect to update the Framework and refresh it with new examples as we learn from ongoing and future work. So we strongly encourage your feedback, and invite you to comment on this post or contact us.
     
     

About the author: Sandy Gordon is an Audit Manager with over 25 years’ experience in conducting value-for-money studies, specialising in examining the implementation of new services and major change programmes across government. Sandy currently leads the NAO’s Project Delivery team and is working to improve the NAO’s approach to examining programmes, including by supporting NAO teams’ use of this Framework to review programmes.

     





    8 responses to “A systematic look at major programmes”

    1. Martin Paver says:

A great article and very useful framework. I would also encourage readers of this blog to read similar information from the US, Canada, Australia and NZ. Further information is held in the reports from public inquiries (e.g. Edinburgh tram), public accounts committees (e.g. NHS24), and assurance and lessons-learned reports. There is a rich dataset out there.

I don’t agree with the conclusion that lessons learned are harder to apply to innovative projects. The project management lessons are very similar, but the recipe for implementation will differ. Readers may find that the key is how such projects manage uncertainty and complexity, where good practice emerges from experience. Your statement encourages PMs to conclude that their project is different and hence to ignore the experience of the past.

      • Sandy Gordon says:

Many thanks for your comment, Martin. To clarify: it’s not that organisations can’t learn lessons for innovative projects, but that it’s more difficult for organisations to access learning in areas where they have less previous experience. I agree strongly that there are lessons on how to deliver innovation, some of which we discussed (in a slightly different context) in our 2009 report Innovation across central government.
        Sandy Gordon, Audit Manager, Project Delivery Team

    2. Rod Sowden says:

      It is another good paper from the NAO, but it never fails to amaze me that these failures continue to happen.

Much of this has been around since the first version of MSP in 1999, when I first used it. It is definitely all covered by the current version – I know that because I was the lead author.

I don’t understand why the NAO isn’t more forthright in pointing this out. I was part of the Olympics review for the NAO, and we used the MSP model for evaluation; that was how we highlighted the business change risk that ultimately matured through G4S.

      • Sandy Gordon says:

Many thanks for your comment, Rod. This Framework is not intended as a substitute for a well-established programme management approach such as Managing Successful Programmes, and as I’m sure you will appreciate, it’s not our role to tell organisations that they should use a specific tool or methodology.
        We agree that there is insufficient learning from past failures in project delivery – that’s why identifying and applying lessons is one of the topic areas in our Framework. By publishing this document we hope we are making a small contribution by making it easier to find and access our previous work on the range of project delivery issues we tend to highlight.
        Sandy Gordon, Audit Manager, Project Delivery Team

    3. @JagPatel3 says:

      Whereas the NAO has learnt from its previous investigations into the inner workings of Whitehall, the same cannot be said about the Departments of State themselves – not least because they keep making the same mistakes again and again, like believing what they are told by private sector suppliers when approached to provide goods and services for the benefit of the public.

Take, for example, military equipment acquired by the Ministry of Defence, which is the single largest contributor to the major projects portfolio examined by the Infrastructure and Projects Authority – both in terms of the number of projects and their combined whole life cost. It is not only the level of experience in managing large-scale projects that counts; the honesty with which the Contractor conducts his relationship with MoD is even more important.

      When it comes to procuring new equipment for the Armed Forces, the first and foremost question politicians always ask is, how much is it going to cost?

Any meaningful attempt at answering this question is hampered by the fact that very few people in Whitehall understand and appreciate that the single most important factor determining the ultimate whole life cost of any defence equipment programme is the maturity of the existing starting-point for the Technical Solution in the possession of Defence Contractors – the closer the developmental status of the starting-point is to the Requirement, as described in the technical specification, the lower the cost the Exchequer will have to bear to complete the remaining work and bridge the shortfall.

Even more worryingly, those who do know are not in decision-making or leadership positions.

The maturity of a starting-point for the Technical Solution can fall anywhere between two extremes, as shown in this illustration pic.twitter.com/56z9HnxOJc. At one end, starting from a ‘blank sheet of paper’ amounts to a non-existent solution, whereas at the other end, off-the-shelf equipment corresponds to a readily available, fully engineered and supported Technical Solution which satisfies the totality of the Requirement at no additional cost or risk to MoD – that is to say, it does not require any development work laden with risk to be performed upon it.

Additionally, MoD does not possess the capability in the form of intelligent and experienced procurement officials who have an adequate understanding of what it takes (in terms of skill types, funding, tools, processes, materials, scheduled work plan, inter-business contractual agreements etc.) to advance an immature Technical Solution from its existing condition to a point where it will satisfy the technical specification requirement, within a Private Sector setting driven by the profit motive and by people who instinctively employ unethical business practices – leaving officials susceptible to exploitation and manipulation by Defence Contractors. Consequently, they are not able to establish the true status of the evolving technical solution from the claims made by Contractors. The harsh truth is that these people have no business acumen – on account of not having spent a single day of their lives in the Private Sector.

      So instead of simply telling the truth, Defence Contractors are consciously engaged in an exercise in subterfuge to take advantage of the ignorance of procurement officials, by making exaggerated claims (see illustration pic.twitter.com/xXygxC2bwi) about the maturity of their starting points for the Technical Solution – a scam which has led directly to initial programme costs being grossly underestimated by MoD – a condition referred to as the conspiracy of optimism.

To add to this wanton act of deception, Defence Contractors have also been deploying the old favourite of touting the so-called ‘minimal development solution’ – a commonly used ploy advanced to con procurement officials into believing that they have a nearly-ready Technical Solution on offer, when in reality they probably have something in hand which is closer to starting from a ‘blank sheet of paper’!

      This deceitful behaviour is a common trait in the defence manufacturing industry, beginning with the Select Few at the top, and extending right down the entire supply chain.
      @JagPatel3

    4. Nuno Gil says:

The blog reads well, but I would really like to take exception here to the framing around “Learning from successes and failures”. I argue that this reductionistic framing is unfortunate, and exactly the type of framing that the NAO would want to avoid in order to further the debate.

In such pluralistic enterprises, there is no such thing as a universal conceptualization of project organizational failure or success. It is always relative to the baseline that one adopts. What is failure from one perspective is a remarkable success from another. If you evaluate the Olympics 2012 project against the 2007 budget envelope, you may argue delivery occurred within target. But if you measure project performance against the 2007 cost forecast (put together after 2 years of further planning), you wonder why they still needed to draw £2 billion from contingency. And if you evaluate against the first budget put together by Arup and then validated by PWC, you wonder how they could have got it so wrong.

The case of the Olympic stadium is telling. Once you add the costs of adapting the stadium post-Games, it is one of the most expensive venues in the world and shows an eye-popping cost overrun even relative to the 2007 cost forecast. Yet you can argue that they did remarkably well considering the Herculean difficulties in attempting to reconcile the bid commitments (to fold the Olympic stadium into an athletics venue) with institutional pressure from extremely powerful stakeholders in the environment to ditch the bid commitments and turn the stadium into a football venue.

I think the point is clear. The whole debate on performance is relative to your point of view, and rightly so when you operate in a pluralistic society. But I would expect that the NAO, as an independent watchdog, would want a much more sophisticated discourse, as opposed to embracing a rhetorical narrative that obfuscates the real issues and the managerial complexity. I would leave that approach to the elected leaders.

Nuno Gil, Professor of New Infrastructure Development, Manchester Business School, The University of Manchester

      • Sandy Gordon says:

Many thanks for your comments, Professor Gil.
        We agree that major programmes are highly complex and the degree to which they proceed as planned or achieve their intended outcomes does, of course, vary between different parts of each programme. The sub-heading in my blog-post was intended to highlight the fact that useful learning can be taken from all programmes, however successful they are ultimately considered to be.
        It can be difficult to draw conclusions about the success of programmes, as you’ve pointed out about the London Olympic Games. As you may be aware, the objective of our work is to form a judgement on whether value for money has been achieved, and we define good value for money as the optimal use of resources to achieve the intended outcomes. As part of our role, we also make recommendations on how to achieve better value for money and how to improve the services under examination.
        The organisations that we audit frequently tell us that they value the lessons that we can bring from our value-for-money work across government and this Framework is one of the resources we have created to make that learning easier for organisations to access and apply to their own work. We hope that they will find it useful for that purpose.
        Sandy Gordon, Audit Manager, Project Delivery Team

    5. Walter Ashton says:

HS2’s £55.7 billion funding is OTT for saving 20 minutes per journey to Manchester; surely improved trains and carriages running on time would be a more effective solution!

