
    Beautiful models: robust and reasonable

Posted on March 18, 2016 by Elliott White

£5 billion: the potential underestimate of forecast costs in the Ministry of Defence's 10-year equipment plan. 32,000: the over-estimate in the number of new homes the New Homes Bonus policy would deliver in its first ten years, as a result of an arithmetical error in a forecasting model. These findings, from our reports Major Projects Report 2015 and Forecasting in government, show that models really matter! So how can you be sure your model is producing accurate forecasts?

    As auditors, we review lots of government models to be sure they’re producing reasonable outputs. Following discussion with modelling experts across government and beyond, we’re delighted to share the Framework to review models we use to guide our approach. In creating it, we combined our in-house analytical experience and capability with the evidence and guidance set out in publications such as HM Treasury’s (HMT’s) review of quality assurance of government analytical models in 2013, and the Aqua Book: guidance on producing quality analysis for government in 2015.

    Modelling errors can be extremely costly. They can also be embarrassingly high-profile. The error in the model used to evaluate bids in the InterCity West Coast franchise competition in 2012 led to unforeseen costs to taxpayers of £54 million (as reported in our Forecasting report). It was this case that prompted HMT to commission the Macpherson review of the quality assurance of modelling, which recommended that departments put in place the right processes and culture to support quality assurance.

So what goes wrong? Typically the problems relate to a few recurring factors.

These factors – and, in some cases, the complete absence of any forecasting or modelling – were the key weaknesses found in the 71 NAO studies undertaken between 2010 and 2013, as detailed in Forecasting in government to achieve value for money.

For example, we recently published a report on training new teachers that reviewed the Department for Education's (DfE) Teacher Supply Model, designed to estimate how many initial teacher training places are needed each year. Using our framework, we found that the model was logical, had been carefully thought through and used the department's best routinely available data. But we concluded that the DfE had yet to demonstrate the model's accuracy, and that there were significant risks it would generate incorrect trainee recruitment targets because of uncertain data inputs.
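To make the risk from uncertain inputs concrete, here is a minimal sketch of how plausible ranges for each input can be propagated through a simple supply calculation. Everything in it – the formula, the figures and the variable names – is hypothetical for illustration; it is not the DfE's actual Teacher Supply Model.

```python
# Hypothetical toy model: NOT the DfE's Teacher Supply Model.
import random

random.seed(0)

def trainee_places_needed(pupil_growth, wastage_rate, returners):
    """Toy calculation: cover leavers and pupil growth, net of
    returners, converted into training places."""
    teacher_stock = 450_000                 # assumed current teacher stock
    leavers = teacher_stock * wastage_rate  # teachers leaving the profession
    extra_demand = teacher_stock * pupil_growth
    completion_rate = 0.9                   # assumed share of trainees who qualify
    return (leavers + extra_demand - returners) / completion_rate

# Propagate plausible input ranges with a simple Monte Carlo.
draws = sorted(
    trainee_places_needed(
        pupil_growth=random.uniform(0.005, 0.015),
        wastage_rate=random.uniform(0.08, 0.11),
        returners=random.uniform(14_000, 18_000),
    )
    for _ in range(10_000)
)

print(f"central estimate: {draws[5_000]:,.0f} places")
print(f"90% range: {draws[500]:,.0f} to {draws[9_500]:,.0f} places")
```

If the resulting range is wide relative to the central estimate, a single-point recruitment target is fragile – which is exactly the risk we flagged.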

Now that I've got you thinking about the outputs of your models, let me share some guidance that could help you increase your assurance over them.

In reviewing models, the first question to answer is: what's the scale of appraisal needed? This schematic from HMT's 'Review of quality assurance of Government analytical models' shows indicative types of quality assurance that might be appropriate given different levels of complexity and risk.

Schematic: indicative quality assurance for different levels of model complexity and risk
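As a purely illustrative reading of that schematic, the sketch below maps rough complexity and risk scores to indicative QA activities. The scores, thresholds and tiers here are assumptions, not the schematic's exact content; the general pattern – heavier assurance as complexity and business risk rise – is the point.

```python
# Illustrative only: the tiers and thresholds are assumptions,
# not the HMT schematic's exact content.
def indicative_qa(complexity: int, risk: int) -> list[str]:
    """complexity and risk scored from 1 (low) to 3 (high)."""
    qa = ["developer testing", "version control"]
    if complexity >= 2 or risk >= 2:
        qa.append("internal peer review")
    if complexity == 3 or risk == 3:
        qa.append("external peer review")
    if risk == 3:
        qa.append("independent model audit")
    return qa

# A complex, business-critical model warrants the full set.
print(indicative_qa(complexity=3, risk=3))
```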

Once you have answered this question, our newly published Framework to review models enables you to determine and improve the quality of your model and its outputs. The framework is split into the following seven modelling stages (key below):

Diagram: Framework to review models (seven stages)

    Key to stages:

1. To review how well the governance arrangements oversee the design, development, implementation and assurance of the model.
2. To ensure there is clarity in the reasons behind the creation of the model, and the expectations for how model output will be used.
3. To provide assurance that the model is logical, accurate and appropriate, and has been built and developed robustly.
4. To review the quality of the data in the model and assess whether it is appropriate for use.
5. To review the quality of assumptions in the model and to assess the evidence base and rationale for their inclusion.
6. To understand the drivers and tolerances of the model and to quantify uncertainty (see the first sketch after this list).
7. To assess whether forecasts receive sufficient challenge, are integrated into decision-making and risk management systems, and are compared with actual outcomes to inform future development (see the second sketch after this list).
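For stage 6, one simple way to understand a model's drivers is one-at-a-time sensitivity analysis: swing each input across its plausible range while holding the others at central values, and see which input moves the output most. This sketch reuses the hypothetical trainee_places_needed function from the earlier example; all ranges are invented for illustration.

```python
# One-at-a-time sensitivity on the hypothetical toy model above.
central = {"pupil_growth": 0.01, "wastage_rate": 0.095, "returners": 16_000}
plausible_ranges = {
    "pupil_growth": (0.005, 0.015),
    "wastage_rate": (0.08, 0.11),
    "returners": (14_000, 18_000),
}

for name, (low, high) in plausible_ranges.items():
    # Vary one input at a time, holding the others at central values.
    at_low = trainee_places_needed(**{**central, name: low})
    at_high = trainee_places_needed(**{**central, name: high})
    print(f"{name}: output swing of {abs(at_high - at_low):,.0f} places")
```

The input with the largest swing is the one whose quality assurance and evidence base most deserve scrutiny.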
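And for stage 7, comparing forecasts with actual outcomes can start very simply: track the bias and the average percentage error across successive forecasting rounds. The figures below are invented for illustration.

```python
# Hypothetical forecasts and outturns from four forecasting rounds.
forecasts = [31_000, 33_500, 30_200, 34_000]
actuals = [29_400, 32_800, 31_900, 30_700]

errors = [f - a for f, a in zip(forecasts, actuals)]
bias = sum(errors) / len(errors)  # positive = systematic over-forecasting
mape = sum(abs(e) / a for e, a in zip(errors, actuals)) / len(actuals)

print(f"mean error (bias): {bias:+,.0f} places")
print(f"mean absolute percentage error: {mape:.1%}")
```

A persistent bias in one direction is a prompt to revisit the model's assumptions at the next iteration.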

We are determined to make sure that public sector expenditure and decision-making are underpinned by high-quality modelling. Please contact us to let us know how we can improve our framework to achieve this crucial aim.

Please note, too, that this framework is the latest addition to our new Self-assessment resources web page, where you can find a range of guides and frameworks to aid your own reviews and decision-making.

     

About the author: Elliott White is an operational researcher and head of the NAO modelling discipline, part of the Methods, Economics and Statistics Hub. Elliott is currently on secondment from the Government Operational Research Service and has experience in policy analysis and business planning at the Department for Education and the Department for Environment, Food and Rural Affairs.

     


