Sir John Bourn, head of the National Audit Office, reported today on the progress that Departments have made in establishing robust data systems to measure and report performance against 2003-06 Public Service Agreement targets. This report presents the overall findings from the NAO’s examination of the data systems used by 18 Departments and the cross-cutting Sure Start programme, summarising the results of those validations and highlighting successful practices which have wider applicability and can improve the management of data systems across government.

The NAO found that Departments had made variable progress in meeting good practice principles for managing data systems. For many targets, Departments had developed and operated good systems which addressed the main risks to the reliability of reported data. But for other targets, Departments had not, at the time of validation, developed systems that managed all the significant risks to data reliability, nor had they explained those weaknesses to readers of their public performance reports.

In 2002, a total of 18 Departments and the cross-cutting Sure Start programme announced 122 Public Service Agreement (PSA) targets for the period 2003-06, including 18 targets shared by two or more Departments. This report outlines the findings from the National Audit Office’s validation of the data systems used to monitor and report progress against all these PSA targets. We found that while 77% of data systems provided a broadly appropriate basis for measuring progress, Departments had encountered problems, to varying degrees, with around two-thirds of the systems we examined. Our examinations therefore revealed considerable scope for Departments to take further action to ensure that data systems for all PSA targets are robust, including the following:

  • Departments should develop a more systematic approach to data quality assurance.
    For example, they could:
    • introduce a formal process of risk assessment for key performance data and, where necessary, include data quality risks in their corporate risk registers;
    • allocate clear responsibilities for data quality management, including active oversight of and challenge to systems;
    • formalise the role of Departmental statisticians and other data specialists in the quality assurance of PSA data systems to ensure standards and checks are applied consistently; and
    • develop a clear policy on disclosing data limitations when reporting outturn against all PSA targets.
  • They should plan and co-ordinate the data needs for new systems.
    Many weaknesses stem from inadequate attention to data issues when PSA targets are selected and specified. When setting PSA targets, Departments should consider their capability to measure progress and to judge when success has been achieved. Departments should define the quality of data needed for effective progress monitoring, and then assess whether existing or new data systems can best meet that requirement. This process should involve staff from the relevant business areas, statisticians and analysts, and the providers of data, whether from within or outside the Department.
  • Systems must be adequately documented and updated for any significant changes.
    Clear definitions of terms, well-documented controls and unambiguous criteria for judging success enable systems to operate consistently over time and provide the foundations for making robust judgments of performance. Where Departments revise systems for PSA targets, they should update the documentation, agree major changes with HM Treasury, and explain them in their Technical Notes.
  • Managers should check that data obtained from other organisations are fit for purpose.
    Many PSA data systems rely on data produced by other organisations. Managers need to engage with these organisations to assure themselves that the data are appropriate and that any limitations are clearly understood.
  • Departments should make users of performance data aware of limitations in underlying systems.
    When reporting progress, Departments should explain the implications of any data limitations that might affect how outturn figures are interpreted. This approach builds trust in public reporting by helping users make informed assessments of reported results.

"I welcome the fact that Government Departments have managed in many cases to establish robust data systems where measurement presented considerable difficulties. However, progress across the board in developing sound systems has been variable and Departments must overcome the difficulties and develop performance management systems that allow the full benefits of PSA targets to be realised."

Sir John Bourn
