Second validation compendium report: 2003-06 PSA data systems

Sir John Bourn, head of the National Audit Office, reported today on the progress that Departments have made in establishing robust data systems to measure and report performance against 2003-06 Public Service Agreement targets. This report presents the overall findings from the NAO’s examination of the data systems used by 18 Departments and the cross-cutting Sure Start programme. It summarises the findings from those validations and highlights successful practices which have wider applicability and can improve the management of data systems across government.

The NAO found that Departments had made variable progress in meeting good practice principles for managing data systems. For many targets, Departments had developed and operated good systems which addressed the main risks to the reliability of reported data. But for other targets, Departments had not, at the time of our validations, developed systems that managed all the significant risks to data reliability, nor had they explained those weaknesses to readers of their public performance reports.

In 2002, a total of 18 Departments and the cross-cutting Sure Start programme announced 122 Public Service Agreement (PSA) targets for the period 2003-06, including 18 targets shared by two or more Departments. This report outlines the findings from the National Audit Office’s validation of the data systems used to monitor and report progress against all these PSA targets. We found that while 77% of data systems provided a broadly appropriate basis for measuring progress, Departments had encountered problems, to varying degrees, with around two-thirds of the data systems we examined. Our examinations therefore revealed considerable scope for Departments to take further action to ensure that data systems for all PSA targets are robust. Such actions include the following:

  • Departments should develop a more systematic approach to data quality assurance.
    For example, they could:

    • introduce a formal process of risk assessment for key performance data and, where necessary, include data quality risks in their corporate risk registers;
    • allocate clear responsibilities for data quality management, including active oversight of and challenge to systems;
    • formalise the role of Departmental statisticians and other data specialists in the quality assurance of PSA data systems to ensure standards and checks are applied consistently; and
    • develop a clear policy on disclosing data limitations when reporting outturn against all PSA targets.
  • They should plan and co-ordinate the data needs for new systems.
    Many weaknesses stem from inadequate attention to data issues when PSA targets are selected and specified. When setting PSA targets, Departments should consider their capability to measure progress and to judge when success has been achieved. Departments should define the quality of data needed for effective progress monitoring, and then assess whether existing or new data systems can best meet the requirement. This process should involve staff from the relevant business areas, statisticians and analysts, and the providers of data, whether within or outside the Department.
  • Systems must be adequately documented and updated for any significant changes.
    Clear definitions of terms, well-documented controls and unambiguous criteria for judging success enable systems to operate consistently over time and provide the foundations for making robust judgments of performance. Where Departments revise systems for PSA targets, they should update the documentation, agree major changes with HM Treasury, and explain them in their Technical Notes.
  • Managers should check that data obtained from other organisations are fit for purpose.
    Many PSA data systems rely on data produced by other organisations. Managers need to liaise with these organisations to assure themselves that the data are appropriate and that any limitations are clearly understood.
  • Departments should make users of performance data aware of limitations in underlying systems.
    When reporting progress, Departments should explain the implications of any data limitations that might affect how outturn figures are interpreted. This approach builds trust in public reporting by helping users make informed assessments of reported results.

"I welcome the fact that Government Departments have managed in many cases to establish robust data systems where measurement presented considerable difficulties. However, progress across the board in developing sound systems has been variable and Departments must overcome the difficulties and develop performance management systems that allow the full benefits of PSA targets to be realised."

Sir John Bourn

Notes for Editors

  1. Public Service Agreements (PSAs) are at the centre of Departments’ performance management systems. They are three-year agreements, negotiated every two years between each of the 18 main Departments and HM Treasury during the Spending Review process. Each PSA sets out the Department’s high-level aim, priority objectives and key performance targets. The Agreements set for 2003-06, as well as those Departments will be working towards in 2005-08, are available from HM Treasury’s website at: http://www.hm-treasury.gov.uk/documents/public_spending_and_services/publicservice_performance/pss_perf_table.cfm
  2. In his 2001 report on Government Audit and Accountability, Lord Sharman recommended that there should be external validation of Departmental information systems as a first step in a process towards validation of key published data. Following his recommendation, the Government invited the Comptroller and Auditor General in March 2002 to review the reliability of data systems underlying PSA targets at least once during the lifetime of a target. The Government also established a Treasury-led working group, which included representatives from spending departments and the National Audit Office. This group developed good practice principles for managing and validating data systems.
  3. The NAO has taken a staged approach to this new area of work. In 2003, we developed our methodology by working with five Departments on the data systems they were using for a number of their 2001-04 targets. We provided advice on how the Departments could improve their data systems to support better performance management. We then refined our approach and, in 2004, completed dry-run validations of data systems operated by seven Departments and the cross-cutting Sure Start programme for their 2003-06 PSA targets. An interim report highlighting issues arising from those validations was published in March 2005. Subsequently, in 2005 we validated data systems operated by eleven Departments. This report summarises the findings from all of these validations.
  4. Press notices and reports are available from the date of publication on the NAO website at www.nao.org.uk. Hard copies can be obtained from The Stationery Office on 0845 702 3474.
  5. The Comptroller and Auditor General, Sir John Bourn, is the head of the National Audit Office which employs some 800 staff. He and the NAO are totally independent of Government. He certifies the accounts of all Government departments and a wide range of other public sector bodies; and he has statutory authority to report to Parliament on the economy, efficiency and effectiveness with which departments and other bodies have used their resources.

PN: 23/06