Posted on November 4, 2016 by Simon Banner
Are you achieving what you planned to achieve? Most NAO reports stress the need to measure performance. But how do you do it and use that information to get the best you can with the resources you have? How do you avoid the risk that (only) ‘what gets measured, gets done’? And how do you measure performance when – like regulators and many others – you don’t have direct control over the outcomes, but must influence others? There are many lessons of widespread value in our new guide, Performance measurement by regulators, which focuses on the particular challenge for those who regulate the products and services so important to our daily lives, such as utilities, transport, health, food and financial services.
Good performance information is essential for the management of all organisations, and a particular challenge for regulators. Like other organisations, regulators need to know whether they are operating efficiently and effectively, and whether their resources are being targeted in the best possible way. Regulators need to know whether the providers and sectors they regulate are delivering their intended objectives and outcomes, such as protecting consumers or service users in terms of price, service quality, or safety. They also need to assess whether to make use of their regulatory powers and intervene in the sector.
But the challenge for regulators is that they can only influence: it’s the service providers who actually deliver the intended outcomes.
Of course, delivering through others is not unique to regulators, as we at the NAO are very aware. As just one example, Parliament must hold government to account for activities, and we at the NAO play a key role in informing Parliament through our audits of financial statements and value for money. We therefore know full well the challenge of measuring our impact in terms of the actions of other organisations.
So how do regulators know whether their actions have made a difference? Perhaps the outcomes would have been delivered anyway.
Regulators use a range of “tools” to set their expectations of providers’ outcomes and behaviours, such as rules and guidance. They then influence behaviour, both informally and more formally, such as through mandating actions, or sanctions for unwanted behaviour.
Although some of the tools are unique to regulators, the measurement challenge is similar for any organisation seeking to influence others’ actions. They need to understand and measure both the outcomes that providers deliver, and the effectiveness of their influence over providers and outcomes (and to develop ways of drawing information on both elements together) – taking into account “external” influences on providers’ behaviour, such as economic or financial developments in the market or the wider economy.
Because of these complications, regulators and many others sometimes revert to measuring what is measurable, rather than trying to measure the influence over outcomes, which is more difficult. But without the latter measure, there’s no comprehensive picture of performance and effectiveness – of a regulator (or other influencing organisation), or the whole sector it regulates. And without this picture, regulators may not have sufficiently early warning of hidden or emerging problems in the sector or within individual providers.
Steps to better performance measurement
Our guidance details actions regulators can take to overcome the measurement challenges; steps that can be taken by anyone delivering through influence.
Set out the objectives and outcomes you are trying to achieve, based on understanding the sector and context, and the delivery organisations.
Identify the success criteria that will show whether those objectives and outcomes are being met.
Specify and measure your ‘inputs’, ‘outputs’ and ‘outcomes’. A good understanding of the linkages will allow you to demonstrate value for money in terms of economy (the relationship between resources and inputs), efficiency (the relationship between inputs and outputs) and effectiveness (the relationship between outputs and outcomes).
Have a basket of ‘good’ indicators: Our guidance sets out criteria for ‘good’ indicators of performance for regulators (see the Appendix). Regulators, and any organisation influencing others, need to develop a body of evidence and context around indicators – not least to minimise the risk that the indicators themselves become the main motivation for providers, potentially leading to unintended consequences. You’ll get greatest confidence from a ‘basket’ of indicators that includes a mix of input / output / outcome indicators and lead / lag indicators. This will make it easier to assess overall performance against aims, and help identify hidden or emerging problems at an early stage.
Know where you have more or less influence: Developing such an understanding has two benefits: (i) you can see how well you are using the influencing techniques available to you, and (ii) it can help you identify and assess whether you have made full use of your potential influence.
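The economy / efficiency / effectiveness chain described in the steps above can be sketched in code. This is a minimal illustration, not NAO methodology: all the figures, names and thresholds below are invented for the example.

```python
# Illustrative sketch of the value-for-money chain:
# resources -> inputs (economy), inputs -> outputs (efficiency),
# outputs -> outcomes (effectiveness). All figures are fictional.

def ratio(numerator: float, denominator: float) -> float:
    """Simple ratio, guarding against division by zero."""
    return numerator / denominator if denominator else 0.0

# Invented annual figures for a fictional regulator.
resources_spent = 10_000_000   # money spent acquiring inputs
inputs_acquired = 9_500_000    # value of staff time, systems, etc.
outputs_delivered = 1_200      # e.g. inspections completed
outcomes_achieved = 1_000      # e.g. providers brought into compliance

economy = ratio(inputs_acquired, resources_spent)            # value obtained per unit spent
efficiency = ratio(outputs_delivered, inputs_acquired)       # outputs per unit of input
effectiveness = ratio(outcomes_achieved, outputs_delivered)  # outcomes per output

print(f"Economy: {economy:.2f}")
print(f"Effectiveness: {effectiveness:.2f}")
```

Tracking all three ratios together, rather than any one in isolation, is what lets the linkages between spending and results be demonstrated.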
Not surprisingly, we think transparent and accessible reporting of performance information is essential. There are no formal standards for reporting performance information – unlike information in financial statements. Regulators have therefore developed their own approaches and our guide sets out some good practice examples that we’ve identified in regulators’ annual reports. Further examples can be found in Building Public Trust Awards – Examples of good practice in annual reports 2015, summarised in our blog-post: Great ideas for annual reports.
Lessons from regulators: influencing food standards
We’ve worked with a number of regulators to help them understand and analyse their influence over the outcomes. For the Food Standards Agency (FSA), we took one aspect of their operations – efficiency – and held workshops with their staff to identify the factors affecting FSA’s efficiency. We helped them categorise these factors according to how much influence they had over the outcomes relative to external influences, and how important those factors were to their overall efficiency. This analysis revealed where the FSA could make best use of its influence and focus its efforts in becoming more efficient. The following is a simplified version of this categorisation.
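The kind of categorisation produced in those workshops can be sketched as a simple two-by-two grid of influence against importance. The factors, scores, threshold and quadrant labels below are invented for illustration; they are not the FSA's actual analysis.

```python
# Hypothetical influence/importance categorisation, in the spirit of the
# FSA workshop analysis described above. All data here is invented.

factors = {
    # factor: (influence 0-1, importance to efficiency 0-1)
    "Staff deployment model": (0.9, 0.8),
    "Local authority capacity": (0.3, 0.9),
    "Internal IT systems": (0.8, 0.4),
    "Wider food industry structure": (0.1, 0.7),
}

def categorise(influence: float, importance: float, threshold: float = 0.5) -> str:
    """Place a factor in one of four quadrants of the grid."""
    if influence >= threshold and importance >= threshold:
        return "focus effort here"
    if influence >= threshold:
        return "maintain with light touch"
    if importance >= threshold:
        return "influence others / monitor"
    return "deprioritise"

for name, (influence, importance) in factors.items():
    print(f"{name}: {categorise(influence, importance)}")
```

The value of the exercise is the conversation it structures: high-importance factors over which the regulator has little direct influence are exactly where it must work through others.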
None of us wants to be in the position of not knowing whether our hard work is making a difference. This is why many organisations appreciate our ‘value for money’ studies, seeing them as detailed assessments of whether or not their activities are achieving their objectives with economy, efficiency and effectiveness.
I welcome your comments about the issues in this post or any of our guides and reports, and invite you to contact us to discuss any of the matters raised.
Our other work on performance measurement
Our guidance builds on a long history of NAO reporting on performance measurement in government, such as the following reports:
Choosing the right FABRIC: Standing for focused, appropriate, balanced, robust, integrated and cost-effective, ‘FABRIC’ summarises the properties of good performance measurement frameworks – that is, organisations’ systems linking all the performance information available, and encompassing the people, systems and processes involved in collecting, organising and analysing that information – and the criteria for developing good indicators within those frameworks.
Taking the measure of government performance, Appendix 3, sets out a number of risks and ways to mitigate them. From data complexity to subjectivity, lack of expertise to inadequate oversight, there are risks that performance measurement systems could produce inaccurate or misleading measures.
Performance Frameworks and Board Reporting II looks at the integration of financial and performance information to ensure understanding of the link between money spent and results achieved, and enable informed Board decision-making. It also sets out a Maturity Matrix that enables organisations to assess the level of maturity of board reporting practice across the three areas above – developing a performance measurement framework, reporting and using information.
A range of other reports highlighting performance measurement matters can be found on our performance measurement web-page.
About the author: Simon Banner is an Audit Manager within the NAO and has worked in our Regulation, Consumers and Competition team on a range of value for money studies covering regulation of utilities, financial services and other sectors. Simon is currently with our Corporate Finance team.