

April 2021

    Better data means better services – so how can government get there?

  • Posted on April 29, 2021 by Yvonne Gallagher

    The shielding programme was a swift government wide response to identify and protect clinically extremely vulnerable (CEV) people against COVID-19.

    Our recent report on Protecting and supporting the clinically extremely vulnerable during lockdown shows how government quickly recognised the need to provide food, medicines and basic care to CEV people who were shielding. This had to be pulled together rapidly, as there were no detailed contingency plans.

    But there was a problem. To do this, government faced the urgent task of identifying the people who needed support based on existing, disparate data sources.

    Difficulties in extracting and combining data

    The urgency of this exercise was recognised by all involved, but difficulties in extracting, matching and validating data from across many different systems meant that it took time for people to be identified as CEV.

    At the start of the pandemic, there was no mechanism to allow a fast ‘sweep’ across all patients to identify, in real time, those who fell within a defined clinical category.

    It was a major challenge to identify and communicate with 1.3 million people by extracting usable data from a myriad of different NHS and GP IT systems, all holding data differently.

    This lack of joined-up data systems meant NHS Digital had to access and extract GP patient data – stored differently in each practice and holding specific details about people’s medical conditions – and merge it with its own databases. It took a huge effort by the team to complete this task in three weeks.
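    The matching task described above can be sketched in miniature: records for the same patient arrive from different systems with different field names, date formats and identifier formatting, and must be normalised to a common schema before they can be merged. This is a simplified illustration, not NHS Digital’s actual process – the field names, schemas and sample data below are entirely hypothetical.

```python
from datetime import datetime

def normalise_record(record, field_map, date_formats):
    """Map one source system's field names onto a common (hypothetical) schema,
    normalise date-of-birth strings to ISO format, and strip spaces from the
    NHS number so identifiers from different systems compare equal."""
    out = {common: record[src] for common, src in field_map.items()}
    for fmt in date_formats:
        try:
            out["dob"] = datetime.strptime(out["dob"], fmt).date().isoformat()
            break
        except ValueError:
            continue
    out["nhs_number"] = out["nhs_number"].replace(" ", "")
    return out

def merge_sources(*sources):
    """Merge normalised records from several systems, keyed on NHS number;
    later sources fill in fields that earlier ones left empty."""
    merged = {}
    for records in sources:
        for rec in records:
            merged.setdefault(rec["nhs_number"], {}).update(
                {k: v for k, v in rec.items() if v}
            )
    return merged

# Two illustrative systems holding the same patient differently
gp_system = [{"NHSNo": "943 476 5919", "DOB": "01/03/1947", "Cond": "COPD"}]
hospital_system = [{"nhs_no": "9434765919", "birth_date": "1947-03-01", "condition": ""}]

formats = ["%d/%m/%Y", "%Y-%m-%d"]
gp = [normalise_record(r, {"nhs_number": "NHSNo", "dob": "DOB", "condition": "Cond"}, formats)
      for r in gp_system]
hosp = [normalise_record(r, {"nhs_number": "nhs_no", "dob": "birth_date",
                             "condition": "condition"}, formats)
        for r in hospital_system]

patients = merge_sources(gp, hosp)
```

    Real record linkage is far harder – it must cope with missing identifiers, typos and probabilistic matching – but even this toy version shows why every extra storage format adds work, and why common standards would have made the three-week exercise faster.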

    Data issues were not resolved by the time of the second lockdown

    Following the first iteration of shielding (March–August 2020), government had identified that systems were not capable of ‘speaking’ to each other across hospital, primary care, specialist and adult social care services, and sought to apply these lessons to the second lockdown towards the end of 2020. However, our report highlighted that resolving the data issues was not an area where significant progress had been, or could be, made.

    This reflects the wider issues of data across government

    These challenges are examples of broader issues that we have previously highlighted in our report on Challenges in using data across government. People often talk about better use of data as if it were a simple undertaking. But there are significant blockers and constraints that require sustained effort to overcome, and they apply to every area of government trying to use and share data for anything other than the single purpose for which it was originally created.

    The basic issues are widely known and acknowledged:

    • Huge variability in the quality and format of data across government organisations
    • Lack of standardisation within departmental families and across organisational boundaries making it difficult for systems to interoperate
    • The extent of legacy IT systems across government further compounding the difficulties
    • Ownership and accountability aren’t easily agreed where a shared dataset of personal data is brought together and has equal value to different services.

    It’s unclear to us how calls to establish and enforce data standards are going to work in practice if existing systems can’t be modified to support them and there is no firm timetable, road map or funding commitment for replacing them.

    In our report Digital transformation in the NHS, we reported that 22% of trusts did not consider their digital records reliable, based on a self-assessment undertaken in 2017. The average replacement cycle for a patient records system is roughly once every 15 years, so this change isn’t going to happen overnight.

    Our aim is to support government in tackling these issues, and not to be critical of past failings, because we recognise that it is hard. We set out a number of recommendations in our data report and they are summarised in our accompanying data blog.

    Some are aimed at the centre of government and others are steps that individual organisations can take. Our cross-government recommendations were primarily around accountabilities, governance, funding and developing rules and common ways of doing things.

    Our recommendations for individual organisations are:

    • Put in place governance for data, including improving the executive team’s understanding of the issues associated with the underlying data and the benefits of improving that data
    • Set out data requirements in business cases. This should include an assessment of the current state of the data, and the improvements or new data that are necessary. These assessments should have an explicit consideration of ethics and safe use
    • Implement guidance for front-line staff for handling data, including standardisation, data ethics and quality.

    Organisations that need a cohesive view of their citizen/patient data must address this issue in a managed and incremental way, rather than resorting to costly one-off exercises that have to be repeated when the next need arises. This will require sustained effort and perseverance.

    Unfortunately, there are no easy shortcuts, but with a will to put in the necessary effort, progress can be made one step at a time.

    Yvonne Gallagher


    Yvonne is our digital transformation expert, focused on assessing the value for money of the implementation of digital change programmes. Yvonne has over 25 years’ experience in IT, business change, digital services and cyber and information assurance, including as CIO in two government departments and senior roles in private sector organisations, including the Prudential and Network Rail.



  • Improving services is a team game

  • Posted on April 28, 2021 by Alec Steel

    Imagine being asked to design and provide a service that meets the needs of everyone in your street. This crossed my mind when the borough where I live in London became a surge testing location for coronavirus. For me that means producing a service that works for Margaret and David, the retired couple who have lived next door for what seems like forever. But also Ed, the single parent with two teenage children across the street at number 42… and everyone else living in the 40 houses and flats in the street.

    And then think about designing services not just for your street or town, but for the whole country. No matter how well I think I know my neighbours, it’s hard for me to truly know the complications of each of their lives. Can I be certain what each of them might need from a service – let alone the rest of the country?

    I was thinking about this as I ordered a home testing kit so that I could take a coronavirus test at a time convenient for me. I had to return the test to a local site. One of the three sites was only a five-minute walk away, so it was easy for me to drop it off. Less easy, I guess, for people in other parts of the borough.

    What struck me as I dropped off my test was how easy it was to do, and how well organised – but also how everyone returning tests, or queuing to take a test, was of a similar age and ethnicity. This was despite the borough’s demographic being 30% ethnic minorities and 10% people over the age of 65. I wondered how effective the surge testing service was at reaching all people in the borough: people who spend their working day in another part of the city, people who have caring responsibilities, and all the hundreds of other personal characteristics that inform our daily lives.

    How do you design a service for such a range of circumstances? 

    That’s the huge challenge that government has to strive to get right. Whether it’s long-term policy outcomes such as achieving net zero, creating new services such as test and trace, or working out how to provide benefits in a better way. Get it right and government can be more certain of putting to good use the £456 billion it estimated it would spend on public services, grants and administering services in 2020-21. Get it wrong and not only is that money put at risk, but also the experience of the people using the services.

    I’ve been lucky to see up close how government deals with service delivery challenges. From seeing the lived experience of immigration enforcement officers and the people they come into contact with, to work coaches helping people find jobs. I’ve worked with 40 government organisations and seen how they provide over 120 services. Our recent good practice guide collects what we’ve learned in one place, sharing practical actions, questions to ask, and pitfalls and warning signs to look out for.

    We’re sharing our learning to help government think about these challenges and benefit from our insight. That tells us there are five areas to get right: 

    Adopting a whole-system approach 

    1) Aligning objectives, funding, governance and accountability  

    2) Closing the gap between policy design and service reality 

    Managing operations in your organisation 

    3) Building technical and leadership capability 

    4) Meeting diversity of users’ needs 

    5) Taking an end-to-end service perspective 

    Success isn’t based on individual heroics. It comes from different organisations, inside and outside government, central and local government, and people in headquarters and front-line roles working together. It requires all sorts of people to play their role in translating a policy idea into a lived experience for people across the country. It’s easy to see a service challenge through the lens of our own experience, our own role or our organisation’s perspective. So it’s crucial that people designing policy, deciding funding, providing front-line services, or overseeing whole sectors or policy outcomes understand how their contribution affects people using services. And then work together to make it a success.

    It’s crucial to think about questions such as:

    • How will the way I think about my part of the wider system influence the overall outcome achieved?
    • Are we clear on each other’s objectives, whether they align, and how to resolve conflicting priorities?
    • Do we have a shared understanding of the problem to fix – whom to involve in achieving it, and whom it will affect?
    • Are we making decisions based on a detailed understanding of the actual or likely impact on different types of people using our service?
    • Is performance measurement based on averages, masking service problems that affect particular groups?

    Providing services for government’s diverse range of users is not easy. But government has a better chance of getting it right if it thinks about the questions and the pitfalls from our experience. A better chance of knowing Margaret and David’s needs and how to change a service to more closely meet them.

    A summary of the main points in our good practice guide is available here 

    Alec Steel

    About the author: Alec has led our operations management specialism since 2010, and supports government thinking across the UK, Europe, USA and Australia. He has authored reports on subjects ranging from major project assurance to the use of consultants, and his assessment of operations management capability across central government in 2015 drew on learning from 32 organisations and 86 operational services.


  • Auditing government’s pandemic response

  • Posted on April 22, 2021

    Like governments around the world, ours has committed unprecedented amounts of public money to the fight against coronavirus. By the end of 2020, this reached £271 billion in the UK and will continue to increase. 

    As the UK’s independent public spending watchdog, the National Audit Office has been tracking the Government’s pandemic spending commitments, reporting to Parliament and the public on whether that money has been accounted for correctly and spent as intended. The Committee of Public Accounts has also used our findings as the basis for taking evidence from senior civil servants and for making its own recommendations. 

    One year on from the start of the pandemic, what can be learned from the way government has responded to it? 

    Like many countries, the UK was not well prepared for this pandemic. While we recognise that government cannot be expected to plan for every eventuality, we have repeatedly found that there was no contingency plan to deal with the unfolding situation. And, where plans were in place, these did not anticipate this type of pandemic. 

    Reflecting on this difficult starting point, we have independently assessed each element of the Government’s response based on what was reasonable to expect in the circumstances. 

    The urgency and scale of action required meant that ensuring value for money for the public did not always take priority. Trade-offs were necessary, which increased the risk of financial losses. So, the question we looked to answer in our audits was how well those trade-offs were understood and managed. 

    The furlough scheme, designed and implemented by HMRC and HM Treasury, and the scaling up of Universal Credit payments by the Department for Work and Pensions, were delivered at impressive speed and against a sudden and huge increase in demand.

    Yet, even with this sterling effort by the Civil Service, the speedy response has come at a cost – higher levels of fraud and error than government would have otherwise expected. 

    This increased risk of financial losses is seen most clearly in the Bounce Back Loan Scheme. The scheme has provided vital cash flow support to small and medium-sized businesses – by the end of March 2021, more than 1.5 million loans had been issued with a total value of £47 billion.

    However, when initially launched, the scheme proved slow and cumbersome for smaller businesses. In response, credit and affordability checks were removed from the process for loans of up to £50,000 and government guaranteed 100 per cent of the loans. This sped up loans and proved a lifeline for thousands of smaller businesses, but is likely to come at a cost.

    When we reported on this in November 2020, government estimated that 35 per cent to 60 per cent of borrowers may default on the loans. A better indication of the true cost to the public purse will begin to emerge from May when the first repayments are due, although businesses are able to apply to defer this. 
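    To give a rough sense of scale, the figures above can be combined in a back-of-the-envelope calculation. This is illustrative arithmetic only, not an NAO loss forecast: it assumes the full balance of a defaulted loan is lost, ignoring any recoveries.

```python
# Figures from the post: £47bn of Bounce Back loans issued by March 2021,
# and a government estimate (November 2020) that 35% to 60% of borrowers
# may default. Government guaranteed 100% of each loan.
loans_issued_bn = 47.0
default_low, default_high = 0.35, 0.60

# Implied exposure range for the public purse, under the simplifying
# assumption that a defaulted loan is a total loss.
loss_low_bn = loans_issued_bn * default_low    # ~ £16.5bn
loss_high_bn = loans_issued_bn * default_high  # ~ £28.2bn
```

    Even the lower bound is a substantial sum, which is why the repayment data emerging from May onwards matters so much.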

    Those involved in the procurement of personal protective equipment (PPE) faced the considerable challenge of an overheated global market for PPE and an inadequate UK stockpile. Necessary trade-offs, to allow for rapid acquisition of life-saving equipment, were less well managed. At the height of the emergency, it was reasonable to place urgent orders directly with suppliers, rather than use slower competitive tendering methods. But even allowing for the urgency of the situation, essential standards of government transparency were not consistently met. This includes how some suppliers were picked and how a high priority channel for considering certain suppliers was created. 

    In emergency situations where the assurance provided by open competition is not available, it is even more important to provide prompt and full transparency to maintain public trust in how taxpayers’ money is being used. 

    The public health response to the virus has required government to create and deliver a vaccine programme and a test and trace operation at a scale and pace never seen before. The success of the UK vaccine programme is based on shrewd investments in candidate vaccines, brilliant scientists, effective commercial agreements with industry and the delivery power of a National Health Service bolstered by an army of volunteers. Public money had to be committed when there was no guarantee of vaccine effectiveness and this risk was managed well, making good use of relevant scientific and commercial expertise. 

    As the vaccine roll-out progresses and as lockdown measures lift, NHS Test and Trace’s role in identifying and suppressing outbreaks will become more vital. Our initial work on Test and Trace in December found that it had achieved a rapid scale-up in activity and had built much new infrastructure and capacity from scratch. However, we also highlighted value for money concerns and weak evidence of the effectiveness of the service. 

    Test and Trace’s operations will transfer to the new UK Health Security Agency, which is expected to become fully operational later in the year. We will report again on the progress Test and Trace has made in the summer. 

    There is much to be learned already from the pandemic. To promote transparency, government must clearly define its appetite for, and tolerance of, risk, particularly under emergency spending conditions. Uncompetitive procurement practices must not be allowed to become a new norm. Government should also monitor how Covid-19 programmes are operating, dynamically update demand forecasts, and ensure it has the ability to flex its response.

    Reporting on the Government’s ongoing response to the pandemic will remain a priority for the NAO. Our upcoming work will include a review of the role of Greensill Capital in Covid-19 loan schemes. We will also publish a series of lessons learned reports starting in May, designed to be of value for the remainder of this pandemic and to help the UK better prepare for future emergencies. 

    This article was first published in the Daily Telegraph



  • What should we ask ourselves to break the cycle of poor programme performance?

  • Posted on April 20, 2021

    We are keen to support those responsible for both delivering and overseeing government programmes. In that spirit, we today publish our Framework to review programmes, summarising what we ask when auditing major programmes. It draws on our experience of around 200 studies and includes examples of what we have seen.

    Our work provides insights on the challenges in getting government programmes right. Programmes often encounter difficulties, take longer and cost more than planned, and don’t deliver the intended aims, with significant and high-profile consequences. Our unique remit means we can reflect on this across the lifecycle of different programmes, whether they aim to deliver long-term government ambitions (e.g. net zero);  infrastructure (e.g. High Speed Two); or processes (e.g. offender rehabilitation). 

    We repeatedly see similar problems, many of which have their root causes within the programme scope; cost and schedule planning; interdependencies; and oversight. In November, we published Lessons learned from Major Programmes, which examined these root causes and what government can learn from them. This included ensuring the management approach evolves over time and the need to consider operational planning from the start. These lessons will relate, to a greater or lesser extent, to all programmes. For example, ambitious technology-enabled business change may mean digital programmes need to handle greater uncertainty, which we will report on later this year.

    These lessons are not new. You will have heard them before and organisations like the Infrastructure and Projects Authority and Association for Project Management share similar insights. For example, the IPA’s Principles for project success outlines the need to plan realistically, tell it like it is and control the scope. However, given this consistency of thinking, why are things so hard to change? What can be done?  

    Although not necessarily a silver bullet, we see real value in those managing and overseeing programmes asking themselves seemingly simple and straightforward questions, particularly early in a programme. Our framework shares 18 possible questions across four themes – purpose; value; set-up; and delivery and variation management.

    In terms of purpose, you want to ask – Is it clear what objective the programme is intended to achieve? Does the programme make sense in relation to the organisation’s strategic priorities?

    HM Treasury guidance requires departments to establish a strategic case setting out how a programme will meet an underlying rationale and objectives. However, we often see bodies struggle to maintain a clear focus on a programme’s objectives. Government major programmes are likely to have multiple objectives, sometimes involving more than one department, and we have seen a lack of coherence and prioritisation. For example, in November 2020 we considered progress in Achieving government’s long-term environmental goals. Government’s plan bringing together commitments and aspirations did not provide a clear and coherent set of objectives, so it was hard to determine how the ambitions related to pre-existing targets and to each other.

    A clear scope and objectives enable government to make trade-offs and better understand the impact of other projects and programmes. Our report on improving the A303 between Amesbury and Berwick Down found that the project could only fully deliver its strategic objectives as part of a completed A303/ A358 corridor. This required further projects, with five of these being individually appraised as low to poor value for money.  

    In terms of value, you may want to ask – Are cost and duration estimates appropriate to the stage of the programme, with risks and uncertainties appropriately reflected?  

    Does the programme have a plan to deliver benefits and is this being implemented? 

    We have reported how the relative lack of information early in a programme means estimates will be highly uncertain. In many programmes we reviewed, government has not sufficiently recognised these inherent uncertainties and risks. We have seen how using these estimates as targets can drive behaviours detrimental to successful delivery.

    We highlighted this in our report on High Speed Two progress, whilst the framework references our early review of the risks faced by Parliament’s Restoration and Renewal. This recognised that uncertainties needed to be understood, and recommended developing evidence-based cost and time ranges with milestones setting out when estimates could be reassessed and the ranges narrowed. This drew from our survival guide, which offers senior decision makers factors to consider when challenging costs.

    Our set-up questions include – Does the programme have the right culture and leadership with the necessary authority and influence? Are key risks identified, understood and addressed? Whilst on delivery and variation management – Is the programme sufficiently flexible to deal with setbacks and changes in the operating context? Does the programme have a clear plan for transfer to operations/business as usual? 

    With COVID-19, EU Exit and net zero, government is undertaking more programmes. And as the complexity of these, and the overarching landscape increases, getting the basics right has never been more important. There have been notable improvements, and the government’s aim to set out requirements and best practice guidance is a good thing. Now is the time to reflect and learn from the past. Programme delays and increased costs have repercussions for the programme itself, other programmes and overall expenditure at a time when resources are constrained.  



  • About the NAO blog

    Our experts share their views about issues and common challenges facing government, what public sector leaders should look out for and how organisations have addressed issues. Our posts draw together threads from across our reports, share secrets spilled in events and reveal our experts’ expectations for the future.

    We encourage comments that support the exchange of ideas for improvement, but ask that those posting are respectful of others.
