Posted on June 1, 2021 by Daniel Lambauer
In our increasingly digital and automated world, certain buzzwords are taking centre stage in the public sector. One of them is “artificial intelligence”. While the concept and development of artificial intelligence are not new (it was first recognised as a formal discipline in the mid-1950s), the term has been thrown around more casually, and sometimes carelessly, in the public sector in recent years.
Traditional algorithms vs machine learning models
These days, data scientists normally associate artificial intelligence with systems based on machine learning models. Machine learning models deploy methods that develop rules from input data to achieve a given goal.1 This differs from what you might call traditional algorithms. Traditional algorithms don’t need data to learn; they simply churn out results based on the rules built into them.
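To make the distinction concrete, here is a minimal sketch in Python. All of the data, thresholds and function names are invented for illustration: a traditional algorithm applies a decision rule written by hand, whereas a machine learning approach derives its rule from example data.

```python
# Traditional algorithm: the decision rule is fixed in advance by a person.
def passes_hardcoded(score: float) -> bool:
    """Hand-written rule: a score of 50 or above passes."""
    return score >= 50

# Machine learning (toy version): the rule is derived from labelled data.
def learn_threshold(examples: list[tuple[float, bool]]) -> float:
    """Pick the threshold that best separates passes from fails in the
    training data -- a deliberately simple 'model fitting' step."""
    candidates = sorted(score for score, _ in examples)
    best_t, best_correct = candidates[0], -1
    for t in candidates:
        correct = sum((score >= t) == passed for score, passed in examples)
        if correct > best_correct:
            best_t, best_correct = t, correct
    return best_t

# Invented training data: (score, passed?) pairs.
training = [(30, False), (45, False), (55, True), (70, True), (80, True)]
learned_t = learn_threshold(training)  # the rule now comes from the data

def passes_learned(score: float) -> bool:
    return score >= learned_t
```

The hard-coded rule never changes unless someone edits it; the learned rule depends entirely on the training data, which is why data quality matters so much for machine learning.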
Traditional algorithms have been used in the public sector for some time to make decisions. The latest example to make the headlines was the model determining A level exam results last summer. From an auditing perspective, as the basis of such algorithms is usually transparent, auditing them is something we as a public audit institution are used to.2
But artificial intelligence that is based on machine learning is different – it has only been (cautiously) employed in the public sector in recent years.
It is different because, firstly, for a machine learning model to learn it needs good-quality data – and often a lot of it. Our report on the challenges of using data across government has shown that this condition is not always met.
Secondly, machine learning models can be quite costly to develop and deploy. Moreover, the benefits are not always guaranteed or immediately realisable. In a public sector context with tight budgets, the willingness to put money behind them may not always be there.
The reason for this is related to a third point. It is not always certain in advance what the machine will learn and therefore what decision-making rules it will generate. This makes it hard to state the immediate benefits. Much of the progress in machine learning has been in models that learn decision-making rules that are difficult to understand or interrogate.
Lastly, many of the decisions affecting people’s lives that artificial intelligence models would support pertain to personal circumstances and involve personal data, such as health, benefit or tax data. Whilst the personal data protection landscape has strengthened in recent years, the organisational regulatory structures and relevant accountabilities for the use of personal data in machine learning models are not always in place.3 Public sector organisations are therefore at risk of inadvertently falling foul of evolving data protection standards and expectations.
How to audit public sector machine learning models
Given all these challenges, it may not be surprising that in our public audit work we have not come across many examples of machine learning models being used in decision-making. But there are examples4 and we expect their number to grow in the future.
We have therefore teamed up with other public audit organisations in Norway, the Netherlands, Finland and Germany, and produced a white paper and audit catalogue on how to audit machine learning models. You can find it here: Auditing machine learning algorithms (auditingalgorithms.net).
As the paper outlines in more detail, we identified the following key problem areas and risk factors:
- Developers of machine learning models often focus on optimising specific numeric performance metrics. This can lead them to neglect other requirements, most importantly around compliance, transparency and fairness.
- The developers of machine learning models are rarely the same people who own the model within the decision-making process. But the ‘product owners’ may not communicate their requirements to the developers – which can lead to machine learning models that increase costs and make routine tasks more, rather than less, time-consuming.
- Often public sector organisations lack the resources and/or competence to develop machine learning applications internally and therefore rely on external commercial support. As a result, they may take on a model without understanding how to maintain it or how to ensure it is compliant with relevant regulations.
We also highlighted what auditors need in order to meaningfully audit artificial intelligence applications:
- They need a good understanding of the high-level principles of machine learning models
- They need to understand common coding languages and model implementations, and be able to use appropriate software tools
- Due to the high demand on computing power, machine learning supporting IT infrastructure usually includes cloud-based solutions. Auditors therefore also need a basic understanding of cloud services to properly perform their audit work.
Our audit catalogue sets out a series of questions that we suggest auditors should use when auditing machine learning models. We believe it will also be of interest to the public sector bodies we audit that employ machine learning models. It will help them understand what to focus on when developing or running machine learning models. As a minimum, it gives fair warning of what we as auditors will be looking for when we come to audit your models!
1 In fact there are two main classes of machine learning models. Supervised machine learning models attempt to learn from known data to make predictions; unsupervised machine learning models try to find patterns within datasets in order to group or cluster them.
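As a toy illustration of the distinction in the footnote above (all data and names invented for the example), a supervised model learns from labelled examples to classify new points, while an unsupervised one groups unlabelled values with no labels at all:

```python
# Supervised: learn a class centre from labelled (value, label) pairs,
# then classify a new point by the nearest centre.
def supervised_predict(train, x):
    groups = {}
    for value, label in train:
        groups.setdefault(label, []).append(value)
    centres = {lab: sum(v) / len(v) for lab, v in groups.items()}
    return min(centres, key=lambda lab: abs(centres[lab] - x))

# Unsupervised: two-means clustering of unlabelled values -- the model is
# only told to find two groups, never what the groups mean.
def unsupervised_cluster(values, iters=10):
    c1, c2 = min(values), max(values)  # assumes at least two distinct values
    for _ in range(iters):
        g1 = [v for v in values if abs(v - c1) <= abs(v - c2)]
        g2 = [v for v in values if abs(v - c1) > abs(v - c2)]
        c1, c2 = sum(g1) / len(g1), sum(g2) / len(g2)
    return sorted(g1), sorted(g2)

# Invented data: labelled pairs for the supervised case.
train = [(1.0, "low"), (2.0, "low"), (8.0, "high"), (9.0, "high")]
supervised_predict(train, 7.5)   # classifies 7.5 as "high"
unsupervised_cluster([1, 2, 8, 9])  # finds the groups [1, 2] and [8, 9]
```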
2 See for example our Framework to review models – National Audit Office (NAO) Report to understand more about what we look out for when auditing traditional models and algorithms. We currently have some work in progress that aims to take stock of current practices and identify the systemic issues in government modelling which can lead to value for money risks.
3 In the UK the Information Commissioner’s Office has published guidance on the use of personal data in artificial intelligence: Guidance on AI and data protection | ICO
4 For some UK examples see: https://www.gov.uk/government/collections/a-guide-to-using-artificial-intelligence-in-the-public-sector
About the author:
Daniel Lambauer joined the NAO in 2009 as a performance measurement expert and helped to establish our local government value for money (performance audit) team. He is the Executive Director with responsibility for Strategy and Resources. As part of his portfolio, he oversees our international work at executive and Board level and has represented the NAO internationally at a range of international congresses. He is also the NAO’s Chief Information Officer and Senior Information Responsible Owner (SIRO). Before joining the NAO, Daniel worked in a range of sectors in several countries, including academia, management consultancy and the civil service.
Posted on August 27, 2020 by Daniel Lambauer
COVID-19 continues to have a significant impact on the work of the National Audit Office, including our international work. Our team has continued to complete our international assignments successfully from the UK. We are actively exchanging experiences with other national audit agencies (also known as Supreme Audit Institutions – SAIs) on how to audit governments’ response to COVID-19 across the world, which provides us with valuable insight to strengthen our UK audit response.
The objectives of our international work are three-fold – we want to use international good practice to improve our UK focused work; we want to enhance the UK’s reputation by showcasing the quality of our public audit work worldwide; and we want to protect the UK taxpayer’s interests overseas by bidding to audit international organisations that receive UK funds, or by providing training to SAIs in countries that receive UK aid.
The NAO published its new strategy in June, so we were already planning how international activities could best contribute to our mission as the UK’s independent public spending watchdog. As Gareth Davies, the Comptroller and Auditor General, recently described, the NAO is undertaking a substantial audit programme on the government’s response to the COVID-19 pandemic. From the start, we were clear that UK audit coverage of the pandemic would be stronger by including global insights. People who read our reports see media coverage contrasting how other countries are responding and want an objective comparison between these responses. We can also learn lessons from other countries that can help the UK to better respond to this pandemic.
The NAO has a global reputation as a leading SAI, but we strongly believe there is always more we can learn from others on how best to provide a modern public audit service. Having good links with other SAIs also allows us to make better use of international comparisons in our reports – particularly in key areas such as delivering major infrastructure and defence projects – providing Parliament with insights on how other governments approach the same challenges facing our own.
As we contacted other SAIs it was obvious our situation was not unique. As well as considering the specific nature and unprecedented scale of the pandemic, SAIs around the world were all thinking about how to audit the same thing at the same time. SAIs have a unique cross-government perspective and an independent evidence base that many other commentators do not. We are expected to audit government’s use of public funds but there is a lot to consider. How do we time our interventions so as not to compromise the emergency response, whilst ensuring full accountability for the use of public money? What type of audit product is useful at different stages of the response?
NAO staff have shared insights on the pandemic in webinars organised by the International Organisation of Supreme Audit Institutions (INTOSAI). We also established a new European Organisation of Supreme Audit Institutions (EUROSAI) Project Group on Auditing the response to the COVID-19 pandemic, which we are co-chairing with SAI Finland. As part of this, SAIs from 30 European countries have agreed to coordinate and communicate COVID-19 work; share audit approaches, information and outputs; and scope content for any future lessons learned reports. In June and July we co-hosted eight online meetings where 30 SAIs set out the impact of COVID-19, their audit response, and what they wanted to share and learn. We will use these insights, and further information exchanges on specific topics, in our own reports to Parliament.
This isn’t easy, and each SAI has a different answer as every country’s context is different. However, what has quickly become apparent is that government responses around the world are similar: preparing, responding and then handling the recovery and long-term impacts of the pandemic. This involves significant expenditure in healthcare, wider emergency response measures and supporting individuals, businesses and the economy.
Since the pandemic started, operating internationally hasn’t been straightforward. In March we recalled our people working overseas at short notice and like everyone we have had to adapt to working online. Our work has continued successfully. We completed our international audits, including the World Intellectual Property Organisation, the Organisation for the Prohibition of Chemical Weapons, and the Pan American Health Organisation. The international bodies we audit will have to consider new ways of working in response to the pandemic. To support them in this, our audit reports provided an important and independent perspective on the decisions they have to make on future strategic planning, more efficient and effective processes, and on making better use of the resources provided to them. We will also share experiences with other SAIs auditing UN organisations at the UN Panel of External Auditors this November.
Learning how public servants around the world have responded to the pandemic, many in countries with fewer resources than the UK, provides valuable comparisons. It makes us even more grateful to those on the front line of the UK’s response.
The aim of everything we do at the NAO is to help Parliament hold government to account for its use of public money and to help improve public services. The nature of public audit means we first need to look backwards to understand what happened, so we can then look forwards to recommend how to make things better. By working internationally, the NAO is also looking outwards to help us understand how others are meeting the same challenges the UK faces.
Authors: Daniel Lambauer, Kevin Summersgill, Damian Brewitt
Daniel Lambauer joined the NAO in 2009 as a performance measurement expert and helped to establish our local government value for money (performance audit) team. He is the Executive Director with responsibility for Strategy and Resources. As part of his portfolio, he oversees our international work at executive and Board level and has represented the NAO internationally at a range of international congresses. Before joining the NAO, Daniel worked in a range of sectors in several countries, including academia, management consultancy and the civil service.
Kevin Summersgill joined the NAO’s value for money (performance audit) specialism in 2005. He has audited a wide range of public policy areas and as our Head of International Relations and Technical Cooperation routinely represents the NAO internationally. A specialist in continuous improvement and management systems thinking, he has advised governments and United Nations organisations around the world on how to increase effectiveness, transparency and accountability.
Damian Brewitt is a Chartered Accountant and Director of our international audit portfolio, leading financial and performance audit teams across the NAO’s portfolio of international external audit engagements. He has spent 16 of his 28 years of public sector external audit undertaking financial and performance audit of organisations across the international system. He has specific expertise in IPSAS, international governance and risk management. He supports the NAO and our stakeholders in providing sector insight, leveraged from his experience and the work of our teams across our international client portfolio. He has chaired the Technical Group of the UN Panel of External Auditors for the last two years.