Promoting the transparent use of algorithms in the public sector: takeaways from the UK experience
Prepared by Chiara Giannella and Federica Pezza
Governments and public bodies around the world are increasingly turning to algorithms to support decision-making and deliver public services. Algorithmic tools are potentially highly beneficial insofar as they can speed up the analysis of large amounts of data and, in doing so, promote a better interaction between the public sector and individuals.
On the other hand, issues may arise as to the actual transparency and accountability of these instruments, with the consequence that individuals may be faced with automated decisions whose reasons are, at the very least, questionable. For example, the Italian Data Protection Authority (“IDPA”) recently started an investigation into the legitimacy of the algorithm used by the Veneto Region to determine the scheduled times and the priority class of health services. The IDPA’s decision is based on the fact that this case is likely to involve the large-scale processing of particularly sensitive data, such as health data, concerning a significant number of patients. The Region will now have to reply to the IDPA providing all of the information relevant to its assessment, including inter alia the legal basis of the processing, the type of algorithm used, as well as the types of personal data and clinical documents involved in the processing.
In this context, the joint initiative of the Central Digital and Data Office (“CDDO”) and the Centre for Data Ethics and Innovation (“CDEI”) in the UK is therefore to be welcomed. On 5 January 2023, as part of the National Data Strategy, the two bodies launched the “Algorithmic Transparency Recording Standard Hub”, with the aim of providing a framework for public sector organizations to share information on their use of algorithmic tools with the general public and other interested stakeholders. More specifically, the Hub is made up of:
- The Algorithmic Transparency Recording Standard (the “Standard”);
- The Guidance on using and completing the Standard; and
- A collection of published transparency reports.
Through the Hub, public organizations willing to use algorithmic tools when performing their functions are now given the opportunity to access (and use) the Standard, which was developed by the CDDO and the CDEI in collaboration with civil society groups and external experts and is made available online in a user-friendly format. The Standard takes the form of a Word template made up of five separate sections, in which organizations have to fill in certain information, including (i) a description of the algorithm and the way it works, (ii) an explanation of how it is integrated into the decision-making process and of the influence it has on the final decision, as well as (iii) information on the impact assessments conducted and the corresponding findings. In addition, in line with the Standard, where personal and/or sensitive data is stored and accessed, the mechanisms used to protect the security and privacy of that data should be described.
Moreover, since the Standard has already been piloted by various public bodies, such as the Department of Health and Social Care and the Food Standards Agency, the first reports produced with this tool are already available on the Hub. This offers an opportunity both for public bodies to evaluate and benchmark the algorithms they use and for the launching institutions to update and refine the Standard accordingly.
The UK experience therefore demonstrates that the adoption of adequate reporting standards may be an important tool to mitigate the risks inevitably connected to the use of algorithms in the public sector, thereby contributing to efficient and transparent decision-making. At the same time, the CDDO and CDEI’s initiative calls for the elaboration of adequate reporting standards in the EU and in Italy as well, where this role could be played, for example, by the Agency for Digital Italy (AgID).
Let’s Talk
For a deeper discussion, please contact:
PwC TLS Avvocati e Commercialisti