Prepared by Chiara Giannella and Federica Pezza
The question of artificial intelligence, and of the delicate balance between promoting innovative solutions and complying with the existing legal framework, has lately become one of the most debated topics.
In particular, as regards privacy laws, the Italian Data Protection Authority (“Italian DPA”) recently issued a provisional limitation on data processing against the AI-powered chatbot “Replika”, which will therefore not be able to process the personal data of Italian users for the time being.
Some background: what is Replika, the “virtual friendship” AI chatbot
Replika is an AI-powered chatbot, developed by the US-based company Luka Inc., which generates a ‘virtual friend’ through text and video interfaces.
More specifically, according to the description provided on the two main app stores, Replika is presented as a chatbot “for anyone who wants a friend with no judgment, drama, or social anxiety involved”. Based on the same description, Replika is a customizable chatbot: it can be a friend, a romantic partner or even a mentor, as it grows (and develops its own personality and memories) along with its users.
The Italian DPA’s order: reasons behind the ban
With its order of 2 February 2023, the Italian DPA found that Replika was not in line with the Italian privacy framework, ordering Luka Inc. to immediately terminate the processing of data relating to Italian users and to inform the Italian DPA, within 20 days, of any measures taken to implement the order.
In particular, the Italian DPA’s order rests on four main grounds:
- No age verification mechanism is in place. Indeed, the chatbot simply requires users to provide their name, gender and email address. There is neither a gating mechanism for children nor any blocking/banning system for users who explicitly declare that they are underage.
- The content of the chatbot’s replies is often at odds with the protection of children and vulnerable individuals. Several reviews on the two main app stores include comments from users flagging sexually inappropriate content. Moreover, given the way it is presented (i.e. as a chatbot capable of improving users’ mood and emotional well-being), Replika is likely to affect an individual’s mood and therefore to increase the risks for the vulnerable individuals concerned.
- Lack of transparency. Under the General Data Protection Regulation (2016/679, “GDPR”), transparency is one of the key principles of data processing. Nevertheless, it does not seem to be adequately taken into account by the chatbot, which fails to disclose any information on the key elements of the processing at issue, in particular on the use of children’s personal data.
- Lack of a legal basis for the processing activity. Under Article 6 GDPR, a legal basis must always be identified for personal data to be processed. However, this is clearly not the case here. Indeed, Replika processes personal data unlawfully, since the performance of a contract cannot be invoked as a legal basis, even implicitly, given that under Italian law children are incapable of entering into a valid contract.
Conclusion and takeaways
The above-mentioned order confirms that the right balance between promoting innovative solutions and complying with the existing privacy framework is a difficult one to strike. At the same time, it is helpful insofar as it provides some guidance for companies wishing to develop or employ AI-based solutions in their business.
In line with the Italian DPA’s indications, such companies are (at the very least) required to: (i) adopt adequate age verification mechanisms; (ii) ensure that their AI models interact “appropriately” with users (including children and vulnerable individuals); (iii) disclose all the necessary information on the processing in line with Article 13 GDPR; and (iv) identify a suitable legal basis for the processing in line with Article 6 GDPR.
For a deeper discussion, please contact:
PwC TLS Avvocati e Commercialisti