THE AI GOVERNANCE AWAKENS
We’ve written about the synergies between AI governance and privacy, but things are changing faster than Han Solo can make the Kessel Run.
In October 2023, the Biden-Harris administration published an Executive Order on Safe, Secure, and Trustworthy Artificial Intelligence.
Then, on 13 March 2024, the European Parliament passed the European Union Artificial Intelligence Act (EU AI Act). The text must still go through a technical vote, and most of its rules will be phased in over the two to three years after the act enters into force, but this is nonetheless a major development in the global regulation of AI.
In addition, the OECD (Organisation for Economic Co-operation and Development), one of the most significant policy institutions in the world, has created an Expert Group on AI, Data and Privacy to bridge the gap between the AI and privacy communities. The Expert Group tracks AI incidents and maintains an AI policy database as a collective repository.
“This group is really meant to harness the synergies and learnings from both communities so that we can learn from each other and forge a way forward that is cohesive and united,” said Denise Wong, co-chair of the group and deputy commissioner of Singapore’s Personal Data Protection Commission.
On the home front, we have the South African AI Association, an industry body that promotes the responsible use of AI in South Africa.
What’s become clear is that AI and Large Language Models (LLMs) are here to stay and won’t fall out of fashion soon. (Can anyone even remember the metaverse?)
WHY ARE THESE DEVELOPMENTS IMPORTANT FOR PRIVACY PROFESSIONALS?
AI governance has two pillars: data governance and model governance. As privacy professionals, we care mostly about the data governance pillar, which involves ensuring that data collection and use are aligned with privacy notices, user consents and regulatory requirements.
If your organisation has done any compliance work under POPIA, the GDPR or another privacy regulation, you should already have a record and map of your data processing activities in place, and you should have done a data protection impact assessment on those activities. If, however, you plan to use any of that data to train an AI model, you must also ensure that:
- data subjects are aware that their data will be used to train AI models;
- you have the data subjects’ consent to use their data for this purpose;
- you remove any unnecessary unique identifiers from the dataset or implement appropriate pseudonymisation techniques; and
- you check your data for bias, representativeness, label quality and feature quality before using it in AI models (a minimal sketch of these last two steps follows this list).
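To make those last two items concrete, here is a minimal Python sketch (using pandas and the standard library) of keyed-hash pseudonymisation and a quick representation check. The dataset, column names and the `PSEUDONYMISATION_KEY` are all invented for illustration; this is a sketch of the technique under assumed inputs, not a complete compliance solution.

```python
import hashlib
import hmac

import pandas as pd

# Hypothetical training data: fabricated identifiers and a sensitive attribute.
df = pd.DataFrame({
    "id_number": ["8001015009087", "9202204800083", "7512310012089"],
    "email": ["thandi@example.com", "pieter@example.com", "naledi@example.com"],
    "gender": ["F", "M", "F"],
    "loan_approved": [1, 0, 1],
})

# Secret key for keyed hashing. In practice, keep it in a secrets manager,
# never stored alongside the pseudonymised dataset.
PSEUDONYMISATION_KEY = b"replace-with-a-securely-stored-secret"

def pseudonymise(value: str) -> str:
    """Replace a direct identifier with a keyed HMAC-SHA256 token.

    The same input always yields the same token, so records stay linkable,
    but the token cannot be reproduced or reversed without the secret key.
    """
    return hmac.new(PSEUDONYMISATION_KEY, value.encode("utf-8"),
                    hashlib.sha256).hexdigest()

# Keep one pseudonymised token for linkage; drop identifiers the model
# does not need at all.
df["record_token"] = df["id_number"].map(pseudonymise)
df = df.drop(columns=["id_number", "email"])

# Quick representation and bias checks: are groups present in reasonable
# proportions, and do outcome rates differ sharply between them?
print(df["gender"].value_counts(normalize=True))
print(df.groupby("gender")["loan_approved"].mean())
```

Why a keyed hash rather than a plain one? A plain SHA-256 of a low-entropy identifier (such as a national ID number) can be reversed simply by enumerating all possible inputs, whereas an attacker without the key cannot do that to an HMAC. Remember, too, that under both POPIA and the GDPR, pseudonymised data is generally still personal information, so the rest of your compliance obligations continue to apply.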
WHY ARE THESE DEVELOPMENTS IMPORTANT FOR INDIVIDUALS?
LLMs are frequently trained on personal information scraped from around the web, often without the permission of the individuals concerned. Meanwhile, more and more organisations are clamping down on employees using tools like ChatGPT for work after leaks of proprietary information. Several lawsuits recently filed around the world allege that Google, OpenAI and others have violated privacy laws in training and operating their AI services.
According to Ina Fried of Axios, AI is exacerbating existing threats to individuals’ privacy, and regulations are struggling to keep pace because the unique capabilities of generative AI raise much bigger concerns than the familiar aggregation of personal information sold and distributed by data brokers. AI tools can draw connections and make inferences (accurate or not) that give tech companies a detailed understanding of individuals. Ultimately, this means that we are all at risk from digital clones (clone wars trigger alert!) and deepfakes that would not just look like us but could also act and communicate like us.
VICTORY AND SACRIFICE
As privacy professionals, we must remain vigilant against the dark side of the AI wielders and persevere until balance is found.
As C-3PO said: “R2-D2, you know better than to trust a strange computer.”
Want to talk about how you can find a balance between AI and privacy governance? Get in touch!