Why we should listen to developers for AI privacy regulation


Zama’s COO Jeremy Bradley argues that as the ‘guardians of privacy’, developers are best placed to advise on cybersecurity in an AI boom, though everyone from CEOs to end-users has a role to play.

In an era where technological advancements are exponentially accelerating, the voices of those at the forefront of creating these technologies are crucial.

Recent findings from a study commissioned by Zama, a Paris-based cryptography firm specialising in fully homomorphic encryption (FHE), have cast a spotlight on a growing concern among developers: the encroaching threat of AI and machine learning to privacy.

Our study, which surveyed more than a thousand developers from the UK and US, reveals a striking insight: 53pc of developers view AI as a significant threat to privacy, nearly on par with cybercrime, which stands at 55pc. This sentiment underscores a critical shift in the threat landscape, where AI, despite its nascent stage, is rapidly emerging as a formidable challenge, closely trailing the perennial threat of cybercrime.

The financial implications of cybercrime are staggering, projected to soar to $13.82trn by 2028. However, with AI’s sophistication growing, the potential for its misuse by cybercriminals could drive these costs even higher. This is not just about financial loss; it’s about the erosion of privacy that could follow.

The value of the experts’ takes

The developers’ perspective is particularly illuminating. Tasked with integrating privacy safeguards into everyday applications, these professionals are uniquely positioned to judge the efficacy of current privacy and regulatory frameworks. Alarmingly, 98pc believe immediate action is necessary to address future privacy concerns under these frameworks. A significant 72pc feel that existing regulations are not equipped for the future, and 56pc express concern that even dynamic regulatory structures intended to adapt to technological advances could pose threats of their own.

These results underscore the importance of listening to developers, who play a vital role in safeguarding privacy within organisations as AI adoption accelerates, and the need for upcoming regulations to address the escalating risks to user privacy that developers identify.

Furthermore, the survey indicates a concerning gap: nearly a third of developers believe that regulators lack a comprehensive grasp of the technologies involved, potentially hindering the efficacy of future regulations. However, the developers surveyed also said they prioritise privacy in innovation. This entails leveraging privacy-enhancing technologies (PETs) to process data securely without compromising functionality.

As we enter an era dominated by AI-driven advancements, developers advocate for regulatory approaches that are well-informed, proactive and adaptable. It’s crucial to maintain privacy as a core principle to preserve the integrity of data-driven innovation, even amidst rapid technological progress. By integrating advanced encryption technologies and enhancing regulators’ understanding of privacy tools, we can uphold privacy and security values in the digital realm.

Everyone has a role to play

While we already see increasing efforts from regulators as the first AI-tailored policies and bills have been launched, it’s important to remember that the responsibility to protect privacy when it comes to AI should be shared and widespread. Regulators make the rules and integrate them with existing policies and aspects of society. As ‘guardians of privacy’, developers have the expertise and tools to ensure the technology is up to par. Finally, users too can play an active role in protecting their own information.

Policymakers and governments

For those charged with creating regulations that can keep pace with AI and related technologies, the key is staying knowledgeable. Constant education and learning are crucial, whether through traditional channels or through L&D partnerships with the very tech companies advancing the field. Regularly hosting external experts for training sessions, encouraging participation at technical conferences and summits, and organising technology briefs will equip regulators with the knowledge to develop truly competent policies.

Collaborations between the public and private sectors can serve the interests of both parties, with the added benefit of creating something the public can also gain from. Combining regulators' understanding of policies and frameworks with the technical know-how of private companies can, on the one hand, foster well-suited regulations that account for the technology and, on the other, ensure that new technologies arrive on the market already compliant and up to code.

Developers

As emerged from the survey responses, 79pc of developers say customers' privacy is important to them. Developers know that they hold a unique position in the fight for privacy: they have the skills and knowledge to make it happen, they understand what users want, and they are aware of the challenges of delivering while working for organisations that must operate within rules and limitations.

Their role is to champion the value of privacy in the organisations they work for and to deliver working tools by researching technologies and open-source solutions.

They are called to be proactive and creative, joining a wider community to share ideas and advancements and exploring the scope of available technologies, from more traditional security measures to the latest encryption-based solutions such as FHE. They should create easy-to-use tools that can be applied to the growing range of AI-based technologies and functions.
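To make the idea of encryption-based PETs concrete, the sketch below shows computation on encrypted data using a toy Paillier cryptosystem, which is additively homomorphic (a much weaker property than the fully homomorphic encryption Zama works on, where arbitrary computation is possible). This is an illustrative sketch only: the key sizes are far too small to be secure, and production FHE would use a dedicated library rather than hand-rolled cryptography.

```python
import math
import random

# Toy Paillier key generation. Real deployments use ~2048-bit primes;
# these tiny primes are for illustration only and offer no security.
p, q = 293, 433
n = p * q
n2 = n * n
g = n + 1                              # standard choice of generator
lam = (p - 1) * (q - 1) // math.gcd(p - 1, q - 1)   # lcm(p-1, q-1)
mu = pow(lam, -1, n)                   # valid because g = n + 1

def encrypt(m: int) -> int:
    """Encrypt m under the public key (n, g) with fresh randomness r."""
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    """Decrypt using the private values lam and mu."""
    return ((pow(c, lam, n2) - 1) // n) * mu % n

# Homomorphic property: multiplying ciphertexts adds the plaintexts,
# so a server can sum values it cannot read.
c1, c2 = encrypt(20), encrypt(22)
total = (c1 * c2) % n2
print(decrypt(total))  # 42
```

The point of the sketch is the last three lines: the party holding only ciphertexts can compute a meaningful result (here, a sum) without ever seeing the underlying data, which is the core promise developers are reaching for with PETs and, in its general form, with FHE.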

Organisations and private users

Private businesses and organisations are well-placed to take the necessary privacy-preserving steps from the start, as they have the means and workforce to integrate data privacy into the early stages of developing their technological tools. Whether for a customer database, financial information or research records, organisations can put privacy at the forefront with dedicated solutions.

Organisations can research and select the privacy-oriented technologies that best suit their needs, hire the right talent, and invest in encryption and anonymisation techniques, treating clients' and users' privacy as a core requirement rather than an afterthought.

And while private users might not have access to the same range of tools, they can follow a few simple safeguards to protect themselves when they are not relying on external organisations to do so: periodically reviewing app permissions, using strong passwords and changing them regularly, and enabling multifactor authentication wherever possible on all devices and platforms.

By Jeremy Bradley

Jeremy Bradley is the chief operating officer at Zama. He is a cross-functional and highly tactical leader who has worked with many organisations to shape strategy, drive communications and partnerships, and lead policy and process.

