08 Dec 2021

From self-regulation to legal compliance: the end of an era

Professor Luciano Floridi, Professor of Philosophy and Ethics of Information at the University of Oxford, sets out his perspective on the future of AI Governance.

If I had to choose one year to mark the beginning of the commercialisation of the Web, I would suggest 2004, when Facebook was launched, and Google held its IPO. Before then, the debate on predictable ethical problems – from privacy to bias, from illegal or unethical content to fake news and misinformation – had been mainly academic. After 2004, public concerns began to spread, and pressure mounted to improve business strategies, policies, and regulations. Self-regulation began to appear as a possible approach to deal with such pressure. The view was that the digital industry could formulate its own ethical codes and standards, and require and monitor adherence to them. It was not a bad idea. In theory, companies could develop and implement self-regulation more insightfully, quickly, and efficiently, without society having to wait for new legislation. Done properly, self-regulation could have prevented disasters, while preparing the industry for future legal frameworks. It could even have contributed to improving them. The potential for a constructive and fruitful dialogue between business and society was there. It was worth pursuing. Unfortunately, it did not work.

If I had to choose another year, this time to mark the coming of age of the self-regulatory era, I would suggest 2014, when Google established its Advisory Council (I was a member) to deal with the consequences of the ruling by the Court of Justice of the European Union on the so-called “right to be forgotten”. It was the first of many councils and boards that followed. That particular project had some success but, on the whole, the following era of self-regulation was a disappointment. The Facebook-Cambridge Analytica scandal in 2018 – predictable and preventable – and the patently ill-conceived and very short-lived Advanced Technology External Advisory Council established by Google about AI ethics in 2019 (I was a member), showed how challenging and ultimately unsuccessful self-regulation could be. Companies seemed unable or reluctant to solve their ethical issues, not necessarily in terms of resources, lobbying, and PR, but in terms of C-suite strategy to change mentalities and behaviours that were so deeply ingrained. When industry reacted to the ethical challenges posed by AI by creating hundreds of different codes, guidelines and declarations, the empty nature of self-regulation became almost embarrassing. Today, Facebook’s Oversight Board, established in 2020, is an anachronism, a late reaction to the end of an era when self-regulation failed to make a difference. It is too late because, today, the law has caught up. In particular, in the EU, the GDPR is being followed by several legislative initiatives: the Digital Markets Act, the Digital Services Act, and the AI Act, just to mention the most significant. They are likely to generate a vast Brussels effect. Legal compliance will soon replace an ethical adherence that never took off. This does not mean that companies will not have a significant role to play over and above the legal requirements. But the era of self-regulation as a default and main option is over.
It leaves behind some good work: analysis of the problems and their solutions, greater cultural and social awareness and ethical sensibilities, and even some positive contributions to legislation.

For example, the High-Level Expert Group on Artificial Intelligence (I was a member), established by the EU Commission, included industry partners and provided the ethical framework for the AI Act. However, the invitation issued by society to the digital industry to adopt a more ethical position was largely ignored. It was a missed opportunity of historical significance, very costly both socially and economically. But it is now time to accept that it did not work, and “compel them [the companies] to come in” (Luke 14:23).

Luciano Floridi1,2  

1Oxford Internet Institute, University of Oxford, 1 St Giles, Oxford, OX1 3JS, United Kingdom; 2Department of Legal Studies, University of Bologna, Via Zamboni 27/29, 40126 Bologna, Italy. Corresponding email: [email protected] 
