Power and responsibility for big tech: a last chance for self-regulation?


In the late 1990s and early 2000s, the EU and the US legislated for an online ‘liability exemption’ under which websites and online platforms are broadly not held liable for the content or products that their customers and users upload to their sites. This approach was replicated globally and has been key in allowing platforms built on user-generated and user-uploaded content, such as YouTube, eBay and Twitter, to grow, flourish and consolidate. It has also been criticised by more traditional media outlets, which do not enjoy the same immunity.

The European Commission’s recent communication on “tackling illegal content online” demonstrates how this consensus is now coming under increasing strain. The guidelines constitute a painstaking argument that companies can, and should, more effectively police the content on their platforms, and that they should not hide behind their ‘liability exemption’. This responds to pressure from European governments, which assert that large US platforms are not doing enough to police the content they host, particularly with regard to terrorist activity and hate speech, or breaches of intellectual property. This interpretation has been strongly contested by the companies, which point to their own fact-checking and self-regulatory initiatives.

So why is the ‘liability exemption’ consensus beginning to break down now? This question, and others on the responsibilities of platforms, were discussed at an event hosted by Global Counsel with Marietje Schaake MEP, ALDE Coordinator on the INTA Committee, Werner Stengg, Head of Unit at DG CONNECT for Online Platforms, and Orit Koppel, CEO of the Jimmy Wales Foundation.

Regulators and politicians no longer buy into slogans such as “don’t be evil”, and recognise that tech companies have transformed from the unprofitable organisations that marked the development of the internet in the early 2000s. Facebook and Google are amongst the world’s largest companies; they have successfully commercialised their huge customer bases and often hold leading or dominant market positions, for example in online advertising. The Commission’s communication emphasises that platforms have developed “closer links between users and content – notably for targeted advertising”. The expectation is clear: platforms cannot, on the one hand, deny any responsibility for the content they host when, on the other, they are selling adverts adjacent to this very same content, creating billions of euros worth of profit in the process.

This is the Commission’s way of saying that with great economic power comes great responsibility, particularly where a business model that relies on ‘clicks’ encourages the development of what Marietje Schaake describes as ‘junk news’. While the EU institutions have, for now, stopped short of proposing to abolish or amend the ‘liability exemption’, the implicit threat to do so runs throughout the communication. The spotlight is now on whether big tech can show self-regulation works – by removing illegal or harmful content more quickly and efficiently than ever, without overly restricting free speech – or whether the cherished ‘liability exemption’ could be at risk.



The views expressed in this note can be attributed to the named author(s) only.