Jeremy Wright was the first of five UK ministers tasked with pushing through the UK government’s landmark legislation on internet regulation, the Online Safety Bill. The current UK government likes to call its initiatives “globally revolutionary,” but for a brief period in 2019 that label might actually have fit. Back then, three prime ministers ago, the bill – or at least the white paper that would form the basis of it – outlined an approach that recognized that social media platforms had become de facto arbiters of what constituted acceptable speech across much of the internet, but that this was a responsibility they didn’t necessarily want and weren’t always able to fulfill. Tech companies were pilloried for the things they missed, but also, by free speech advocates, for the things they suppressed. “There was a growing realization that self-regulation wouldn’t be viable for much longer,” Wright says. “And therefore governments had to be involved.”
The bill aimed to define a way to handle “legal but harmful” content – content that was not explicitly against the law but which, individually or in aggregate, posed a risk, such as health care misinformation, messages encouraging suicide or eating disorders, or political disinformation that could undermine democracy or create panic. The bill had its critics, including those who feared it would give too much power to Big Tech. But it was widely praised as a thoughtful attempt to address a problem that was growing and changing faster than politics and society could adapt. In his 17 years in Parliament, Wright says, “I’m not sure I’ve seen anything in terms of potential legislation that has enjoyed such broad political consensus.”
After finally passing both houses of the UK Parliament, the bill received royal assent today. It is no longer a world first: the European Union’s rival Digital Services Act came into force in August. And the Online Safety Act arrives as a broader and more controversial piece of legislation than the one Wright championed. The law’s more than 200 clauses cover a wide range of illegal content that platforms will have to deal with and give platforms a “duty of care” over what their users, particularly children, see online. Some of the more nuanced provisions on harm caused by legal but harmful content have been watered down, and a highly controversial requirement has been added for messaging platforms to scan users’ messages for illegal material, such as child sexual abuse content – a measure that tech companies and privacy advocates say is an unwarranted attack on encryption.
Businesses, from large tech companies to small platforms and messaging apps, will have to comply with a long list of new requirements, starting with verifying the age of their users. (Wikipedia, the eighth most visited website in the UK, has said it will not be able to comply with that rule because it violates the Wikimedia Foundation’s principles on collecting data about its users.) Platforms will need to prevent young users from encountering age-inappropriate content such as pornography, cyberbullying, and harassment; publish risk assessments on potential dangers to children on their services; and give parents easy ways to report concerns. Sending threats of violence, including threats of rape, online will now be illegal, as will assisting or encouraging self-harm online or sharing deepfake pornography, and companies will need to act quickly to remove such content from their platforms, along with fraudulent advertising.
In a statement, UK Technology Secretary Michelle Donelan said: “The bill protects free speech, empowers adults and will ensure platforms remove illegal content. But protecting children is at the heart of this bill. I would like to thank the campaigners, parliamentarians, abuse survivors and charities who have worked tirelessly, not only to get this law over the finishing line, but to ensure that it will make Britain the safest place in the world to be online.”
Enforcement of the law will fall to the UK’s telecoms regulator, Ofcom, which said in June that it would begin consulting with industry once the bill received royal assent. Enforcement is unlikely to start immediately, but the law will apply to any platform with a significant number of users in the UK. Companies that fail to comply with the new rules face fines of up to £18 million ($21.9 million) or 10 percent of their annual turnover, whichever is greater.