The UK government has now published the final version of the Online Safety Bill. Following the pre-legislative scrutiny by a joint select committee of parliament, which I chaired, the bill has been introduced in the House of Commons and is expected to be debated before the end of May. This is a world-leading piece of legislation, creating an independent regulator that will set safety standards online, protecting users from harmful content found on social media platforms and through search engines.
At its heart is the principle that what is illegal offline will be regulated online. This is our chance to end the internet wild west and to base safety standards on our laws, rather than on the rules the Big Tech companies have created for themselves. These companies will now have an obligation to proactively mitigate illegal content, such as child abuse or terrorist material, as well as being required to act against content such as hate speech, the incitement of violence and the promotion of known frauds and scams. The Bill will also include new communications offences recommended by the Law Commission. These address growing concern about the promotion of self-harm and suicide, cyberflashing and the sharing of knowingly false content with the intent of causing physical or serious psychological harm to a target audience. Ofcom, the UK media regulator, will oversee the safety regime and will be able to audit the service providers in scope of the Bill to ensure they are complying with their obligations. If companies fail to do so, they could be fined up to ten per cent of their global annual revenues.
The UK Online Safety Bill establishes specific protections for content of democratic importance, such as statements from candidates during an election, and for journalism from recognised news organisations. This will protect news organisations that take legal responsibility for the content they produce and are part of an established and independent complaints-handling organisation. Content in these categories would not fall under the regulatory responsibilities the Online Safety Bill places on platforms. The Bill will also require service providers to give due regard to freedom of speech when moderating content. This could allow redress for people who believe that information they have shared on social media has been unfairly removed.
Some people claim that this Bill will turn Big Tech companies into mass surveillance platforms. This ignores the fact that they already carry out such surveillance, gathering data about everything people do on their services, along with a good deal of device-level data about what else their users do online. This surveillance exists to make money: it tells the companies what to recommend next to their customers to keep them online for longer. More engagement means more advertising, which in turn means greater revenues.
I believe the tools they have created could also be used to help keep people safe online, by proactively identifying harmful content and then not recommending it to users. Others have claimed that this Bill will give even more power to the bosses of Big Tech to regulate speech online. This is completely wrong: under the Online Safety Bill, those companies will have to work within rules created by parliament and enforced by an independent regulator.
Finally, there have been questions about whether it is possible to legislate at a national level for content shared on platforms based in other countries. The answer is yes, we can, and other countries have done so already. Germany has specific national laws requiring the removal of anti-Semitic and pro-Nazi content, and France has a law to combat known sources of disinformation during election periods. We already know that if you travel to other countries, some content is not available to view on video-sharing platforms. The Big Tech companies can and do change their services depending on where they are being accessed from, and they can do the same to comply with the Online Safety Bill. These are necessary reforms that we have waited and worked for, and they have now finally been presented to parliament.
Damian Collins MP is the chair of the Joint Select Committee on the Draft Online Safety Bill