Net Benefit? The Online Safety Bill receives Royal Assent

Jonathan Barnes KC and Samuel Rowe review the Online Safety Act 2023

On 26 October 2023, the Online Safety Bill received Royal Assent and became the Online Safety Act 2023. It has been a long time coming, with origins stretching back to the then Digital Secretary, Matt Hancock MP, and his bold ambition to “make sure the UK is the safest place to be online”, an intention that had been carried over from the Government’s 2017 Internet Safety Green Paper.

Staying true to those origins, the Act is an ambitious piece of legislation. It implements the Law Commission’s recommendations on reform of the communications offences. It also requires regulated service providers to take steps to prevent harm being caused by online content. Ofcom – the Act’s enforcement authority – has announced its intention to publish three tranches of codes and guidance, covering: (i) illegal harms duties; (ii) child safety duties and pornography; and (iii) transparency, user empowerment, and other duties on categorised platforms. On Ofcom’s published timeline, it may be late 2024 before the relevant codes start to come into force.

However, the first of the draft codes will be open for consultation from 9 November 2023, meaning that regulated service providers – or at least those most likely to fall under Ofcom’s gaze – can shortly consider their response and begin to shape their services to fit the requirements of the Act.

Once Ofcom’s codes and guidance are up and running, the practical effects of the Act’s obligations, and any corollary impact on speech online, should start to become apparent. This latter area generated significant controversy throughout the legislation’s passage, particularly as to how freedom of expression can appropriately be protected in a preventative or prohibitory regime which may, in essence, incentivise “suppression”. That debate perhaps still has some way to run.

Online safety and freedom of expression

The introductory thrust of the Act – in section 1 – states its general purpose: to make the use of the internet safer for individuals in the UK. To achieve that purpose, the Act imposes statutory duties of care on regulated service providers to identify, mitigate and manage risks of harm from illegal content and activity, and from content and activity that is harmful to children. However, the Act also places various obligations – as further statutory duties – on service providers to have regard to the right of freedom of expression. Section 22(3) requires all regulated services to pay “particular regard to the importance of protecting users’ right to freedom of expression within the law” when deciding on, and implementing, safety measures and policies. This obligation is open-textured, but that may be unavoidable given the breadth of services caught by the Act. Nonetheless, it may prove a difficult area to police, and service providers may be tempted to pay no more than lip service to the obligation if it can effectively be reduced to box-ticking.

Category 1 services will additionally be subject to duties to protect news publisher content under section 18, journalistic content under section 19 and “content of democratic importance” under section 17. The latter is content that contributes, or “appears to be specifically intended to” contribute, to domestic democratic political debate. It will presumably be for service providers to determine what criteria and methods they use to assess whether content was posted with the necessary intent.

Enforcement

Ofcom may impose very substantial financial penalties on regulated service providers for breach of the Act’s new duties – up to £18 million or 10% of a company’s relevant global turnover, whichever is greater. Yet those headline-grabbing numbers seem much more likely to arise in practice when the primary purpose of the Act is engaged: ensuring users encounter less illegal and harmful content online. On that view, it may be natural for service providers to be far more concerned about breaching the safety duties than about paying “particular regard to the…right to freedom of expression”, or about whether content is news publisher content, journalistic content or content of democratic importance.

Accordingly, while Ofcom’s codes and guidance are developed, there must remain a material danger that the balance between freedom of expression and protecting users by controlling content published online will tilt towards the latter. Some might argue, however, that the current prevalence of illegal and harmful material online justifies such an intervention, to ensure that internet platforms and service providers adopt a more protective approach to user safety, notwithstanding users’ rights of free expression.

User rights

Whilst the Act imposes obligations on regulated service providers to comply with the duties concerning freedom of expression, it may be argued that the Act provides no sufficient mechanism for individuals to vindicate their rights of freedom of expression directly against the platforms. The effect of sections 21(2) and 21(4) is that all regulated entities must operate a complaints procedure, which includes addressing complaints asserting that the provider is not complying with its duty to pay regard to the right of freedom of expression, including where content has been taken down in certain circumstances. Section 21(6) extends the complaints duty to the additional obligations placed on Category 1 services, such as protecting journalistic content. Sections 15 and 16 also oblige Category 1 services, where proportionate, to ‘empower’ adult users so that they can control their access to content that might otherwise be restricted by the operation of the Act (such as material promoting suicide or racially abusive content).

However, it is realistically only upon systemic failures of such duties that Ofcom is likely to take enforcement action, as opposed to acting on a specific asserted breach of, for example, an individual user’s right to freedom of expression. That individual, though, has no right to bring an action – for example, a civil or regulatory claim – directly against a regulated service provider for breach of the Act.

Conclusion

This significant new legislative regime is now entering a bedding-in period as Ofcom develops its guidance and codes. The legislation has drawn a line under the previous lobbying and consultation phases, but difficult balances remain to be struck in practice between, on the one hand, the laudable principal purposes of the new Act and, on the other, the proper and full enjoyment of rights of free expression by all online users.