On Tuesday 19 September 2023, the House of Lords approved the Online Safety Bill (the OSB).  This marks the end of the OSB’s passage through the legislative process, and after six years of heated debate it is now set to become law.

The OSB applies to all platforms available to UK users and imposes exacting legal obligations on online services, particularly to prevent content that is harmful to children, or illegal, from being made available on their platforms.  Platforms will also be required to prevent use by underage children (typically under 13), to employ technology to enforce these age limits, and to provide adults with more control over the content they see.

The OSB applies to all platforms offering “user-to-user services” or “search services” with “links” to the UK, regardless of where the platform is based.  The platforms categorised as “user-to-user” and “search” services include obvious examples such as social media, messaging apps and search engines, but also less obvious services that allow searching across more than one website or database (such as online travel services).  Online maps, review sites and collaborative knowledge sites such as Wikipedia also fall within scope of the OSB, as they facilitate commenting and user collaboration.

A “link” is established either because a platform (i) has a significant number of UK users, or UK users form part of its target market; or (ii) can be used by individuals in the UK and there are reasonable grounds to believe that there is a material risk of significant harm to those individuals.

Under the OSB, services are classified into Category 1 (high-risk user-to-user services), Category 2A (lower-risk search services) and Category 2B (lower-risk user-to-user services).  The detailed criteria determining which classification a service falls into will be set out in secondary legislation after the OSB has received Royal Assent.  The regime is ultimately expected to apply to around 25,000 companies.

Implications for services caught by the OSB

Category 1 services will be under stringent obligations to police harmful content hosted on their platforms and will be required to:

  • Prevent already illegal content from appearing, or delete it quickly if it does appear;
  • Prevent content newly designated as illegal by the OSB from appearing (for example, content promoting self-harm), or delete it quickly if it does appear;
  • Prevent children from accessing harmful or age-inappropriate content (even if not illegal);
  • Enforce age limits for the use of platforms;
  • Employ effective age checking measures, and disclose what age assurance technologies they are employing, if any;
  • Ensure that risks and dangers posed to children on large social media platforms are made more transparent (including by publishing risk assessments);
  • Provide clear and accessible mechanisms for parents and children to report problems online;
  • Allow users greater control over the types of content they see, including offering tools for adult users to filter out certain harmful categories of content (children will be protected against these categories by default).

It will not be sufficient to simply remove illegal or harmful content after the fact. Guidance notes that “[p]latforms will need to think about how they design their sites to reduce the likelihood of them being used for criminal activity in the first place.”

Ofcom, the UK’s communications regulator, will be in charge of enforcing these new rules, and will have the power to fine companies in breach of the rules up to the greater of £18m or 10% of annual global turnover. Senior managers of relevant companies will also be liable for criminal sanctions in the event of failure to comply with information requests or enforcement notices in relation to certain child safety duties. In the most extreme cases, Ofcom will be empowered to seek the courts’ permission to prevent payment providers, advertisers and internet service providers from working with a persistently non-compliant company.

What will happen next?

Phased implementation. Ofcom has indicated that, once the OSB has received Royal Assent, it will publish consultations in three planned phases.  The first phase will focus on guidance on mitigating the risk of illegal harm, the second on child safety duties and the third on the other duties of categorised platforms. The first parts of the law are likely to be in force towards the end of 2024 or in early 2025, once commencement orders have been issued.

Ofcom’s development of enforcement strategies. Ofcom is working with the Department for Science, Innovation and Technology to develop an enforcement strategy for the new rules. Ofcom also plans to publish codes of practice and compliance guidance.

The true impact of the OSB remains to be seen, as Ofcom now faces the task of providing guidance that balances the competing interests of privacy / freedom of speech and protection against harmful content.

First steps by online services caught by the OSB. In-scope companies have been urged not to wait for the legislation to come into force, but to begin improving safety and preparing for regulation now. The first step for affected businesses will be to conduct a risk assessment to determine whether their online service is likely to be caught by the OSB. If so, they will need to consider introducing screening, filtering and reporting mechanisms, along with any necessary amendments to their terms of service.