The UK’s Online Safety Act 2023 (OSA) is a comprehensive piece of legislation designed to regulate social media companies and search services and to increase protections for individuals online. It invites comparison with the EU’s Digital Services Act, with both laws including provisions relating to safety and transparency—seeking to balance the need to protect people online with fundamental rights such as the right to freedom of expression and privacy. Importantly, it applies not just to digital service providers in the UK but to any service with links to the UK.
One key area in which the OSA goes further is in relation to the obligations on service providers to undertake specific assessments in relation to their services. The first of these assessments (the illegal content risk assessment—discussed in more detail below) must be completed by all in-scope services by March 16, 2025.
For more information on the developing regulatory landscape relating to “online safety” in the UK, EU and U.S., please view our earlier webinar.
OSA—Who Is in Scope?
- Services in scope. The OSA applies to providers of user-to-user (U2U) services and search services:
- U2U services: internet services through which content that is generated, uploaded, or shared by other users may be encountered by other users of the service. This could cover social media platforms, photo or video sharing platforms, chat and instant messaging services, blogs, online games, online dating platforms, online marketplaces, etc., including where such services are available through a web browser or mobile application.
- Search services: internet services that consist of, or include, the functionality for users to search multiple websites or databases (including services through which a user could in principle search all websites or databases). This could cover conventional search engines, reverse image lookups, content aggregators that allow users to search through multiple databases, AI-powered search engines, etc.
Certain services are then subject to additional obligations, such as U2U or search services that publish pornographic content or that meet certain threshold conditions, based on the number of UK users and certain features of the service (“Categorised Services”).
- Content in scope. The OSA defines “content” as anything communicated by means of an internet service, whether publicly or privately, including written material or messages, oral communications, photographs, videos, visual images, music and data of any description. The OSA envisages several different categories of content:
- Illegal content: content that amounts to a relevant offense, i.e., a priority offense (such as offenses related to terrorism or child sexual exploitation or abuse (CSEA)) or another, non-priority, offense (such as cyberflashing, content that encourages or assists with self-harm, or threatening communications).
- Content that is harmful to children: content that may not amount to illegal content but that should be hidden from children. This is further grouped into primary priority content that is harmful to children (such as pornographic, suicide, self-harm or eating disorder content), priority content that is harmful to children (such as abuse, hate, bullying or violent content, or content which encourages the consumption of harmful substances or the undertaking of dangerous stunts or challenges), and non-designated harmful content which otherwise presents a material risk of significant harm to an appreciable number of children in the UK.
Categorised Services are also subject to obligations relating to additional content categories, such as fraudulent adverts, content of democratic importance, and content that adults may wish to avoid encountering on a service (e.g., abusive content or content that encourages suicide or self-harm).
- Territorial jurisdiction. Given the cross-border nature of the internet, the OSA’s territorial application extends beyond services based in the UK. The OSA applies to any service that “has links with the United Kingdom.” This test will be met where: (i) the service has a significant number of UK users; (ii) the UK forms one of the target markets for the service; or (iii) the service is capable of being used in the UK and there are reasonable grounds to believe there is a material risk of significant harm to individuals in the UK presented by content on the service.
What Are the Main Assessment Duties?
- Illegal Content Risk Assessment. The first assessment that must be undertaken is an illegal content risk assessment. This is designed to improve a service provider’s understanding of how risks of different kinds of illegal harms could arise on the service and what safety measures should be implemented to protect users. Illegal content risk assessments must be of a “suitable and sufficient” standard. Ofcom (the UK regulatory body with responsibility for the OSA) has published guidance on how to carry out an illegal content risk assessment, which sets out a four-step process: (i) understand the kinds of illegal content that need to be assessed; (ii) assess the risk of harm; (iii) decide on measures, implement and record; and (iv) report, review and update. All in-scope services must complete an illegal content risk assessment by March 16, 2025.
- Children’s Access Assessment. In-scope services must also undertake a children’s access assessment to understand to what extent the service is “likely to be accessed by children” (meaning anyone under the age of 18). Where the assessment determines that a service is likely to be accessed by children, this triggers additional child protection duties; those duties may not apply if the assessment determines that the service is not likely to be accessed by children. Importantly, if a children’s access assessment is not completed, the service will be subject to the additional child protection duties in any event. All in-scope services must complete a children’s access assessment by April 16, 2025.
- Children’s Risk Assessment. If a service is considered “likely to be accessed by children,” service providers must undertake a children’s risk assessment, which is separate from, and in addition to, the overarching illegal content risk assessment. The children’s risk assessment must assess the risks that exist specifically in relation to children on the service, taking into account any measures the service already has in place to protect children. Ofcom is currently consulting on guidance relating to the children’s risk assessment. Once this guidance has been finalized (expected in April 2025), service providers will have three months to complete the requisite assessment.
- User Empowerment Risk Assessment. Certain Categorised Services may also need to complete an adult user empowerment assessment, to understand their user base, the likelihood of adult users encountering specific content that they may wish to avoid, and the impact that the service (including its design and operation) may have on adults encountering such content. Ofcom will publish more details on this once it has finalized the thresholds that determine which services will be Categorised Services under the OSA.
Records must be maintained of all assessments carried out, including the process followed and the findings. Failure to complete an assessment where required under the OSA is an automatic breach, which could result in enforcement action and a civil penalty of up to 10% of qualifying worldwide revenue or £18 million (whichever is greater).
Additional Obligations
A core focus of the OSA is on “systems and processes”—rather than regulating individual pieces of content appearing on services, Ofcom will regulate the measures taken by service providers to mitigate the risks identified in their assessments, ensuring that those measures are proportionate.
Service providers will also be expected to have in place content reporting and complaints procedures, and clear and accessible terms of service which address specific areas identified in the OSA (e.g., specifying how individuals are to be protected from illegal content).
Next Steps
Companies operating U2U or search services should consider to what extent they may be subject to the OSA (e.g., by analyzing any links they may have to the UK). Where a service is in scope, the priority action item would be to complete the illegal content risk assessment by the March 16, 2025, deadline.