Online service providers have been given a “valuable reminder” that content generated by AI falls within the scope of the UK’s Online Safety Act in the same way as content created by human users, an expert in technology regulation has said.
Gemma Erskine of Pinsent Masons was commenting after Ofcom issued an open letter to UK online service providers regarding generative AI (gen-AI) and chatbots.
Ofcom is the regulator responsible for the Online Safety Act 2023 (OSA). The UK government recently proposed strategic priorities for Ofcom to adopt in that role, focused on five core themes: safety by design, transparency and accountability, agile regulation, inclusivity and resilience, and technology and innovation. The proposed agile regulation priority would, the government said, encompass “ensuring the framework is robust in monitoring and tackling emerging harms – such as AI generated content”.
Erskine said Ofcom’s letter emphasises that it will not treat content created by AI tools and chatbots differently from content created by human users for the purposes of the Act: if the content is shared through a user-to-user service, if the tool generates pornographic material, or if the AI tool provides search functionality, it will be treated the same as any other in-scope content or service.
Erskine said: “Given the rapid advancement of AI and gen-AI, this open letter serves as a valuable reminder to platforms about the extensive scope of the OSA. The OSA imposes duties on services to limit harms to users through available content. Ofcom’s letter emphasises that content generated by AI tools will be treated identically to content created by human users under the Act.”
“Content generated by AI tools may pose significant risks to in-scope services because of the risk of hallucinations or the tool generating inaccurate or illegal content. Whilst some services are taking steps to signpost to users that particular content has been created by an AI tool, this will not be sufficient to meet the service’s duties under the OSA if the underlying content may cause harm to users,” she said.
“Ofcom’s letter outlines the key steps that services need to take to comply with the OSA, in preparation for the final publication of the illegal harm risk assessment guidance and codes of practice, which is expected in December. It concludes by reminding services that the duties under the OSA are mandatory. Ofcom has warned that it is prepared to take enforcement action, including issuing fines, to ensure compliance,” Erskine added.
In relation to user-to-user services, the OSA imposes duties on in-scope services that have the functionality to permit users to share user-generated content, which can include images, videos, messages, or other information.
Ofcom’s letter clarifies that services allowing users to share text, images, or videos generated by a chatbot with other users will be classified as user-to-user services under the Act. This includes services, such as certain social media platforms, that allow groups to engage with a chatbot in a group setting or thread.
Ofcom also explained that any platform that enables users to create their own gen-AI chatbots and make them available to other users will fall within the scope of the OSA. Any text, images, or videos created by these chatbots will be considered “user-generated content” and regulated by the OSA.
Ofcom underscored that AI-generated content that is then shared by a user will be treated the same as content the user had created on their own: information created by AI is user-generated content, and it does not matter whether the material was created on the platform where it was shared or created elsewhere and uploaded to that platform.
In relation to search services, the letter clarifies that gen-AI tools enabling the search of multiple websites and databases are considered “search services” under the Act. These tools would be subject to the same duties as other search services.
Ofcom further indicated that this includes tools which modify, augment, or facilitate the delivery of search results on existing platforms. It also said services that include gen-AI tools capable of generating pornographic material are regulated under the Act.
Ofcom said all in-scope services should promptly comply with their duties and obligations under the OSA.
“For providers of user-to-user services and search services, this means, among other requirements, undertaking risk assessments to understand the risk of users encountering harmful content; implementing proportionate measures to mitigate and manage those risks; and enabling users to easily report illegal posts and material that is harmful to children,” the regulator said.
The first duties under the OSA, pertaining to illegal harms, will commence in December. Guidance on conducting an illegal harms risk assessment, as well as codes of practice that service providers can adopt, is due to be published by Ofcom at that time.
Erskine said the draft codes of practice that Ofcom consulted on outline several measures that services providing gen-AI within the scope of the OSA could implement now. These include: having a named person accountable for compliance with the OSA; having a content moderation function that allows for the swift takedown of illegal posts where identified and that protects children from material that is harmful to them; ensuring that the content moderation function is adequately resourced and well trained; using highly effective age assurance to prevent children from encountering the most harmful types of content, where such content is allowed on the platform; and having easy-to-access and easy-to-use reporting and complaints processes.
Under the OSA, services may opt to use measures other than those specified in the codes. However, if they choose to do so, they must explain how their chosen approach fulfils the duties outlined in the Act. Services that follow the recommendations in Ofcom’s codes will be considered compliant with the relevant duties in the Act.