Morning, Tuesday, 11th November 2025
Online
This conference will consider the future for online safety policy and practice in the UK. Areas for discussion include priorities for regulation, enforcement, and platform responsibilities in light of evolving risks, ongoing implementation of the Online Safety Act 2023, and wider issues around compliance, governance, and public expectations.
It will bring together stakeholders and policymakers to discuss the next phase of implementation for the Act, including transparency requirements, categorisation of services, and priorities for Ofcom in its role as lead regulator. Attendees will assess what will be needed to support effective compliance, particularly for smaller platforms, alongside options for strengthening enforcement capacity and coordination with related legislation, such as the Data (Use and Access) Act 2025 and the Policing and Crime Bill.
Further planned sessions examine priorities for improving online protection for children and young people, including early indications of whether aims for proportionate age assurance are being met and how providers are balancing safety with privacy, following the coming into force in July 2025 of relevant duties under the Act, including risk assessment and age assurance requirements for services likely to be accessed by children.
Delegates will assess what Ofcom’s Children’s Codes of Practice and forthcoming enforcement may mean for platform design choices, content moderation systems, and approaches to risk assessment, particularly for services accessed by children. Discussion is also expected on how regulatory measures can be coordinated with safeguarding responsibilities in schools and other educational settings, and how statutory guidance and the duties of those settings may develop.
There will be a focus on longer-term regulatory challenges posed by AI-generated content, misinformation, and online harms affecting users of all ages. Areas for discussion include the scope for collaborative frameworks between regulators, cross-border coordination, and further integration of mental health considerations into future updates to the UK’s online safety framework. Attendees will consider strategies and best practice for stakeholder engagement, the roles of schools, parents, and community organisations, and the responsibilities of platforms, regulators, and government in responding to emerging threats and in supporting safe, accessible digital environments that reflect the needs of different user groups.
With the agenda currently in the drafting stage, overall areas for discussion include:
- implementation of the Online Safety Act: classification of regulated services - transparency and accountability obligations - expectations for smaller platforms
- evolving legal context: impact of the Data (Use and Access) Act and Policing and Crime Bill - overlap between regimes - organisational responsibilities and clarity around compliance requirements
- alignment across regulators: roles of Ofcom, the ICO and others - consistency across safety, privacy and data - coordination mechanisms
- platform responsibilities: scope of duties across service types - safety-by-design and risk assessment - implementation challenges
- new and emerging harms: response to AI-generated content and digitally facilitated abuse - gaps in current protections - future regulatory options
- protections for children and young people: age assurance requirements - safeguarding duties across sectors - use of evidence in setting thresholds
- role of education and schools: adequacy of current guidance - support for school leadership - relationship with national policy
- limits and access controls: rationale for restrictions on smartphones and social media - feasibility and risks - role of parents and public messaging