Westminster eForum

Protecting children online: content regulation, age verification and latest thinking on industry responsibility

Morning, Tuesday, 10th December 2019

Central London

THIS EVENT IS CPD CERTIFIED



This timely conference will consider what more is required from policy, regulation and industry practice to protect children online.


It will be an opportunity to discuss the key issues affecting the protection of children arising from the Online Harms White Paper, which set out the Government’s plans to improve consumer safety online.


Delegates will assess the practicalities of implementation and the next steps for delivery.


As the ICO puts its Age Appropriate Design Code into effect, further sessions will examine the key priorities going forward.


We expect discussion to reflect the focus on industry responsibility in the Online Harms White Paper - including the proposed legal duty of care for online companies towards their users, which raises questions about what duties and liability should fall on social media platforms and tech companies.


It follows the Science and Technology Committee’s report on the mental health impacts of social media - which found that social media exacerbated online harms including negative body image and cyberbullying.


As the Government looks to make tech companies more accountable, delegates will have the opportunity to hear directly from industry on what more can be done to protect children online.


With the recently announced Executive Board for the UK Council for Internet Safety bringing together tech organisations, civil society and the public sector to facilitate a coordinated and collaborative approach to increasing consumer safety, we expect discussion on the extent to which this approach can bring meaningful change.


The agenda will bring out the latest thinking on the future of online regulation with respect to children.


It takes place:


  • as the Government prepares to introduce a new regulatory framework to combat online harms, and
  • in the context of the ongoing consultation on the future regulatory design for video-sharing platforms - as required to meet obligations set out in the Audiovisual Media Services Directive to protect children from harmful content.

We expect discussion around proportionality in a new regulatory framework, taking into account the varying resources available to different companies.


It comes with Ofcom’s appointment as interim regulator for online video-sharing platforms, with powers to impose fines of up to 5% of a platform’s revenue for allowing access to extremist or harmful content.


We expect discussion on how the policy aim of greater transparency around the presence of harmful content on platforms can be achieved.


It takes place amid plans for publicly available annual transparency reports from online companies, detailing the presence of harmful content and what they are doing to address it.


Delegates will discuss the effectiveness of current public and private sector initiatives aimed at encouraging children to think more critically online.


They will also consider areas for improvement - with the White Paper announcing plans to develop a new online media literacy strategy.


One year on from the Data Protection Act, further sessions will consider the progress made towards protecting children’s privacy online and what further support might be needed.


The conference also takes place as the ICO’s Age Appropriate Design Code comes into effect - developed to ensure that online products and services that could be accessed by children incorporate data protection safeguards into their design.


Discussion will reflect the practicalities of age verification technology - including protections on the collection and use of data, the effects on the business models of online companies and services, and concerns about a disproportionate impact on start-ups that could affect their ability to compete in digital markets.


We also expect discussion on how to define what is harmful, particularly where content is not illegal, on questions of enforceability and where responsibility should lie, and on concerns that age verification technology could restrict access to information and stifle freedom of expression.



Keynote Speakers

Daniel Dyball

Executive Director, UK, Internet Association

Becky Foreman

UK Corporate Affairs Director, Microsoft and Industry Vice-Chair, Internet Watch Foundation

Jonathan Bamford

Director of Strategic Policy, Domestic, Information Commissioner’s Office

Senior speaker confirmed from DCMS

Chairs

Danielle Rowley MP

Vice-Chair, All-Party Parliamentary Group on Social Media

Ann Coffey MP

Spokesperson for Children and Education, The Independent Group for Change

Speakers

Lenard Koschwitz

Senior Director, Global Policy, Allied for Startups

Claire Levens

Policy Director, Internet Matters

Senior speaker confirmed from UNICEF UK

Senior speaker confirmed from Ofcom

Paul Herbert

Partner, Goodman Derrick

Alex Towers

Director of Policy and Public Affairs, BT and Board member, techUK

Vicki Shotbolt

Chief Executive Officer and Founder, Parent Zone and Member, Executive Board, UK Council for Child Internet Safety

Professor Lorna Woods

Professor, School of Law, University of Essex