ARTICLE
10 August 2021

Government Publishes Report On Understanding How Platforms With Video-sharing Capabilities Protect Users From Harmful Content Online

Wiggin
Contributor
UK Media, Telecoms, IT, Entertainment

The Department for Digital, Culture, Media & Sport (DCMS) commissioned consultants EY to review the current landscape of online video-sharing platforms, including an analysis of the recent growth and innovation in the sector. This review was undertaken in the context of new regulation requiring platforms to take appropriate measures to protect their users from certain types of harmful content online.

To form their view, EY undertook two key areas of market research. First, they designed and administered three separate consumer surveys, focused on children, teenagers and adults, to understand how they use platforms with video-sharing capabilities and their awareness of online harms. Second, they interviewed seven, and surveyed 12, online platforms with video-sharing capabilities to better understand how sophisticated their protective measures are and the costs they incur in enforcing them.

The report finds that the measures platforms currently employ to protect their users from harmful content online vary, but can include acceptable use policies, community guidelines, age assurance, age verification, parental controls, user prompts, content moderation, mechanisms for users to flag violative content, and transparency reports disclosing a range of information about content reported to a platform's moderators.

EY's research suggests that rather than focusing on individual measures, risk assessments need to be carried out to ensure the suite of measures in place is in line with the specific risks on the platform. Further, the report finds, platforms that consider themselves likely to be accessed by children tend to report having more effective measures in place to protect their users from harmful content online.

All platforms EY spoke with explicitly stated that illegal content on their platforms is banned. Most platforms use industry-wide resources, such as content and image scanning software, to prevent the spread of child sexual abuse material.

To access the report, click here.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
