The EU Approach to Safeguard Children’s Rights on Video‐Sharing Platforms: Jigsaw or Maze?


Abstract

Children are keen consumers of audiovisual media content. Video‐sharing platforms (VSPs), such as YouTube and TikTok, offer a wealth of child‐friendly or child‐appropriate content but also content which—depending on the age of the child—might be considered inappropriate or potentially harmful. Moreover, such VSPs often deploy algorithmic recommender systems to personalise the content that children are exposed to (e.g., through auto‐play features), leading to concerns about diversity of content or spirals of content related to, for instance, eating disorders or self‐harm. This article explores the responsibilities of VSPs with respect to children that are imposed by existing, recently adopted, and proposed EU legislation. Instruments that we investigate include the Audiovisual Media Services Directive, the General Data Protection Regulation, the Digital Services Act, and the proposal for an Artificial Intelligence Act. Based on a legal study of policy documents, legislation, and scholarship, this contribution investigates to what extent this legislative framework sets obligations for VSPs to safeguard children's rights and discusses how these obligations align across different legislative instruments.

Citation (APA)
Verdoodt, V., Lievens, E., & Chatzinikolaou, A. (2023). The EU Approach to Safeguard Children’s Rights on Video‐Sharing Platforms: Jigsaw or Maze? Media and Communication, 11(4), 151–163. https://doi.org/10.17645/mac.v11i4.7059
