Article | Open Access
The EU Approach to Safeguard Children’s Rights on Video-Sharing Platforms: Jigsaw or Maze?
Abstract: Children are keen consumers of audiovisual media content. Video-sharing platforms (VSPs), such as YouTube and TikTok, offer a wealth of child-friendly or child-appropriate content, but also content that, depending on the age of the child, might be considered inappropriate or potentially harmful. Moreover, VSPs often deploy algorithmic recommender systems to personalise the content that children are exposed to (e.g., through auto-play features), raising concerns about diversity of content and about spirals of content related to, for instance, eating disorders or self-harm. This article explores the responsibilities towards children that existing, recently adopted, and proposed EU legislation imposes on VSPs. The instruments we investigate include the Audiovisual Media Services Directive, the General Data Protection Regulation, the Digital Services Act, and the proposal for an Artificial Intelligence Act. Based on a legal study of policy documents, legislation, and scholarship, this contribution investigates to what extent this legislative framework sets obligations for VSPs to safeguard children’s rights and discusses how these obligations align across the different legislative instruments.
Keywords: video-sharing platforms; audiovisual content; children’s rights; legislation
© Valerie Verdoodt, Eva Lievens, Argyro Chatzinikolaou. This is an open access article distributed under the terms of the Creative Commons Attribution 4.0 license (http://creativecommons.org/licenses/by/4.0), which permits any use, distribution, and reproduction of the work without further permission provided the original author(s) and source are credited.