We recommend the comprehensive study Social Media Platforms and Challenges for Democracy, Rule of Law and Fundamental Rights, commissioned by the European Parliament Committee on Civil Liberties, Justice and Home Affairs (LIBE Committee).
The study raises important issues, including:
- risks posed by online hate speech to fundamental rights and to equal participation in public debate,
- disinformation and its effects on public safety, fundamental rights, and public debate,
- risks to the capacity of news media to support pluralist political debate and promote democratic participation and accountability.
The study provides an overview of the legal framework for social media content governance:
- the legal framework for content moderation set out in the DSA (a conditional exemption of social media platforms from liability for user content, no general obligation to monitor for illegal content, notice-and-action procedures, a Good Samaritan clause, and due diligence requirements),
- various other rules on content moderation in specific areas: the DSM Directive, the Terrorist Content Regulation, the revised AVMS Directive, and the Codes of Practice on Disinformation and Hate Speech,
- the regulatory framework governing social media content recommendations and other aspects of platform design: the DSA, the proposed regulation on the transparency and targeting of political advertising, and the Code of Practice on Disinformation.
The Digital Services Act leaves many questions open, for example the extent of very large platforms' obligations to assess and mitigate systemic risks. The study makes recommendations on how national regulators and the Commission can effectively implement and build on the relevant provisions, so that platforms take effective measures against disinformation and hate speech while respecting users' fundamental rights.
We recommend reading the unabridged version of the study.