GUIDANCE NOTE
INTRODUCTION
GENERAL CONSIDERATIONS FOR GOOD POLICYMAKING
DEFINING THE PROBLEM
ENSURING PREDICTABILITY
OVER-COMPLIANCE AND DISCRIMINATION
AFFECTED RIGHTS
NATURE OF THE REGULATORY APPROACH
CHARACTERISTICS OF SUCCESSFUL AND FAILED APPROACHES
TRANSPARENCY
EXPLANATORY MEMORANDUM
KEY CONCEPTS
I. INTRODUCTION
UNDERSTANDING THE PROBLEMS
WHOSE RESPONSIBILITY?
“SELF-REGULATION” IS UNDERSTOOD DIFFERENTLY IN DIFFERENT CONTEXTS
SUCCESS OR FAILURE?
II. PROBLEM STATEMENT
1. PREVALENCE OF UNWELCOME/ABUSIVE CONTENT/BEHAVIOUR
2. LACK OF STANDARDISATION OF CONTENT RESTRICTIONS
3. THE LINE BETWEEN PUBLIC AND PRIVATE
a) Unpredictability of content restrictions
b) Lack of clarity on balance of roles and responsibilities of states and private actors
c) Dangers of over-compliance, especially when leading to discriminatory outcomes
d) Focus on metrics restricted to speed and volume, driven by repeated “urgent” situations
e) Moderating risk
III. AFFECTED RIGHTS
1. FREEDOM OF EXPRESSION
2. RIGHT TO PRIVACY
3. FREEDOM OF ASSEMBLY AND ASSOCIATION
4. RIGHT TO REMEDY
a) For victims
b) For people whose content has been unjustly removed
c) Adequacy of the right to redress and remedy
IV. PURPOSES AND DRIVERS OF CONTENT MODERATION
1. CONTENT MODERATION AND BUSINESS INTERESTS
Problematic content exacerbated by business models
2. CONTENT MODERATION FOR PUBLIC POLICY REASONS
a) Content that is illegal everywhere, regardless of context
b) Illegal content that is part of a wider crime
c) Content that is not necessarily part of a wider offence
d) Content that is illegal primarily due to its context
e) Content that is illegal primarily due to its intent
f) Content that is potentially harmful but not necessarily illegal
g) Content that raises political concerns
3. CONCLUSION
V. STRUCTURES FOR CONTENT MODERATION
1. SELF-REGULATION
2. CO-REGULATION
3. COMMON CHARACTERISTICS OF SUCCESSFUL APPROACHES
VI. TRANSPARENCY
WHY TRANSPARENCY IS ESSENTIAL
To ensure restrictions are necessary and proportionate
To ensure non-discrimination
To ensure accountability of stakeholders (such as states)
Identification of transparency data
Recognising the positive and negative incentives created by transparency metrics
Trusted flaggers
VII. KEY PRINCIPLES FOR A HUMAN RIGHTS-BASED APPROACH TO CONTENT MODERATION
1. Transparency
2. Human rights by default
3. Problem identification and targets
4. Meaningful decentralisation
5. Communication with the user
a) Clarity and accessibility of terms of service
b) Clarity on communication with users
6. High-level administrative safeguards
a) Clear legal and operational framework
b) Supervision to ensure human rights compliance
c) Evaluation and mitigation of “gaming” of complaints mechanisms
d) Ensuring consistency and independence of review mechanisms
e) Recognising the human challenges of human content moderation
f) Ensuring protection of privacy and data protection
g) Victim redress
7. Addressing the peculiarities of self- and co-regulation in relation to content moderation