EXECUTIVE SUMMARY
I. CONTEXT
A. Impetus for the study
(i) Demand for action
(ii) Standardization agenda
B. Wider background to the study
(i) Emphasis on the importance of international human rights frameworks
(ii) Growing demands on Internet platforms to play their part in tackling online hate speech
(iii) Recognition of the diversity of Internet platforms
(iv) Promotion of collaborative governance of online hate speech
(v) Trade-offs at the heart of the governance of online hate speech
C. What’s the point of Internet governance for online hate speech?
(i) Three levels of governance: The moderation level, the oversight level and the regulatory level
(ii) Outcome-oriented versus process-oriented governance tools: The NetzDG Act in Germany and the Avia Bill in France as case studies
(iii) Understanding the purpose or function of governance tools for online hate speech
D. On the prima facie need for pluralism, disaggregation and integration within the governance of online hate speech
(i) Public content areas versus closed groups
(ii) Different types of Internet platforms
(iii) Different kinds of content
(iv) Differential reputational damage to Internet platforms
(v) Differing amounts, types and degrees of harmfulness of hate speech
(vi) Pluralism of country contexts
(vii) Implications of diversity and pluralism for the governance of online hate speech
E. Definitional issues
F. Governance firewalls: Facebook’s Oversight Board as a case study
G. Aims, scope and methods of the study
II. FIRST LEVEL OF INTERNET GOVERNANCE: THE MODERATION LEVEL
A. Professionalised moderation
B. Distributed moderation
C. Pre-moderation by professional publishers of content
D. Facilitated user self-moderation
E. Auto-moderation
F. Content management
III. SECOND LEVEL OF INTERNET GOVERNANCE: THE OVERSIGHT LEVEL
A. Public consultation on content policies and moderation guidelines, processes and procedures
B. Internal appeals processes
C. General recommendations from an independent supervisory council, steering committee or oversight board
D. Referrals of grey area or difficult cases to an independent supervisory council, steering committee or oversight board
E. Fully independent dispute resolution procedure or mediation process
F. User rating system
IV. THIRD LEVEL OF INTERNET GOVERNANCE: THE REGULATORY LEVEL
A. Legal compliance
B. Voluntary code of practice
C. Legal responsibility to remove unlawful hate speech enforced with fines
(i) Exemptions from legal responsibility in the case of journalistic content
(ii) Exemptions from legal responsibility for Internet platforms that refer grey area cases to competent independent institutions and abide by their decisions
(iii) Exemptions from liability for Internet platforms granted “responsible platform” status
(iv) Leniency programmes that give Internet platforms reductions in fines if they fully cooperate with governmental authorities
D. Legal responsibility not to over-remove lawful hate speech enforced with fines
E. Statutory duty of care and/or code of practice
F. Special public prosecutor
G. Bespoke criminal offences
H. Reform of sentencing guidelines
I. Special police unit
V. ON THE BENEFITS AND CHALLENGES OF COLLABORATIVE APPROACHES TO THE GOVERNANCE OF ONLINE HATE SPEECH
A. Potential benefits of collaboration in the governance of online hate speech
B. Potential disbenefits of collaboration in the governance of online hate speech
(i) Challenges in collaboration between Internet platforms and trusted flaggers and monitoring bodies
(ii) Challenges in collaboration between Internet platforms and independent supervisory councils, steering committees or oversight boards
(iii) Challenges in collaboration between Internet platforms and the police and public prosecutors
VI. ON THE MOTIVATIONS, GOALS, VALUES AND EXPECTATIONS OF GOVERNMENTS, INTERNET PLATFORMS, CIVIL SOCIETY ORGANISATIONS AND THE GENERAL PUBLIC CONCERNING THE GOVERNANCE OF ONLINE HATE SPEECH
A. Governmental agencies
B. Internet platforms
C. Trusted flaggers and monitoring bodies
D. The general public
(i) UK public opinion poll results (YouGov, sample of 1,633 UK adults, November 2019)
(ii) France public opinion poll results (YouGov, sample of 1,008 French adults, November 2019)
(iii) Germany public opinion poll results (YouGov, sample of 2,055 German adults, November 2019)
(iv) Analysis
VII. A VICTIM-SENSITIVE APPROACH TO GIVING REDRESS TO TARGETS OF ONLINE HATE SPEECH
A. The need for a victim-sensitive approach
B. Some important qualifications about a victim-sensitive approach
C. Guidelines for achieving a victim-sensitive approach
(i) Victim-sensitivity at the moderation level
(ii) Victim-sensitivity at the oversight level
(iii) Victim-sensitivity at the regulatory level
VIII. MEASURES OR INDICATORS AGAINST WHICH THE SUCCESS OR PROGRESS OF DIFFERENT INTERNET GOVERNANCE TOOLS FOR ONLINE HATE SPEECH CAN BE ASSESSED
A. List of indicators
B. Some important qualifications about the indicators
C. Selecting the right indicators
IX. CONCLUSIONS AND RECOMMENDATIONS
1. Standardization agenda
2. Grey area cases
3. Public opinion
4. Collaboration
5. Mitigating the incentive to over-remove hate speech content
6. Monitoring voluntary codes of conduct
7. A victim-sensitive approach
8. Proactive use of text extraction and machine learning tools or algorithms
9. Indicators of success in the governance of online hate speech
10. Equitable sharing in the governance of online hate speech
LIST OF ORGANISATIONS WHO PARTICIPATED, WERE CONSULTED OR WERE ENGAGED DURING THE STUDY
REFERENCES