Moderating discriminatory user content is a significant challenge for many online communities, especially in today's climate of polarized ideologies and increased xenophobia. Timothy Quinn, technology lead at Hatebase, will discuss techniques Hatebase has developed to identify and analyze hate speech, particularly across multilingual and regionally diverse ecosystems.
This talk will focus on real-world approaches to implementing moderation workflows, as well as on balancing free speech against the legal and financial liabilities being imposed by governments. Although some technical material will be presented, anyone with an operational interest in content moderation will benefit from attending.
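As background for the kind of moderation workflow discussed here, a common first step is lexicon-based matching: scanning user content against a curated list of hateful terms with severity scores. The sketch below illustrates that idea only; the terms, scores, and function names are invented for illustration and are not Hatebase's actual data or API.

```python
import re

# Hypothetical lexicon mapping terms to severity scores (real systems such
# as Hatebase track far richer metadata, e.g. language, region, ambiguity).
LEXICON = {
    "badword": 0.9,
    "slur": 0.7,
}

def flag_content(text, threshold=0.5):
    """Return (should_flag, matched_terms) for a piece of user content."""
    matches = []
    lowered = text.lower()
    for term, severity in LEXICON.items():
        # Word-boundary match so "slur" does not fire inside "slurp".
        if re.search(rf"\b{re.escape(term)}\b", lowered):
            matches.append((term, severity))
    score = max((s for _, s in matches), default=0.0)
    return score >= threshold, matches

flagged, hits = flag_content("this comment contains a slur")
```

In practice a lexicon pass like this is only a pre-filter: ambiguous terms need context-aware NLP or human review, which is why multilingual and regional nuance matters so much.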
Hate Speech as a Service
Timothy Quinn helps run Hatebase, a Toronto-based organization that uses natural language processing and predictive analysis to help companies rid their ecosystems of hate speech, while also helping government agencies, law enforcement and NGOs perform threat assessments in conflict areas.