SAMbot

Online abuse is a barrier to civic engagement and active participation in our democracy. It can disrupt political conversations and even prevent people from entering politics. Our SAMbot project measures abusive content that candidates and political parties receive online during Canadian elections. We do this to illuminate the realities of abuse on the digital campaign trail and the barriers to civic engagement created by technology’s influence on our democratic culture. 

What is SAMbot?

SAMbot is a machine learning bot that detects and tracks abusive sentiment. During Canadian elections, we use SAMbot to collect data and generate insights about the online abuse received by candidates and political parties.

As political discourse is generally at its most abusive during campaigns, SAMbot helps us gain critical insight into the current state of online Canadian political conversations.

SAMbot lets us examine online discussions at a massive scale. To date, we have used SAMbot to analyze millions of comments in federal, provincial, and municipal elections across Canada. Insights from this work have received national media coverage and have been presented to audiences across governmental, policy, technology, human rights, and academic spaces.

Why do we need SAMbot?

While it is commonly believed that toxic online spaces are harming our democracy, we do not have sufficient data to shed light on this problem. Our SAMbot findings provide insight into how online conversations are affecting political participation in Canada, as well as working conditions on the digital campaign trail. This data can be used to inform effective policy responses to address online harms.

How does SAMbot work?

SAMbot is a machine learning bot: a software application that runs automated tasks using natural language processing, a branch of machine learning. SAMbot monitors all English and French tweets sent to candidates.
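
The page does not describe how the tweets are gathered, so the sketch below is only one plausible setup: streaming replies to a tracked candidate account through the Twitter/X API v2 filtered stream via the tweepy library. The bearer token, the example_candidate handle, and the stream rule are illustrative assumptions rather than SAMbot's actual configuration, and this API tier now requires paid access.

```python
# A rough collection sketch (assumed approach, not SAMbot's documented pipeline):
# stream tweets sent in reply to a tracked candidate account, in English or French.
import tweepy

BEARER_TOKEN = "YOUR_TWITTER_API_BEARER_TOKEN"  # placeholder credential
collected = []  # raw tweets gathered here for later toxicity scoring


class CandidateReplyStream(tweepy.StreamingClient):
    """Collects replies to a candidate via the Twitter/X API v2 filtered stream."""

    def on_tweet(self, tweet):
        collected.append({"id": tweet.id, "text": tweet.text})


stream = CandidateReplyStream(BEARER_TOKEN)
# One rule per tracked account: `to:` matches replies, `lang:` restricts language.
stream.add_rules(tweepy.StreamRule("to:example_candidate (lang:en OR lang:fr)"))
stream.filter()  # blocks and calls on_tweet for each matching tweet
```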

The models that SAMbot uses to evaluate language are trained and tested on millions of data points so that they can identify content considered toxic, harmful, or insulting. Each time SAMbot is deployed in an election, we improve and iterate on the machine learning models it uses, which increases the accuracy of our results.
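
For a concrete sense of what such a model does, the sketch below scores sample comments with the open-source Detoxify library's multilingual checkpoint, which returns per-category scores (toxicity, insult, threat, and so on) for English and French text. Detoxify here is an assumption for illustration; the page does not name the models SAMbot actually uses.

```python
# Illustrative toxicity scoring with the open-source Detoxify multilingual model
# (an assumed stand-in; the models SAMbot actually uses are not specified here).
from detoxify import Detoxify

model = Detoxify("multilingual")  # handles English and French, among other languages

comments = [
    "Thanks for knocking on doors in our riding today!",
    "You are a disgrace and everyone knows it.",
]

for comment in comments:
    # predict() returns a score per category, e.g. toxicity, insult, threat.
    scores = model.predict(comment)
    print(comment, {label: round(score, 2) for label, score in scores.items()})
```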

When SAMbot evaluates a comment, it makes a confidence prediction based on how likely it is that someone would interpret the comment as abusive. Machine learning allows tools like SAMbot to operate at enormous scale. However, language is highly nuanced, and SAMbot can never replace human judgment. This is why analysis of SAMbot insights is guided by human beings – members of the Samara Centre team. In using AI for civic inquiry, it is important to underscore this human analysis because it counters common assumptions that artificial intelligence is impartial, autonomous, or free from bias. By taking this approach, we strive to demonstrate how AI-driven tools can contribute to civic inquiry in an ethical and productive manner.
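
To make that human-in-the-loop step concrete, here is a minimal sketch of how per-tweet confidence scores might be tallied, with borderline cases set aside for a human analyst. The thresholds, field names, and example scores are illustrative assumptions, not the Samara Centre's actual cut-offs or data.

```python
# Minimal triage sketch: confident predictions are tallied, borderline
# scores are queued for human review. Thresholds and data are assumptions.
from collections import Counter

ABUSIVE_THRESHOLD = 0.80  # assumed: model is confident the tweet reads as abusive
REVIEW_THRESHOLD = 0.50   # assumed: below this band, the tweet is not flagged

scored_tweets = [
    {"candidate": "@example_candidate", "toxicity": 0.93},
    {"candidate": "@example_candidate", "toxicity": 0.61},
    {"candidate": "@example_candidate", "toxicity": 0.07},
]

tallies = Counter()
needs_review = []

for tweet in scored_tweets:
    score = tweet["toxicity"]
    if score >= ABUSIVE_THRESHOLD:
        tallies["abusive"] += 1
    elif score >= REVIEW_THRESHOLD:
        needs_review.append(tweet)  # a human analyst makes the final call here
    else:
        tallies["not_flagged"] += 1

print(dict(tallies), f"{len(needs_review)} tweet(s) queued for human review")
```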

Media Mentions

The Power Abusers of Alberta's 2023 Election (CBC Radio Active)
As we prepare for a federal election, beware of looming 'infowars' (CFRA 580)
Online abuse prevalent during 2023 Alberta election (Vancouver Cooperative Radio)
“We could see actual Canadian policy being shaped by a small number of people that are misrepresenting the feelings of politicians’ constituents.” (St. Albert Gazette)
Despite the number of news stories it generated, the Kirkland Lake incident was a drop in the bucket of the widespread, complex and pervasive manipulation of public opinion that occurs daily, across social media. (The Globe and Mail)
In-authentic Engagement in Online Spaces (640 Toronto | Global News)
Is that online outrage you’re seeing really grassroots, or just ‘astroturfing’? (The Globe and Mail)
Online abuse and its effect on politics
Majority of Canadians Believe Online Hate and Abuse is a Problem
"Politics from beyond and within Canadian borders are now increasingly interwoven." (Yahoo News)
How Can Canadians Combat Online Hate? (TVO's The Agenda)
Can a public official block a constituent? (Columnists from CBC Radio)
Democracy XChange 2024: The Impact of Online Harassment on Democracy (CPAC Public Record)
Can Angry People be Good Citizens? (TVO Today Live)
How are you affected by political polarization? (Ontario Today)
"Social media platforms facilitate the spread of abusive content that has offline consequences, including widespread polarization, alienation and physical violence." (Policy Options)
