A surge of online abuse, potential fake engagement, and inauthentic activity impacted Alberta’s 2023 provincial election, targeting both veteran politicians and first-time candidates across all parties.
The Samara Centre’s Astroturfing and Abuse: The SAMbot Alberta 2023 Provincial Election Report sheds light on the scale of this digital disruption and its impact on our democracy. Beyond exposing these harms, the report provides key recommendations for how Canadian institutions and digital platforms can better respond to this ever-changing online landscape.
Research Director Beatrice Wayne draws on the report’s findings and offers potential solutions.
You’ve identified the role digital platforms ought to be playing. What should X, Facebook, and platforms like these be doing differently?
Digital platforms should work with policymakers and researchers to tackle online abuse and misinformation by supporting the introduction of platform regulations. Canada should adopt a “design code”-style approach, where digital platforms are required to prioritize user safety and consumer protection. Regulators would then ensure these platforms follow the rules and hold them accountable when they don’t. Simple changes in how platforms operate can make a big difference in reducing online harm at scale.
The Samara Centre has been deploying SAMbot since 2021 and has monitored 12 different elections. But it isn’t easy to get the data you need from digital platforms. What needs to change?
We have to support legislation that safeguards researchers’ access to social media data. Without it, we can’t fully understand the extent of online harms and their effect on our democracy. At a time when social media research is being increasingly restricted (on platforms like X), it’s more important than ever to support legislation that ensures Canadian researchers can study how these platforms are shaping our society.
The Samara Centre’s research shows that it isn’t just veteran candidates who are targets online. What have you found?
First-time candidates are especially vulnerable to abuse: they have the least support and experience to handle online (and offline) attacks. If we want a truly representative democracy, we need to offer support to first-time candidates, who may become particular targets of abuse. Otherwise, we risk pushing people away from politics before they even get started.
One troubling finding is that LGBTQ+ communities are experiencing very high levels of online abuse on digital platforms. How do we prevent that?
We need to recognize anti-LGBTQ+ discourse as a democratic threat. In recent years, this kind of discourse has been on the rise in Canada, along with online gender-based violence. When women and gender-diverse people don’t feel safe, they’re less likely to engage in politics and public life. If we want a democracy that truly represents everyone, we have to tackle this issue head-on.
What about users? With so much discussion about foreign interference during elections and on these digital platforms, how is a person to know what’s authentic and what isn’t?
We need to raise public awareness about what online inauthentic behaviour and foreign interference could look like. This isn’t just a distant issue; it’s a real threat, especially for diasporic communities. With AI making it easier and cheaper to spread convincing disinformation, bad actors are working faster than ever. Raising awareness is key to spotting and stopping these threats before they undermine our democracy.
The problem of online abuse seems insurmountable. What gives you hope that things can change?
There’s a silver lining here: most online abuse in Canadian politics comes from a small group of users. That means the problem isn’t as overwhelming as it sometimes feels. With the right regulations and better digital literacy, we can make online spaces more productive, inclusive, and supportive of democracy. There’s real hope for better online civic conversations.
You can read more about the recommendations from our SAMbot Alberta report here.