
Burnaby's SFU gets $6.2M to study misinformation, foster more equitable futures

Director of the Digital Democracies Institute seeks to research new ways of understanding and evaluating algorithms that aren't tied to social media companies' private analysis.
SFU's Digital Democracies Institute has received $6.2 million from the Mellon Foundation to study mis- and disinformation, as well as the effects of algorithms and machine learning on our culture.

SFU has received $6.22 million from the U.S.-based Mellon Foundation to study the effects of mis- and disinformation.

Left unchecked, algorithms that create online echo chambers can undermine the dialogue necessary for a robust democracy, according to the director of SFU’s Digital Democracies Institute, Wendy Hui Kyong Chun.

“Part of the problem we’re in right now is that a lot of the analysis that’s being done on social media is being done by social media companies behind closed doors,” Chun said. “And what they’re doing is basically surveillance.”

“We don’t believe it’s a good model, and we think there are other ways of understanding social media and its impact.”

The research will also consider new ways to evaluate programs using algorithms and automated learning.

For example, some American court systems use a program called COMPAS, which estimates the risk of a defendant re-offending. The algorithm’s output can influence sentencing.

But the program has been shown to be biased against racial minorities, even though race isn’t used as an explicit factor.

The program doesn’t end up predicting the future – it predicts the past, according to Chun.

“What this means is that if these algorithms become dominant in our lives, we’ll be caught in this pattern of repeating the past and repeating our past mistakes, rather than learning from the past.”
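The pattern Chun describes can be illustrated with a toy sketch. COMPAS itself is proprietary, so the data and scoring rule below are entirely hypothetical: a risk score derived from historical records ends up encoding group disparities through a proxy feature such as neighbourhood, even though group membership is never an input.

```python
# Hypothetical historical records: (neighbourhood, re_offended).
# Suppose past over-policing of neighbourhood "A" produced more recorded
# re-offences there, regardless of underlying behaviour.
history = [
    ("A", True), ("A", True), ("A", True), ("A", False),
    ("B", True), ("B", False), ("B", False), ("B", False),
]

def risk_score(neighbourhood, records):
    """Predicted risk = the observed past re-offence rate for the neighbourhood."""
    outcomes = [re_off for hood, re_off in records if hood == neighbourhood]
    return sum(outcomes) / len(outcomes)

# The "prediction" simply replays the recorded past:
print(risk_score("A", history))  # 0.75
print(risk_score("B", history))  # 0.25
```

Because the score is just the historical rate, anyone from neighbourhood "A" inherits its recorded past, which is the sense in which such a system predicts the past rather than the future.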

Chun explained that misinformation is information mistakenly shared, like a parody that’s believed to be real. Disinformation, on the other hand, is deliberately incorrect and spread to foster distrust or chaos.

But there’s hope for algorithms to shape the future positively.

SFU, along with partner institutions around the world, is researching how to move beyond data literacy towards developing “data fluencies” which combine arts and humanities with data science.

Project member and SFU assistant professor Gillian Russell will lead free public night schools in Vancouver, bringing community members together with artists and technologists to discuss the ways algorithms and data operate in today’s world – and come up with different ways of engaging with those technologies.

The SFU funding announcement comes a week after NDP House leader Peter Julian, MP for New Westminster-Burnaby, tabled a bill to address online discrimination, misinformation and hate harming Canadians.

The bill seeks to make public the algorithms of “profitable web giants.”

“Experts have observed how algorithms can discriminate against communities by targeting regions for job postings and other ads. It is wrong to rob people of opportunities based on where they live and who they are. We must ensure online platforms and communication service providers are transparent about their algorithms,” Julian said in a statement.

Chun was a commissioner for the Canadian Commission on Democratic Expression, which operates on the principle that “the notion that platforms are neutral disseminators of information is faulty” and that platform companies can find themselves caught between private interest and public good.

The commission recommended the creation of a new regulatory body independent from the government, media and private platforms to ensure transparency and accountability.

For Chun and the Digital Democracies Institute, bringing disciplines and sectors together is critical to tackling the issues of algorithms and online information, but technology is not the sole solution.

“We don’t believe technology can solve our problems for us. I think we’re in the situation we are in now because we’ve had this blind belief in technology solving our problems, and, inevitably, they’ve made it worse,” Chun said.

“We believe that in order to think about this, we need to bring everybody together and rethink what kind of future world we want.”