First Nations, Jim Balsillie slam government over lack of consultation on AI bill

OTTAWA — The Assembly of First Nations is warning it could take the Liberal government to court over its proposed privacy and artificial intelligence bill.
Canadian businessman and former Research In Motion co-CEO and chair Jim Balsillie is shown during an interview in Toronto, Monday, April 17, 2023. Balsillie is among those slamming the federal government for a lack of consultation on its proposed bill to regulate artificial intelligence. THE CANADIAN PRESS/Cole Burston

And former tech executive Jim Balsillie told MPs studying the bill that he considers the legislation "anti-democratic."

The government has already been criticized for failing to consult widely and early enough on Bill C-27, which updates privacy laws and introduces the Artificial Intelligence and Data Act.

Balsillie, the former co-CEO of BlackBerry pioneer Research In Motion, said Wednesday the government did no public consultations and relied too heavily on feedback from industry rather than civil society.

Indigenous leaders said First Nations weren't consulted at all.

"As a result, the minister did not hear First Nations, does not understand First Nations, and it shows in the legislation," the Assembly of First Nations said in a brief submitted to the House of Commons industry committee.

It said the bill infringes on the rights of First Nations, including on data sovereignty, and that litigation is "likely" if the government doesn’t meet its obligations.

Liberal MP Ryan Turnbull defended the government during the meeting. He said more than 300 "meetings and consultations" had been conducted on the bill, and that there would be another "two years' worth of extensive consultations" on the regulations stemming from it.

The committee also heard from Christelle Tessono, a tech policy researcher at the University of Toronto, who said the bill doesn’t address human rights risks that AI systems can cause.

She said at a minimum, the preamble to the bill should "acknowledge the well-established disproportionate impact these systems have on historically marginalized groups," such as Indigenous Peoples, people of colour, members of the LGBTQ+ community and economically disadvantaged individuals.

During his testimony, Balsillie outlined some of what he called "countless" incidents of harm caused by AI systems, including cases where they facilitated housing discrimination, made racist associations, showed job postings to men but not women, and recommended longer prison sentences for visible minorities.

The Assembly of First Nations also said it has concerns about AI, including racial profiling.

“First Nations have been treated as criminals when they try to open bank accounts and they have been subject to racial profiling in the health sector, by police, and government officials,” it said in its brief.

“Imagine the potential for such abuse to continue or even worsen when biased and prejudiced individuals and organizations are building AI systems that will implicate First Nations.”

The bill does little to reassure First Nations, it said.

Balsillie said the bill needs to be sent back to the drawing board. 

“Rushing to pass legislation so seriously flawed will only deepen citizens’ fears about AI because AIDA merely proves that policymakers can't effectively prevent current and emerging harms from emerging technologies." 

This report by The Canadian Press was first published Feb. 14, 2024.

Anja Karadeglija, The Canadian Press