Canada’s proposed Bill C-63, officially known as the Online Harms Act, is being presented by the federal government as a long-overdue solution to combat online hate, violence, and exploitation. On the surface, the bill appears well-intentioned. But as with many modern laws aimed at curbing harmful speech, the details reveal serious concerns for free expression, civil liberties, and due process.
As Canadian citizens, we should understand both what this bill aims to do and what unintended consequences may follow if it is passed into law without major revisions.
What Is Bill C-63?
Introduced by the Liberal government, Bill C-63 seeks to address seven categories of harmful online content:
- Intimate content communicated without consent
- Content that sexually victimizes a child or revictimizes a survivor
- Content that induces a child to harm themselves
- Content used to bully a child
- Content that foments hatred
- Content that incites violence
- Content that incites violent extremism or terrorism
It proposes new responsibilities for digital platforms, creates several new regulatory bodies, and amends multiple existing laws, including the Criminal Code, the Canadian Human Rights Act, and An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service.
Read the Complete Proposed Bill Here:
https://www.justice.gc.ca/eng/csj-sjc/pl/charter-charte/c63.html
⚠️ What Are the Key Concerns?
While the objective — protecting Canadians, particularly children and marginalized groups, from serious harm — is commendable, several parts of the bill raise red flags:
1. Revival of Section 13 of the Canadian Human Rights Act
Bill C-63 reinstates Section 13, which allows individuals to file complaints for online speech that is likely to expose others to hatred or contempt.
- Problem: This provision was repealed in 2013 over concerns about vague language and its chilling effect on free speech. It allows for legal consequences based not on intent or demonstrated harm, but on a prediction that the speech is likely to expose someone to hatred.
2. Preventative Orders Based on Anticipated Harm
The bill introduces pre-crime-style mechanisms: courts could issue peace bonds and impose restrictions on individuals when there are reasonable grounds to fear they will commit a hate-related offence.
- Problem: This erodes the principle of “innocent until proven guilty” by allowing penalties based on speculation and fear rather than actual criminal behavior.
3. The Creation of Powerful New Enforcement Bodies
Bill C-63 establishes a Digital Safety Commission, a Digital Safety Ombudsperson, and a Digital Safety Office. These agencies would have sweeping authority to investigate, demand takedowns, conduct audits, and impose massive financial penalties.
- Problem: There’s limited judicial oversight, and the scope of their power could easily be exploited or expanded by future governments with less democratic intent.
4. Heavy Duties Imposed on Platforms
Social media services would be required to:
- Monitor for harmful content
- Remove or restrict access quickly
- Provide detailed compliance reports
- Problem: The definition of “harmful” is subjective. To avoid penalties, platforms may over-censor, removing even legal content that is critical, controversial, or unpopular — effectively chilling public discourse.
Why Citizens Should Be Cautious
- Free speech is a foundational value. While hate must be addressed, the solution cannot involve vague thresholds that allow punishment based on perception.
- Regulatory overreach is dangerous. Once government bodies are given sweeping powers, it’s extremely difficult to contain them — especially under shifting political leadership.
- The potential for politicized enforcement is real. What one administration deems hateful or harmful could be redefined later to suppress dissent, journalism, satire, or inconvenient truths.
- Digital platforms may become de facto censors. With heavy financial penalties on the line, companies may choose risk aversion over nuance — and the public will suffer for it.
What’s Good About the Bill?
Let’s be fair: Bill C-63 does include necessary steps toward protecting children from exploitation, cracking down on non-consensual image sharing, and addressing the rise of coordinated online abuse campaigns. Its core intent, protecting people from genuine online harm, is valid and needed. But the implementation raises serious concerns for freedom of expression, due process, and proportionality. Citizens should push for clearer definitions, stronger oversight, and firm safeguards to ensure the law doesn’t become a tool for silencing unpopular or inconvenient voices.
It’s the scope and execution that need serious scrutiny.
Final Thoughts
Calling for better laws doesn’t mean defending hate speech — it means ensuring we don’t compromise civil liberties in the name of public safety. Canada needs digital protections, but not at the expense of free expression, due process, and the presumption of innocence.
We can address harm and preserve liberty — but not if we hand unchecked authority to regulatory bodies with vague mandates. This is one of those moments where we must read the fine print, ask tough questions, and demand clear, limited, and accountable legislation.
Bill C-63 should be debated, revised, and improved — not blindly accepted in the name of safety.