Canada’s proposed Bill C-63, officially known as the Online Harms Act, is being presented by the federal government as a long-overdue solution to combat online hate, violence, and exploitation. On the surface, the bill appears well-intentioned. But as with many modern laws aimed at curbing harmful speech, the details reveal serious concerns for free expression, civil liberties, and due process.

Canadian citizens should understand both what this bill aims to do and what unintended consequences may follow if it is passed into law without major revisions.

What Is Bill C-63?

Introduced by the Liberal government, Bill C-63 seeks to address seven categories of harmful online content:

  1. Child sexual exploitation
  2. Non-consensual sharing of intimate images
  3. Content that incites violence
  4. Content that foments hatred
  5. Cyberbullying
  6. Inducement to self-harm or suicide
  7. Violent or sexual content targeted at minors

It proposes new responsibilities for digital platforms, creates several new regulatory bodies, and amends multiple existing laws — including the Criminal Code, the Canadian Human Rights Act, and the Youth Criminal Justice Act.

Key Provisions and Concerns:

Broad Definitions of Harmful Content:

The bill targets seven categories of harmful content, including child exploitation, non-consensual intimate images, cyberbullying, and content that incites violence or hatred. However, critics argue that the definitions, particularly for content that “foments hatred,” are vague and could encompass lawful but controversial speech, leading to potential over-censorship.

Expansion of Regulatory Bodies:

Bill C-63 proposes the creation of a Digital Safety Commission, a Digital Safety Ombudsperson, and a Digital Safety Office. These bodies would have the authority to enforce compliance, conduct audits, and impose penalties on social media platforms. While intended to ensure accountability, there are concerns about the concentration of power and the potential for these bodies to suppress dissenting voices.

Amendments to the Criminal Code:

The bill introduces new offenses related to hate propaganda, including the possibility of life imprisonment for advocating genocide. It also allows for preventive measures, such as peace bonds, based on the fear that an individual may commit a hate crime. Such provisions have been criticized for potentially infringing on the presumption of innocence and enabling punitive actions without concrete evidence of wrongdoing.

Revival of Section 13 of the Canadian Human Rights Act:

Bill C-63 seeks to reinstate Section 13, which was previously repealed due to concerns over its impact on free speech. The section allowed for complaints against individuals for online communications deemed likely to expose others to hatred or contempt. Its revival raises fears of renewed censorship and the stifling of legitimate discourse.

Implications for Social Media Platforms:

The act imposes duties on social media services to act responsibly, protect children, make certain content inaccessible, and maintain records. Non-compliance could result in substantial fines, up to 6% of global revenue or $10 million, whichever is greater. While aiming to hold platforms accountable, there is apprehension about the feasibility of these requirements and their potential to stifle innovation.

While Bill C-63 endeavors to create a safer online environment, it is essential to balance this goal with the protection of fundamental freedoms. The concerns highlighted suggest a need for careful reconsideration and possible revision of the bill to ensure that it does not inadvertently suppress lawful expression or grant excessive power to regulatory bodies. Engagement with diverse stakeholders and transparent dialogue will be crucial in refining the legislation to effectively address online harms without compromising civil liberties.

⚠️ What Are the Key Concerns?

While the objective — protecting Canadians, particularly children and marginalized groups, from serious harm — is commendable, several parts of the bill raise red flags:

1. Revival of Section 13 of the Human Rights Act

Bill C-63 reinstates Section 13, which allows individuals to file complaints for online speech that is likely to expose others to hatred or contempt.

  • Problem: This provision was repealed in 2013 due to concerns about vague language and its chilling effect on free speech. It allows for legal consequences based not on intent or harm done, but on the likelihood of someone interpreting the speech as hateful.

2. Preventative Orders Based on Anticipated Harm

The bill introduces pre-crime-style mechanisms: courts could issue peace bonds and impose restrictions on individuals believed likely to commit a hate-related offence.

  • Problem: This erodes the principle of “innocent until proven guilty” by allowing penalties based on speculation and fear rather than actual criminal behavior.

3. The Creation of Powerful New Enforcement Bodies

Bill C-63 establishes a Digital Safety Commission, a Digital Safety Ombudsperson, and a Digital Safety Office. These agencies would have sweeping authority to investigate, demand takedowns, conduct audits, and impose massive financial penalties.

  • Problem: There’s limited judicial oversight, and the scope of their power could easily be exploited or expanded by future governments with less democratic intent.

4. Heavy Duties Imposed on Platforms

Social media services would be required to:

  • Monitor for harmful content
  • Remove or restrict access quickly
  • Provide detailed compliance reports

  • Problem: The definition of “harmful” is subjective. To avoid penalties, platforms may over-censor, removing even legal content that is critical, controversial, or unpopular — effectively chilling public discourse.

Why Citizens Should Be Cautious

  • Free speech is a foundational value. While hate must be addressed, the solution cannot involve vague thresholds that allow punishment based on perception.
  • Regulatory overreach is dangerous. Once government bodies are given sweeping powers, it’s extremely difficult to contain them — especially under shifting political leadership.
  • The potential for politicized enforcement is real. What one administration deems hateful or harmful could be redefined later to suppress dissent, journalism, satire, or inconvenient truths.
  • Digital platforms may become de facto censors. With heavy financial penalties on the line, companies may choose risk aversion over nuance — and the public will suffer for it.

What’s Good About the Bill?

Let’s be fair: Bill C-63 does include necessary steps toward protecting children from exploitation, cracking down on non-consensual image sharing, and addressing the rise of coordinated online abuse campaigns. Its core intent, protecting people from genuine online harm, is valid and needed. But the implementation raises serious red flags for freedom of expression, due process, and proportionality. Citizens should push for clearer definitions, stronger oversight, and firm safeguards to ensure the law doesn’t become a tool for silencing unpopular or inconvenient voices.

It’s the scope and execution that need serious scrutiny.

Final Thoughts

Calling for better laws doesn’t mean defending hate speech — it means ensuring we don’t compromise civil liberties in the name of public safety. Canada needs digital protections, but not at the expense of free expression, due process, and the presumption of innocence.

We can address harm and preserve liberty — but not if we hand unchecked authority to regulatory bodies with vague mandates. This is one of those moments where we must read the fine print, ask tough questions, and demand clear, limited, and accountable legislation.

Bill C-63 should be debated, revised, and improved — not blindly accepted in the name of safety.