Crackdown on Online Hate and Child Exploitation

Full Title: An Act to enact the Online Harms Act, to amend the Criminal Code, the Canadian Human Rights Act and An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service and to make consequential and related amendments to other Acts

Summary#

Bill C-63 creates a new Online Harms Act and makes related changes to the Criminal Code, the Canadian Human Rights Act, and the child‑pornography mandatory reporting law. It sets duties for large social media services to reduce exposure to harmful content, protect children, and rapidly block certain illegal content. It also stiffens penalties for hate propaganda, creates a new “hate crime” offence, and re‑establishes a civil route to address online hate speech.

  • Creates the Digital Safety Commission, Digital Safety Ombudsperson, and Digital Safety Office to regulate and enforce online safety (Part 1).
  • Requires regulated social media services to mitigate harmful content, add child‑safety design features, and make child sexual exploitation content and non‑consensual intimate content inaccessible, in many cases within 24 hours (Part 4).
  • Allows users to complain to the Commission and gives it strong enforcement tools, including inspections and administrative monetary penalties of up to the greater of 6% of gross global revenue or $10,000,000; criminal offences carry fines of up to the greater of 8% of gross global revenue or $25,000,000 (Part 7).
  • Increases Criminal Code penalties for hate propaganda, defines “hatred,” and creates a new hate‑crime offence punishable by up to life imprisonment (Part 2).
  • Re‑introduces a Canadian Human Rights Act provision letting people file complaints about online hate speech with possible remedies up to $20,000 compensation and a $50,000 penalty (Part 3).
  • Expands and streamlines mandatory reporting of child pornography by Internet services and lengthens data preservation to 1 year (Part 4).

What it means for you#

  • Households and users

    • You can flag harmful content on major social platforms; platforms must confirm receipt and say what action they took or why none was taken (Part 4: Tools and processes to flag).
    • You can complain to the Commission about child sexual exploitation content or non‑consensual intimate content; the Commission can order platforms to make it inaccessible in Canada, including permanently (Part 6).
    • Private messaging features are excluded from these platform duties (Interpretation: Exclusion of private messaging feature).
    • Platforms must offer tools to block other users (Part 4: Tools to block users).
  • Parents and children

    • Platforms must add child‑protection design features set by regulation (examples may include parental controls and child‑specific privacy settings) (Part 4; Regulations, paragraph (o)).
    • Harmful content includes bullying and content that induces a child to self‑harm (Interpretation: definitions).
    • Operators are not required to proactively search for harmful content, except that regulations may require upload‑blocking technology for child sexual exploitation material (Interpretation: Proactive search not required; possible CSAM upload prevention).
  • Victims of image‑based abuse and child sexual exploitation

    • Platforms must make flagged or identified content inaccessible to people in Canada, generally within 24 hours, while they decide if it must stay down (Part 4: Duty to Make Certain Content Inaccessible).
    • You can seek help through the Commission’s process. For online hate speech, you may file a human rights complaint; if the complaint is upheld, the tribunal may order up to CAD $20,000 in compensation and a penalty of up to CAD $50,000 against the respondent (CHRA s.13; s.53.2).
  • Content creators and community moderators

    • If your content is flagged as certain illegal content, you will be notified and may make representations; you can request reconsideration of platform decisions (Part 4: Representations; Reconsideration).
    • Platforms must label harmful content that has been unusually amplified by automated programs (bots) when certain conditions are met (Part 4: Multiple instances of automated communication).
  • Platform operators (social media services accessible in Canada)

    • New duties: act responsibly to mitigate user exposure to harmful content; publish user guidelines; enable flagging; provide a resource person; keep compliance records; and file a public digital safety plan with detailed metrics (Part 4).
    • Must integrate child‑safety design features set by regulation (Part 4; Regulations, paragraph (o)).
    • Must preserve content that incites violence or violent extremism for 1 year if made inaccessible (Part 4: Duty to preserve).
    • Subject to inspections, orders, and administrative monetary penalties up to the greater of 6% of gross global revenue or $10,000,000 (Part 7: Maximum penalty).
    • Criminal offences carry fines up to the greater of 8% of gross global revenue or CAD $25,000,000 (Part 7: Offences — operators).
    • May be required to pay regulator charges to recover program costs (Part 9: Cost Recovery).
    • Not required to proactively search for harmful content, subject to any CSAM upload‑prevention regulations (Interpretation).
  • Researchers and civil society groups

    • Accredited organizations may access platform data inventories from safety plans and, by Commission order, specific electronic data for research under strict conditions (Part 5).
  • Internet service providers (ISPs), hosts, and email services

    • Must notify a designated law enforcement body when the service is used to commit a child‑pornography offence; must include transmission data when the content is manifestly child pornography; and must preserve related data for 1 year (Part 4, amending the mandatory reporting Act).
  • Timeline

    • Criminal Code changes: take effect on the 90th day after Royal Assent (Part 2: Coming into Force).
    • Mandatory reporting changes: take effect 6 months after Royal Assent (Part 4: Coming into Force).
    • Most Online Harms Act provisions: take effect on dates set by order of the Governor in Council (Part 10: Coming into Force).

Expenses#

  • At a glance: estimated net cost is unavailable.

  • Key points

    • Creates new federal entities (Commission, Ombudsperson, Office) and authorizes spending; a Royal Recommendation accompanies the bill (Minister of Justice Recommendation; Part 1).
    • Allows cost recovery from regulated services through charges set by regulation (Part 9).
    • Enables administrative monetary penalties and court‑enforceable fines; revenues are payable to the Receiver General (Part 7), but amounts collected are unknown.
    • Imposes reporting and enforcement duties on federal bodies and a designated law enforcement entity; incremental costs not published.

  • Cost items (amount / frequency / source)

    • Digital Safety Commission/Office/Ombudsperson setup and operations: Data unavailable / Ongoing / Part 1
    • Regulatory charges on operators to recover costs: Data unavailable / Ongoing / Part 9
    • Administrative monetary penalties (up to 6% or $10,000,000): Data unavailable / As imposed / Part 7
    • Criminal fines (up to 8% or CAD $25,000,000): Data unavailable / As imposed / Part 7
    • CHRA complaint handling, tribunal orders and enforcement: Data unavailable / Ongoing / Part 3

Proponents' View#

  • Focused child protection and rapid relief: 24‑hour timelines to make certain illegal content (child sexual exploitation; non‑consensual intimate images) inaccessible and a clear complaint route through the Commission (Part 4; Part 6).
  • Strong but targeted platform duties: duty to act responsibly, publish safety plans with metrics, and add child‑safety design features set by regulation (Part 4; Regulations, paragraph (o)).
  • Serious penalties and enforcement tools to ensure compliance, scaled to global revenue (up to 6% AMPs; up to 8% criminal fines) (Part 7).
  • Speech safeguards: “hatred” is defined as detestation or vilification; the law clarifies that being discredited, humiliated, hurt, or offended is not enough (Criminal Code s.319(7)–(8); Online Harms Act: Purposes; Part 4: No unreasonable or disproportionate limit on expression).
  • Civil route for online hate: re‑establishes CHRA complaints with capped remedies (up to CAD $20,000 compensation; CAD $50,000 penalty) and protections for complainant identity (CHRA s.13; s.40(8)–(13); s.53.2).
  • Practical scope limits: private messaging excluded; no general duty to proactively search content (Interpretation and Application).

Opponents' View#

  • Risk to lawful expression: “harmful content” (e.g., bullying, content that induces self‑harm, content that “foments hatred”) is broad; 24‑hour deadlines and high penalties may push platforms to over‑remove lawful speech (Interpretation: definitions; Part 4; Part 7).
  • Due process concerns: a new recognizance (peace bond) for a feared hate propaganda offence or hate crime allows conditions (including electronic monitoring and bodily‑substance testing) to be imposed without a conviction (Criminal Code s.810.012; s.810.3).
  • Privacy and data‑access risks: inspectors can compel access to documents and systems, and accredited third parties can obtain platform data for research by order, raising confidentiality and security questions despite safeguards (Part 5; Part 7: Inspections; Part 8).
  • Burden on smaller services: even services below the user threshold can be designated as regulated if the Governor in Council sees significant risk (Application: Regulated service; subsection (3)), creating compliance costs that may be hard to absorb.
  • Implementation uncertainty and costs: new federal bodies, regulations, and cost‑recovery charges on operators lack published budgets; charges may be passed on to users or creators (Part 1; Part 9).

Timeline

Feb 26, 2024 • House • First reading
Sep 23, 2024 • House • Second reading

Topics: Technology and Innovation, Criminal Justice, Social Issues