
Stop Algorithm Bias, Explain Data Use

Full Title: An Act respecting transparency for online algorithms

Summary

This bill would set rules for how online communication services use algorithms and personal information. It aims to stop discriminatory outcomes and to require clearer public explanations of what data is used, how it is used, and why content is shown or hidden. The Canadian Radio-television and Telecommunications Commission (CRTC) would inspect and enforce compliance, and a new advisory committee would study impacts and report yearly.

  • Requires platforms to post plain-language notices about data collected, algorithm use, weighting of data categories, and cross-border transfers (s.4).
  • Bans algorithmic practices that cause adverse differential treatment based on prohibited grounds of discrimination (e.g., race, sex, disability) (s.5; Canadian Human Rights Act s.3(1)).
  • Prohibits discriminatory use of personal information for access to goods and services and for key opportunities like jobs, housing, credit, insurance, health care, and education (s.6).
  • Sets rules for de-identifying data and forbids re-identification except to test security (s.8).
  • Requires providers to keep all records needed to show compliance (s.9).
  • Establishes CRTC inspectors with powers to obtain documents and data and to order certain actions (ss.12–15), plus fines up to $1,000,000 (s.16).
  • Creates a 7‑member expert advisory committee to research and report annually (ss.10–12).

What it means for you

  • Households and users

    • You would see clear notices on service websites/apps about what personal data is collected, how algorithms use it, how data categories are weighted or ranked, and whether data moves across borders (s.4(a)–(d)). This starts 18 months after Royal Assent (s.22(2)).
    • You would be protected from algorithms that use your personal information in ways that result in adverse differential treatment on prohibited grounds (s.5). This starts 12 months after Royal Assent (s.22(1)).
    • Platforms would be barred from using your personal information to limit access to public-facing goods and services or to target job, housing, credit, insurance, health care, or education opportunities in discriminatory ways (s.6).
    • You would not get a private right to sue under this Act; enforcement runs through the CRTC and, for offences, the courts (ss.12–18). The bill does not specify a complaint process for users.
  • Workers and job seekers

    • Platforms could not use personal information in ways that make employment ads or opportunities unavailable to you based on prohibited grounds (s.6(b)).
    • Targeted outreach to under-represented groups is allowed if it is fair, not deceptive, and not of lower quality than general offers (s.7(b)).
  • Businesses that advertise

    • Ad tools for jobs, housing, credit, insurance, health care, or education must avoid targeting that leads to discriminatory unavailability (s.6(b)). Platforms, not advertisers, are directly regulated, but tool design and use may change to comply (ss.5–6).
  • Online communication service providers (social networks, forums, public-facing platforms)

    • Post required algorithm and data-use disclosures, including the method by which algorithms prioritize or weight data categories (s.4(b)–(d); s.22(2)).
    • Implement measures so moderation procedures, rules, and systems do not cause adverse differential treatment on prohibited grounds (s.5; s.22(1)).
    • Do not design or use features that make goods, services, or key opportunities unavailable in discriminatory ways (s.6).
    • If you de-identify user data, match safeguards to purpose and sensitivity, and do not re-identify except to test security (s.8).
    • Keep all records needed to show compliance; details may be set by regulation (s.9; s.21(d)).
    • Cooperate with CRTC inspectors, who can access documents and data, order activities to stop or start, and take copies; a warrant is required to enter a dwelling (ss.12–15).
    • Face fines of up to $1,000,000 on indictment for breaches of ss.4–6, with lesser fines for breaches of ss.8–9 (ss.16–18). Directors and officers can be personally liable for offences under ss.4–6 if they directed or influenced the conduct (s.17).
  • Startups and small providers

    • No size-based exemption is specified. If your service’s primary purpose is enabling users to communicate with other users and it is accessible in Canada, obligations apply unless the service enables only private communications (definitions; s.21(a)–(b)).

Expenses

Estimated net cost: Data unavailable.

  • No appropriation amounts are stated. The bill creates new duties for the CRTC (inspections, annual reports) and the Minister (reports) and pays remuneration and expenses for a 7‑member advisory committee (ss.10–12). Data unavailable.

  • Potential fines (actual revenue unknown):

    Item | Amount | Frequency | Source
    Fine for contravening ss.4–6 (on indictment) | Up to CAD $1,000,000 | Per offence | s.16(a)
    Fine for contravening ss.4–6 (summary) | Up to CAD $100,000 | Per offence | s.16(b)
    Fine for contravening s.8 or s.9 (summary) | Up to CAD $50,000 (first offence); up to CAD $100,000 (subsequent) | Per offence | s.16
  • Regulations may create administrative monetary penalties; amounts to be set later (s.21(e)). Data unavailable.

Proponents' View

  • The bill makes algorithm use more transparent by requiring a public description of data collected, how algorithms use it, and how categories are weighted or ranked (s.4(b)–(c)). This helps users understand why they see or do not see content.
  • It targets discrimination risks by imposing a results-focused duty to avoid adverse differential treatment in content moderation systems (s.5) and by barring discriminatory use of personal information for access to goods, services, and key opportunities (s.6).
  • It adds accountability through CRTC inspections, document access, and the ability to order remedial actions, backed by fines up to $1,000,000 (ss.12–16).
  • It limits privacy harms by setting standards for de-identification and banning re-identification except to test security controls (s.8).
  • It builds an evidence base via a 7‑member expert advisory committee that must research algorithmic discrimination and report to Parliament annually (ss.10–12).
  • It allows fair, non-deceptive outreach to under-represented groups, supporting inclusive access without lowering offer quality (s.7(b)).

Opponents' View

  • Key terms like “adverse differential treatment” are not defined in detail, which may create uncertainty about what outcomes trigger liability and how to measure compliance (s.5).
  • Disclosure of how algorithms weigh categories could expose trade secrets or enable manipulation of ranking systems, increasing security and gaming risks (s.4(c)).
  • Compliance burdens may be high for smaller providers, given broad record-keeping, system changes, and inspection readiness, with no small-entity exemption (s.9; ss.12–15).
  • Inspector powers to access data, order activities to stop or start, and remove materials may raise confidentiality and data-protection concerns, especially for sensitive or proprietary information (s.13(2)).
  • The Act overlaps with existing human rights and privacy regimes and relies on later regulations for key details (e.g., what counts as “private communications,” record formats, and administrative penalties), which could delay clarity (s.21(a)–(e)).
  • Restrictions on targeting for jobs, housing, credit, insurance, health care, and education may lead platforms to limit or simplify ad tools broadly to avoid risk, potentially reducing useful outreach even when lawful (s.6; s.7(b)).

Timeline

Jun 17, 2022 • House
First reading

Technology and Innovation
Social Issues
Labor and Employment
Housing and Urban Development
Healthcare
Education
Trade and Commerce