Platform operators (social media, gaming, apps, hosting, email and messaging services)
- You must design with minors’ best interests in mind and reduce listed harms (e.g., sexual exploitation, self‑harm promotion, addiction‑like patterns).
- Provide clear safety settings, parental controls, ad labels, data use explanations, and a reporting channel with an internal response process.
- Set protective defaults, offer a chronological feed option, and use privacy‑preserving age checks.
- Keep audit logs, complete an independent safety review every two years, and publish annual reports with risk and usage metrics.
- You may not use “dark patterns” that nudge users into weakening safety settings, nor require a digital ID as a condition of access.
- Non‑compliance can bring fines of up to tens of millions of dollars.
People who are harassed online
- Repeated online contact can count as criminal harassment.
- Anonymous or fake‑identity harassment can lead to tougher sentences. Courts can order steps to identify the harasser.
- Courts can order no‑contact conditions and limit an offender’s internet use.
- Victims can obtain restitution for the costs of removing intimate images, including fabricated (“fake”) intimate images, from the internet.
Law enforcement and regulators
- A single designated body will receive mandatory reports about child sexual abuse material from internet services, with certain technical data when the content is clearly illegal.
- Required data must be preserved for one year, with annual reports made to federal ministers.
- The CRTC will set guidelines for research involving minors.