Navigating Canada's Online Harms Act: An Overview

written by

Basia Walczak, Counsel, INQ Law

The amount of harmful content on social media has raised significant concerns about its impact on individuals, communities, and society at large. Cyberbullying, hate speech, mis/disinformation, and the distribution of illegal content, including content sexually exploiting children, have become increasingly commonplace.

Existing laws and regulations in Canada have struggled to keep pace with the rapidly evolving nature of online harms, leading to calls for more robust legislative measures¹ to address these challenges.

What is Bill C-63?

In response to the growing challenges posed by harmful content on the internet, the Canadian government has introduced Bill C-63, the Online Harms Act (“the Act”).² This comprehensive legislation has two aims: holding social media services to account in addressing harmful content hosted on their platforms and instilling greater transparency around how this content is managed.³

Who Does Bill C-63 Apply To?

Bill C-63 applies to operators of social media services, including livestreaming and user-uploaded adult-content services, such as Facebook, Twitch, and PornHub.⁴ A social media service is defined as the following: “a website or application that is accessible in Canada, the primary purpose of which is to facilitate interprovincial or international online communication among users of the website or application by enabling them to access and share content.‍”⁵

Key Provisions of Bill C-63

Bill C-63 intends to mitigate online harms by enacting the following measures:

  1. Establishing New Duties for Social Media Services: social media services would be required to remove two categories of content:⁶ (1) content that sexually victimizes a child or revictimizes a survivor, and (2) intimate content posted without consent. Bill C-63 likewise imposes a duty to keep records, including all information and data necessary to determine compliance with the Act.

  2. Increased Regulatory Oversight: Bill C-63 would establish a new Digital Safety Ombudsperson⁷ to act as a resource and advocate for the public interest on issues related to online safety, and create a new Digital Safety Commission⁸ to oversee and enforce the Act’s regulatory framework.

  3. New Protections for Children: social media services would be required to implement mechanisms for flagging harmful content, create safety measures geared towards children, and establish measures to reduce exposure to seven categories of harmful content,⁹ including content that involves bullying children or promoting self-harm among young people.¹⁰

  4. Increased Penalties: non-compliance with Bill C-63 could result in significant penalties of up to 6% of gross global revenue or $10 million, whichever is greater.

Looking Ahead

  • Keep yourself apprised of any developments concerning Bill C-63 and its implications for various practice areas, including criminal law, human rights law, and online regulation.

  • Bill C-63 introduces significant new obligations and regulatory requirements for social media services, requiring proactive measures to prevent and mitigate the spread of harmful content. Consider assessing existing content moderation processes, particularly those directed at content featuring children, and compare them to the provisions under Bill C-63.

  • Consider reviewing current data retention and destruction policies in accordance with Bill C-63.



Although Bill C-63 has yet to receive Royal Assent, it represents a significant step towards addressing the complex challenges posed by harmful content online.

By understanding the key provisions of Bill C-63 and its implications, legal professionals and those operating social media services can effectively navigate the evolving landscape of online regulation in Canada and contribute to efforts to create a safer and more secure online environment for all.

INQ is available to assist with practical strategies concerning this new piece of legislation including through improving your organization’s privacy and data practices.


¹ The Act proposes amendments to the Criminal Code, the Canadian Human Rights Act, and An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service. Bill C-63 amends the Criminal Code to strengthen penalties for certain offences related to online harms, such as cyberbullying, harassment, and the distribution of child sexual abuse material. One amendment would also introduce a new offence (punishable by up to life imprisonment) for inciting genocide. With regard to the Canadian Human Rights Act, Bill C-63 would make the “communication of hate speech” by means of the internet, or other telecommunication means, a discriminatory practice. Lastly, the amended An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service would require persons who provide internet services to report suspected cases of internet child pornography to law enforcement authorities.

⁴ Factors considered in defining a “social media service” include the number of users on a platform and whether there is a “significant risk that harmful content is accessible on the service.”

⁶ Social media services would be subject to a new 24-hour takedown requirement for making certain content inaccessible, with added measures allowing users to flag the content directly and file a complaint about it with the new Digital Safety Commission.

⁷ The Commission would oversee and enforce the new regulatory framework, and the Ombudsperson would act as a resource and advocate for users and victims. Their responsibilities would include gathering information from users on an ongoing basis and issuing calls for written submissions to solicit views on specific issues; conducting consultations with users and victims; and directing users to appropriate resources such as law enforcement or help lines.

⁸ The new Digital Safety Commission would have the following powers: enforcing legislative and regulatory obligations and holding online services accountable for their responsibilities by auditing for compliance, issuing compliance orders, and penalizing services that fail to comply; collecting, triaging, and administering user complaints and reports about services’ obligations; enforcing the removal of content that sexually victimizes a child or revictimizes a survivor and intimate content communicated without consent; and setting new standards for online safety by providing guidance to services on how to mitigate risk, performing research, working with stakeholders, and developing educational resources for the public, including children and parents.

⁹ Bill C-63 is focused on the following seven types of content online: content that sexually victimizes a child or revictimizes a survivor; intimate content communicated without consent; violent extremist and terrorist content; content that incites violence; content that foments hatred; content used to bully a child; and content that induces a child to harm themselves.
