EXECUTIVE SUMMARY
I. The LCO Defamation Law in the Internet Age Project
The LCO’s Final Report on Defamation Law in the Internet Age is a comprehensive review of Ontario’s defamation law and its application to online communication. The report examines how current defamation law applies to online speech, considers the implications of new technologies such as social media, and makes recommendations for reform.
The report recommends that Ontario adopt modernized versions of the common law defences of truth and of responsible communication on matters of public interest. These defences would provide greater protection for those who make statements about matters of public interest in good faith, while still allowing individuals to seek redress when they have been defamed. The report also recommends that Ontario introduce a statutory cause of action for serious harm caused by false statements made online or through other digital media.
In addition, the report recommends that Ontario should create an independent tribunal to hear complaints about online defamation and provide remedies such as orders requiring removal or correction of false statements. The tribunal would be empowered to order damages, costs, and other remedies appropriate in the circumstances. Finally, the report recommends that Ontario should consider introducing legislation to protect individuals from maliciously false statements made online with intent to harm their reputation or economic interests.
Overall, the Final Report provides a comprehensive overview of the current state of Ontario’s defamation law, identifies areas for reform, and makes recommendations on how to modernize the law. It offers practical solutions for protecting individuals’ reputations while preserving freedom of expression online, and includes an analysis of the impact of defamation law on freedom of expression in Ontario.
The Final Report is divided into four parts: Part I provides an overview of the project; Part II examines the current state of Ontario’s defamation law; Part III outlines areas for reform; and Part IV sets out recommendations for reform. The report also includes appendices that provide additional information about the project, including a summary of public consultations and research findings.
The LCO’s research and analysis included a comprehensive review of the existing legal framework, including an examination of the common law, statutes, case law and academic literature. We also consulted with a wide range of stakeholders, including members of the public, academics, media organizations, lawyers and other experts.
The LCO’s Final Report, Defamation Law in the Internet Age, contains a comprehensive set of recommendations for reform of Ontario’s defamation law. These recommendations are designed to modernize the law while protecting both freedom of expression and reputation, including a recommendation that the Libel and Slander Act be repealed and replaced by a new Defamation Act establishing a modern framework for resolving defamation complaints.
The LCO has considered the impact of the internet on defamation law in two ways. First, it has examined how existing defamation law applies to online activities and whether any changes are needed to ensure that the law is effective in this context. Second, it has considered whether there are any new or additional legal protections that should be provided for online activities.
In terms of existing defamation law, the LCO has concluded that there is a need for greater clarity and certainty regarding the application of the law to online activities. This includes clarifying when an individual can be held liable for defamatory content posted by another person, as well as providing guidance on when a website or other online platform can be held liable for defamatory content posted by its users. The LCO also recommends that consideration be given to introducing a “notice-and-takedown” system similar to those used in other jurisdictions, which would allow individuals and organizations to request that defamatory content be removed from websites and other online platforms.
In terms of new or additional legal protections, the LCO recommends that consideration be given to mechanisms used in some other jurisdictions, including: a “right of reply”, which would allow individuals who have been defamed online to respond publicly with their own statement of facts; a “right of erasure”, which would allow them to request that their personal information be removed from websites and other online platforms; and a “right of access”, which would allow them to obtain copies of any documents containing their personal information held by websites and other online platforms.
The internet has revolutionised the way we communicate, and with it, the way we interact with each other. It has enabled us to connect with people from all over the world, share our thoughts and opinions, and access information quickly and easily. However, this new technology also brings with it a range of legal issues that need to be addressed.
One of these issues is the role of internet intermediaries such as Facebook and Google in facilitating online communications. These companies provide platforms for users to post content, but they generally do not create or moderate that content themselves. Nevertheless, in some circumstances they may be held liable for defamatory material posted on their sites if they fail to take reasonable steps to remove it.
In response to this issue, many countries have introduced legislation which seeks to protect internet intermediaries from liability for user-generated content. These laws generally provide immunity from liability if the intermediary takes reasonable steps to remove offensive material when notified by a third party. This approach has been criticised by some as being too lenient on companies who fail to take action against offensive content.
At the same time, there is an ongoing debate about how best to balance freedom of expression with protecting individuals from defamation online. Some argue that existing defamation laws are outdated and do not adequately protect individuals from malicious online attacks or false accusations made against them. Others argue that existing laws are sufficient and should be enforced more rigorously rather than introducing new legislation which could potentially limit freedom of expression online.
Ultimately, it is clear that there is no one-size-fits-all solution when it comes to regulating online communications and protecting individuals from defamation online. Each country must find its own balance between protecting freedom of expression and ensuring individuals are protected from malicious attacks or false accusations made against them on the internet.
The internet has also created a new platform for anonymous defamation. It is much easier to post defamatory content anonymously online than it is in traditional media, making it difficult to identify the source of the defamation and hold them accountable. This can lead to a chilling effect on free speech, as people may be less likely to express their opinions if they fear being targeted by anonymous defamers.
Finally, the internet has made it easier for people to spread false information quickly and widely. This can have serious implications for reputation management, as false information can spread rapidly and be difficult to contain or correct.
We recommend that the Ontario government amend the Libel and Slander Act (LSA) to:
1. Establish a single publication rule for internet content, so that the limitation period runs from the date a defamatory statement is first published online, rather than each subsequent access being treated as a new publication giving rise to a new cause of action;
2. Create an explicit defence for internet intermediaries who are not responsible for creating or publishing defamatory content;
3. Introduce a notice-and-takedown procedure to allow individuals to request the removal of defamatory content from websites; and
4. Provide immunity from liability for those who comply with such requests in good faith.
These changes would bring Ontario’s defamation law into line with other areas of law that regulate offensive internet content, while also providing greater protection for individuals whose reputations have been damaged by online defamation.
The Report’s recommendations are organized into four main areas:
1. Defamation Law Reform: The Report recommends a number of reforms to modernize defamation law, including changes to the defences of truth and responsible communication on matters of public interest, as well as changes to the rules governing damages and costs.
2. Access to Justice: The Report recommends measures to reduce barriers to access to justice for those seeking redress for online reputational harm, including changes to court procedures and the introduction of an expedited process for resolving certain types of claims.
3. Internet Intermediary Responsibility: The Report recommends measures designed to promote internet intermediary responsibility for defamatory content posted by their users, including a new notice-and-notice system and a new notice-and-takedown system.
4. Education and Awareness: The Report recommends measures designed to increase public awareness about defamation law and how it applies in the online context, as well as initiatives aimed at educating internet intermediaries about their legal obligations with respect to defamatory content posted by their users.
The LCO Defamation Law in the Internet Age project is a law reform project conducted by the Law Commission of Ontario (LCO). The project examines the current state of defamation law in Ontario and makes recommendations for reform. It focuses on how defamation law affects freedom of expression, access to justice, and other related issues, and on how technology has changed the way people communicate and how this has affected defamation law. The project included public consultations, research, and analysis, culminating in this Final Report.
II. The Law Commission of Ontario
The Law Commission of Ontario (LCO) is Ontario’s leading independent law reform agency. The LCO provides independent, balanced, and authoritative advice on complex and important legal policy issues, and through this work promotes access to justice, evidence-based law reform, and informed public debate. The LCO is located at Osgoode Hall Law School, York University, and is supported by the Law Foundation of Ontario, the Ministry of the Attorney General, Osgoode Hall Law School, and the Law Society of Ontario. Its projects combine legal research, consultation with affected communities and stakeholders, and analysis, leading to published reports and recommendations for reform.
III. Why is Defamation Law Reform Important in the Internet Age?
Defamation law is based on the common law tort of defamation. To succeed in a defamation claim, a plaintiff must establish that: (1) the impugned words were defamatory, in the sense that they would tend to lower the plaintiff’s reputation in the eyes of a reasonable person; (2) the words in fact referred to the plaintiff; and (3) the words were published, meaning that they were communicated to at least one person other than the plaintiff.5 Once these elements are proven, the words are presumed to be false and damage to reputation is presumed.6
The defences available to a defendant in a defamation action include truth (justification), absolute privilege, qualified privilege, fair comment and responsible communication on matters of public interest.7 In addition, there are statutory provisions, such as those found in Ontario’s Libel and Slander Act8 or in the defamation statutes of other provinces.9
In recent years, Canadian courts have been increasingly willing to recognize new defences such as responsible journalism10 and responsible communication on matters of public interest11 which provide greater protection for freedom of expression while still protecting reputation.
Today, however, the social context has changed. Freedom of expression is now seen as a fundamental right and is increasingly valued by society. This shift in values has led to a re-evaluation of defamation law, with more emphasis being placed on protecting freedom of expression. In many jurisdictions, this has resulted in a relaxation of the standards for proving defamation and an increased focus on balancing the interests of reputation and free speech.
The modern approach to defamation law is based on the principle that individuals should be able to express their opinions without fear of legal repercussions. This means that the law must strike a balance between protecting an individual’s right to freedom of expression and protecting another individual’s reputation. To achieve this balance, courts have developed a number of defences which can be used by defendants in defamation cases. These defences include truth, fair comment, qualified privilege and responsible communication on matters of public interest. In addition, legislation has been enacted in some jurisdictions which provides additional protection for those who publish material in the public interest.
The complexity of defamation law has been further compounded by the emergence of the internet. The internet has enabled people to communicate with a much wider audience than ever before, and this has led to an increase in the number of defamation cases being brought before the courts. This has resulted in a need for greater clarity and consistency in the application of defamation law, as well as an increased focus on balancing freedom of expression with protection from harm.
The internet has made it easier for people to communicate with each other, and this has had a significant effect on defamation law. The internet allows for the rapid dissemination of information, which can be difficult to control or contain. This means that defamatory statements can spread quickly and widely, making it more difficult to identify the source of the statement and take action against them. Additionally, because of the global nature of the internet, it is often unclear which jurisdiction’s laws apply in a particular case. This can make it difficult to determine what constitutes defamation in a given situation.
The internet also makes it easier for anonymous users to post defamatory statements without fear of repercussions. This has led to an increase in online harassment and cyberbullying, as well as false accusations and rumors that can have serious consequences for those targeted by them. In response, many countries have passed laws specifically targeting online defamation and other forms of cybercrime.
The internet has also changed how courts approach defamation cases. Courts are now more likely to consider factors such as the size of an audience when determining whether a statement is defamatory or not. Additionally, courts may consider whether a statement was posted with malicious intent or was simply an honest mistake when deciding how much should be awarded in damages.
Finally, the internet has also changed the way defamation is litigated. Online defamation cases often involve multiple defendants and multiple jurisdictions, making them more complex than traditional defamation cases. Furthermore, online platforms such as social media have made it easier for individuals to publish defamatory statements, making it more difficult for plaintiffs to identify and sue the responsible parties. As a result, courts have had to develop new legal doctrines and procedures to address these issues.
The internet has changed the landscape of defamation litigation. The traditional paradigm defendant is no longer the only player in the game. The internet has enabled individuals to become publishers, and to reach a global audience with their publications. It has also enabled individuals to become targets of defamatory publications, often from anonymous sources.
The LCO heard from many stakeholders that the current law does not adequately address these new realities. In particular, there was consensus that the current law does not provide sufficient protection for freedom of expression online, nor sufficient remedies for those who are victims of online defamation.
The LCO heard from some stakeholders that the current law should be amended to provide greater protection for freedom of expression online. This could include providing additional defences or immunities for certain types of speech or speakers, such as bloggers or other non-professional media organizations. Other stakeholders suggested that existing defences should be clarified or expanded to better reflect the realities of online publishing and communication. For example, some stakeholders suggested that an “honest opinion” defence should be available even if it is based on facts which are false or unproven; others suggested that a defence should exist where a person publishes information without knowledge of its falsity but with reasonable grounds for believing it to be true; still others suggested that an “innocent dissemination” defence should apply even if a publisher knows about the defamatory content but does not have control over it (e.g., because it is posted by a third party).
The LCO heard from other stakeholders who argued against expanding existing defences or creating new ones, suggesting instead that existing defences are adequate and can be applied flexibly in appropriate cases. These stakeholders argued that any expansion of defences would lead to more harm than good by allowing more false and damaging speech into circulation without consequence. They also argued that expanding defences would create uncertainty in the law and make it difficult for potential plaintiffs to know when they have a viable claim and when they do not – thus discouraging legitimate claims from being brought forward at all.
The law of defamation has had to adapt to the new reality of online communications. In some cases, the law has been slow to catch up with technology, and there is a need for reform in this area. For example, many jurisdictions have yet to recognize that an internet post can be defamatory even if it is not published in a traditional media outlet. Similarly, the practice of “libel tourism” (whereby a plaintiff brings a defamation action in a jurisdiction whose laws are more favourable to claimants) has become increasingly common as people are able to publish material online from anywhere in the world. There is also an ongoing debate about whether or not internet service providers should be held liable for content posted by their users.
In response to these challenges, many countries have adopted laws that provide greater protection for individuals against online defamation. These laws often include provisions that allow victims of online defamation to seek redress without having to go through lengthy and expensive court proceedings. For example, some countries have established specialized tribunals or other dispute resolution mechanisms that allow victims of online defamation to seek remedies such as removal of offending content or financial compensation without having to go through costly litigation. Other countries have adopted laws that require internet service providers and other intermediaries (such as search engines) to take down defamatory content upon request from the victim or upon order from a court or tribunal.
One potential solution is the development of a specialized tribunal or court to handle defamation cases. This would provide an accessible, efficient and cost-effective forum for resolving disputes. The tribunal could be empowered to issue injunctions, order removal of offending material, and award damages. It could also have the power to compel disclosure of anonymous publishers in appropriate cases. Such a system would provide a more effective way to address online defamation than the current civil justice system.
Another potential solution is the use of alternative dispute resolution (ADR) mechanisms such as mediation or arbitration. These processes are often quicker and less expensive than litigation, and can help parties reach mutually agreeable solutions without resorting to costly court proceedings. ADR can also be used to resolve disputes involving anonymous publishers by providing an avenue for parties to negotiate a settlement without having to identify themselves publicly.
Finally, governments should consider introducing legislation that provides for specific remedies for online defamation. This could include measures such as creating criminal penalties for maliciously publishing false information online, or allowing victims of online defamation to seek redress through civil actions against anonymous publishers. Such laws could help deter malicious behaviour while providing victims with an avenue for seeking justice in cases where traditional legal remedies are not available or practical.
The LCO recommends that the government should introduce a new statutory cause of action for serious online harms, including defamation. This would provide an alternative to the existing common law tort of defamation and allow for more effective remedies for those who have been harmed by online speech. The new cause of action should include a range of remedies, such as damages, injunctions and orders for removal or correction of content. It should also provide for a range of defences, including truth and public interest.
The LCO also recommends that the government should introduce a new statutory defence to defamation claims based on internet speech. This defence would be available to those who can demonstrate that they took reasonable steps to remove or correct defamatory content after being notified of its existence. This defence would help protect individuals from liability in cases where they are not responsible for creating or disseminating the defamatory content but are nonetheless liable under existing law.
Finally, the LCO recommends that the government should consider introducing a system of notice-and-takedown procedures for online speech similar to those used in other jurisdictions. Such a system could provide an efficient mechanism for addressing complaints about online speech without resorting to costly litigation.
IV. Seven Principles Guiding Defamation Law Reform
1. Promote freedom of expression: Defamation law should protect freedom of expression, including the right to criticize and debate matters of public interest.
2. Ensure access to justice: Defamation law should ensure that individuals have access to justice when their reputation is unjustly harmed.
3. Balance competing interests: Defamation law should balance the interests of those who wish to protect their reputations with those who wish to express themselves freely.
4. Provide clarity and certainty: Defamation law should provide clarity and certainty for all parties involved in defamation proceedings, including plaintiffs, defendants, lawyers, and judges.
5. Foster responsible communication: Defamation law should foster responsible communication by encouraging parties to consider the potential consequences of their statements before making them public.
6. Minimize costs and delays: Defamation law should minimize costs and delays associated with defamation proceedings so that individuals can seek redress in a timely manner without incurring excessive financial burden.
7. Respect privacy rights: Defamation law should respect the privacy rights of individuals by protecting them from unwarranted intrusions into their personal lives or affairs.
- Defamation Law Must Re-Balance Protection of Reputation and Freedom Of Expression In The Internet Age
The LCO has concluded that a new balancing of protection of reputation and freedom of expression is necessary to accommodate the broader diversity of publications and reputational harm to which defamation law must now respond.
 - Defamation Law Needs to Be Updated; Some Statutory Reforms Are Necessary
Defamation law must evolve coherently in response to new forms of communications made possible by the internet and other technological developments. The LCO has concluded that, where possible, the same rules should apply to all publications. Accordingly, the LCO believes that the LSA should be repealed and replaced with a new Defamation Act that establishes a modern legal framework for resolving defamation complaints in Ontario. The new Act should adopt a flexible and integrated approach to defamation law. Subject to specified exceptions, the LCO recommends against codifying the substantive elements of defamation law in the new statute.
 - Defamation Law Is Evolving; New Reforms Must Complement These Developments
The LCO has concluded that, with some exceptions, reworking the substantive elements of defamation law is unnecessary and may tend to destabilize the balance between freedom of expression and protection of reputation achieved by recent reforms. In our view, the primary problem in the law of defamation is the procedural barriers to access to justice in the internet era. Our recommendations focus mostly (but not entirely) on these barriers.
 - Access To Justice And Dispute Resolution Must Be Improved
The court process remains crucial to protect the important legal rights at stake in defamation law claims. However, as online defamation disputes among private individuals become more common, alternative dispute resolution mechanisms are needed. The LCO’s recommendations are designed to divert high volume, low value defamation claims away from the formal court system and to encourage informal, practical resolution of these claims.
 - Defamation Law Must Specifically Address Online Personal Attacks
Media law cases continue to represent a significant proportion of defamation law claims being heard by Ontario courts. However, defamation cases increasingly involve individuals publishing online personal attacks that do not engage the public interest. Where these attacks are directed at the complainant’s reputation, defamation law is engaged. Traditional defamation law principles and court processes developed to respond to media law cases often fall short when applied to online personal attacks. Defamation law in Ontario must do more to encourage informal resolution of these claims and to improve access to justice for those complainants bringing legal proceedings.
 - There Must Be New Obligations For Intermediary Platforms
Intermediary platforms are a crucial control mechanism for holding online publishers to account for defamatory content. The LCO has concluded that the new Defamation Act must include specific obligations directed at intermediary platforms that host third party content accessible to Ontario users. Our recommendations impose two distinct duties on intermediary platforms: the obligation to pass on notice of a defamation complaint to the publisher of the content, and the obligation to take down content subject to a defamation notice if the publisher of the content does not respond to the notice.
 - Defamation Law and Privacy Law Have Distinct Objectives and Should Remain Separate
The LCO has concluded that, notwithstanding overlapping principles and values in certain respects, defamation law and privacy law continue to be functionally distinct and should remain so.
V. Resolving Defamation Disputes in the Internet Age: An Overview of the LCO’s Proposed Reforms
The LCO recommends that the Ontario government create a new civil cause of action for online defamation. This would provide an accessible and affordable legal remedy to those who have been defamed online, while also providing appropriate protections for freedom of expression. The proposed cause of action should be tailored to address the unique challenges posed by online defamation, including the need to identify anonymous defendants, the potential for multiple defendants, and the need to consider international law.
The LCO also recommends that the government create a specialized tribunal to hear online defamation claims. This tribunal should be designed with an emphasis on accessibility and affordability, with streamlined procedures and low filing fees. It should also have jurisdiction over all types of online content, including social media posts, websites, blogs, and other digital platforms.
Finally, the LCO recommends that the government develop public education initiatives aimed at raising awareness about online defamation and its consequences. These initiatives should focus on educating both victims of online defamation as well as potential perpetrators about their rights and responsibilities under Ontario law.
Under the current law, a complainant who discovers defamatory content online has a limited set of practical options.
First, they may contact the website or platform hosting the defamatory content and request that it be removed. Depending on the platform, this may require submitting a formal complaint or takedown notice. This approach may succeed where the platform has a policy of removing defamatory content on request, but it may fail where no such policy exists or the platform refuses to act.
Second, they may contact the author of the content directly. If the author can be identified and reached, they may be willing to remove or correct the material and resolve the complaint informally.
Third, they may take legal action against the person who posted the defamatory content. This is often a lengthy, difficult and expensive process, and it may not always be successful. Depending on the circumstances, it may involve suing for defamation, seeking damages for harm caused by the content, or seeking an injunction to prevent further publication.
Finally, the complainant may seek advice from an experienced defamation lawyer in Ontario to determine which of these options, or any other remedies, are realistically available in their circumstances.
The LCO considered a range of dispute resolution models for such complaints, including mediation or arbitration by an independent third party, adjudication by specialized panels with expertise in defamation law, appeal mechanisms, and independent oversight to ensure that proceedings are conducted fairly and transparently, supported by clearer statutory standards for what constitutes defamation. Building on this analysis, the Final Report proposes three procedural streams for resolving defamation complaints:
- Notice and takedown
- Informal negotiation, potentially facilitated by online dispute resolution (ODR)
- Court action
The first stream, notice and takedown, is discussed in chapter nine of the Report. This option would arise where a complainant notifies an intermediary platform of allegedly defamatory material being hosted on the platform. The platform would pass the complaint on to the publisher. The publisher would have a short time to respond (the LCO recommends two days). If the publisher does not respond, the platform would be required to take down the offending content. If the publisher does respond, then both parties could pursue their claims through a defamation action or other legal process.
The third stream, court action, would be available to claimants who are unable to resolve their complaint through takedown or informal negotiation. The Report also considers whether this stream could be supplemented by a specialized tribunal or court with jurisdiction over online harms, with authority to hear complaints and issue orders for remedies such as damages, injunctions, and other forms of relief. Such a body could also provide guidance on the interpretation and application of the notice requirement discussed in chapter four.
The LCO recommends that the court action stream be supplemented by a new form of dispute resolution, which we refer to as an expedited tribunal. This tribunal would provide a more accessible and less costly alternative to court proceedings for lower value claims. The tribunal would have the power to order takedowns, but not damages or other remedies. It would be available only in cases where the defendant has consented to its jurisdiction and where both parties agree to waive their right to appeal.
A publisher who receives notice of a defamation complaint would have several options:
1. Remove the content in question;
2. Publish a correction or apology;
3. Negotiate informally with the complainant, or offer to enter into mediation;
4. Challenge the notice, in which case online content would remain online, and both online and offline publishers could be sued for defamation after four weeks had passed; or
5. Ignore the notice, in which case online publishers could have their content taken down by the hosting platform (subject to a putback option), and both online and offline publishers could be sued for defamation after four weeks had passed.
If the publisher refuses to take any action, the complainant may commence a defamation action in court. The court would consider all relevant evidence, including whether the publisher had received notice of the complaint and had taken appropriate steps in response, before deciding on liability and damages.
- The LCO expects most high volume, low value claims involving personal reputation and/or private publishers would more likely be resolved either by takedown or informal negotiations.
- These procedural streams are summarized in a chart in the Final Report, where they are explained in more detail.
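To make the interaction between these streams concrete, the following is a minimal sketch, in Python, of the notice-and-takedown flow described above. It is purely illustrative: the function and class names are invented for the example, and the two-day response window and four-week waiting period are the figures suggested in the Report rather than settled law.

```python
from datetime import datetime, timedelta
from enum import Enum

# Illustrative periods taken from the Report's discussion: a short window
# (two days is the LCO's suggestion) for the publisher to respond, and a
# four-week waiting period before a defamation action may be commenced.
PUBLISHER_RESPONSE_WINDOW = timedelta(days=2)
PRE_ACTION_WAITING_PERIOD = timedelta(weeks=4)

class Outcome(Enum):
    TAKEN_DOWN = "content taken down by platform (putback available to publisher)"
    REMAINS_ONLINE = "content remains online; complainant may sue after the waiting period"

def resolve_notice(served_on_platform: datetime,
                   publisher_responded_at: datetime | None) -> Outcome:
    """Model the platform's obligations once a defamation notice is served.

    The platform passes the notice on to the publisher. If the publisher does
    not respond within the response window, the platform must take the content
    down (subject to a putback option). If the publisher responds in time, for
    example by challenging the notice, the content stays online and the dispute
    moves to informal negotiation or, later, a court action.
    """
    deadline = served_on_platform + PUBLISHER_RESPONSE_WINDOW
    if publisher_responded_at is None or publisher_responded_at > deadline:
        return Outcome.TAKEN_DOWN
    return Outcome.REMAINS_ONLINE

def earliest_court_action(served_on_platform: datetime) -> datetime:
    """Earliest date a defamation action could be commenced after notice."""
    return served_on_platform + PRE_ACTION_WAITING_PERIOD

if __name__ == "__main__":
    served = datetime(2020, 3, 1)
    print(resolve_notice(served, None))                          # Outcome.TAKEN_DOWN
    print(resolve_notice(served, served + timedelta(days=1)))    # Outcome.REMAINS_ONLINE
    print(earliest_court_action(served))                         # 2020-03-29
```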
VI. Summary of Final Report Recommendations
Defamation law is a complex area of the law, and the LCO’s project does not attempt to provide a comprehensive review of all aspects of defamation law. Rather, it focuses on those elements that are particularly relevant to online content and technology. These include:
1. The definition of defamation: A statement is defamatory if it tends to lower another person’s reputation in the eyes of a reasonable person. At common law, falsity is presumed once a statement is shown to be defamatory, and it falls to the defendant to prove that the statement is true.
2. The defences available in defamation cases: Defences such as truth, honest opinion and privilege can be used by defendants in defamation cases to protect themselves from liability for defamatory statements they have made.
3. The remedies available in defamation cases: Remedies such as damages, injunctions and apologies can be sought by plaintiffs in defamation cases to compensate them for any harm caused by defamatory statements made against them.
4. The impact of technology on defamation law: Technology has had a significant impact on the way in which information is disseminated and consumed, and this has had an effect on how courts interpret and apply defamation law. This includes issues such as the application of the single publication rule, the availability of statutory immunities for online intermediaries, and the use of algorithms to identify potentially defamatory content online.
5. The role of public figures in defamation cases: Public figures often face greater scrutiny than private individuals when it comes to their reputations, and this can affect how courts interpret and apply defamation law when dealing with claims brought by public figures.
The common law has been developed through court decisions, which have established the elements of a defamation claim and the defences available to defendants.
The elements of a defamation claim in Ontario are: (1) the words complained of are defamatory, in the sense that they would tend to lower the plaintiff’s reputation in the eyes of a reasonable person; (2) the words in fact referred to the plaintiff; and (3) the words were published, meaning they were communicated to at least one person other than the plaintiff. Once these elements are established, falsity and damage are presumed.
The defences available to defendants include truth, fair comment, responsible communication on matters of public interest, absolute privilege, qualified privilege, and innocent dissemination.
Truth is an absolute defence to a defamation claim. If the defendant can prove that the statement they made is true, then they cannot be held liable for defamation.
Fair comment is another defence available to defendants. This defence applies when the statement is a comment or opinion on a matter of public interest, is based on facts that are true or protected by privilege, and is one that a person could honestly express on those facts. The defence is defeated if the comment was made with malice.
Responsible communication on matters of public interest is another defence available to defendants in Ontario. This defence, recognized by the Supreme Court of Canada in Grant v. Torstar, applies when the publication is on a matter of public interest and the defendant was diligent in trying to verify the allegations, having regard to all the relevant circumstances.
Absolute privilege is another defence available to defendants in Ontario. This defence applies when statements are made during certain proceedings such as court proceedings or parliamentary debates where absolute immunity from liability for defamation exists.
Qualified privilege is another defence available to defendants in Ontario. This defence applies when the person making the statement has a legal, social or moral duty or interest in making it and the recipient has a corresponding duty or interest in receiving it, such as an employer providing an employment reference. The privilege is defeated if the statement was made with malice or exceeded the scope of the occasion.
Innocent dissemination is another defence available to defendants in Ontario. This defence protects subordinate distributors, such as booksellers, libraries and news vendors, who disseminate material without knowledge of its defamatory content, where nothing should have alerted them to that content and their lack of knowledge was not due to negligence.
However, we do recommend a number of reforms to the procedural elements of defamation law. These reforms are intended to reduce the cost and complexity of defamation proceedings, while still providing adequate protection for reputation. The proposed reforms include:
• Introducing a “notice and take-down” procedure for online content;
• Establishing a new cause of action for “malicious falsehood”;
• Allowing defendants to make early offers to settle claims;
• Limiting the scope of discovery in defamation proceedings; and
• Establishing a new statutory defence for responsible communication on matters of public interest.
The chapter also considers the implications of the Supreme Court of Canada’s decision in Grant v. Torstar for online defamation law in Ontario, as well as the potential impact of the proposed new Defamation Act.
The LCO heard from a range of stakeholders that the current notice regime is overly restrictive and can lead to injustice. The LCO also heard that the notice requirement is not well understood by potential claimants, and that it can be difficult for them to obtain legal advice in time.
In response, the LCO recommends introducing a new notice regime which would allow for more flexibility in terms of when notice must be given. This would include allowing for an extension of time where appropriate, and providing greater clarity on what information must be included in the notice. The LCO also recommends introducing a limitation period for claims arising out of libel in newspapers or broadcasts, which would provide greater certainty to both claimants and defendants.
The LCO heard that the notice requirement should be reformed to better reflect the realities of modern media publishing. The LCO also heard that the notice requirement should be simplified and streamlined, with a focus on providing clear guidance to both plaintiffs and publishers. Suggestions for reform included: reducing the time period for giving notice; allowing for electronic delivery of notices; requiring more detailed information in notices; and introducing a “notice-and-take-down” system.
Ultimately, the LCO concluded that reform of the notice requirement is necessary to ensure that it is effective in protecting both plaintiffs and publishers. The LCO recommended that Ontario consider introducing a “notice-and-take-down” system, which would require publishers to remove allegedly defamatory content upon receipt of a valid notice from a plaintiff. This would provide an efficient mechanism for resolving disputes without resorting to litigation.
The LCO recommends that the new notice regime should be based on a “notice and take down” model. This would require publishers to remove defamatory content upon receipt of a notice from an affected party. The notice should include information about the allegedly defamatory material, the identity of the complainant, and contact information for both parties. The publisher should then have a reasonable period of time to investigate the complaint before taking action. If it is determined that the content is indeed defamatory, it should be removed or amended as appropriate. If not, no action should be taken.
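Purely as an illustration, the content requirements described above could be modelled in a simple data structure such as the following Python sketch. The field names and the completeness check are assumptions made for the example; the Report does not prescribe any technical format for notices.

```python
from dataclasses import dataclass

@dataclass
class DefamationNotice:
    """A hypothetical representation of a notice under a notice-and-takedown model."""
    complainant_name: str
    complainant_contact: str
    publisher_contact: str
    content_location: str          # e.g. the URL of the allegedly defamatory material
    statement_complained_of: str   # the words alleged to be defamatory
    reasons: str                   # why the complainant says the statement is defamatory

    def missing_fields(self) -> list[str]:
        """Return the names of any required fields left empty."""
        return [name for name, value in vars(self).items() if not str(value).strip()]

    def is_complete(self) -> bool:
        """A notice would only be actionable if every required field is provided."""
        return not self.missing_fields()
```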
The LCO also recommends that any new notice regime should include provisions for dispute resolution and penalties for non-compliance. This could include mediation services or other forms of alternative dispute resolution, as well as fines or other sanctions for those who fail to comply with the notice requirements.
Finally, the LCO believes that any new notice regime must be accompanied by public education initiatives to ensure that all stakeholders understand their rights and responsibilities under the law. This could include public awareness campaigns, educational materials, and training programs for publishers and other stakeholders in order to ensure compliance with the new rules.
The LCO recommends that the notice requirement should apply to all persons who publish or republish defamatory material, including individuals, corporations, and online intermediaries. The LCO also recommends that the notice period should be at least 30 days, and that a complainant should not be required to give notice if they can demonstrate that it would be futile or impossible to do so.
The LCO further recommends that intermediary platforms should have an obligation to pass on defamation notices to anonymous online publishers. This could include a requirement for platforms to use reasonable efforts to identify anonymous publishers and provide them with the notice. The LCO also suggests that intermediary platforms should have an obligation to take down defamatory content within a certain time frame after receiving a valid defamation notice.
The LCO recommends that Ontario adopt a single publication rule, with a limitation period of two years from the date of publication. This would provide complainants with sufficient time to bring their claims while protecting publishers from stale claims. The LCO also recommends that the limitation period should be extended in cases where the defamatory material is republished or re-broadcast.
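The practical effect of a single publication rule on limitation periods can be illustrated with a short sketch. This is a simplified model under stated assumptions: a two-year limitation period as recommended, a clock that runs from first publication rather than from each time the content is accessed, and a restart only on a genuine republication or re-broadcast.

```python
from datetime import date, timedelta

LIMITATION_PERIOD = timedelta(days=2 * 365)  # two years, per the recommendation

def limitation_deadline(first_published: date,
                        republications: list[date] | None = None) -> date:
    """Latest date a claim could be started under a single publication rule.

    Repeated access to the same online content does not restart the clock;
    only a genuine republication or re-broadcast does.
    """
    start = first_published
    for r in (republications or []):
        if r > start:
            start = r
    return start + LIMITATION_PERIOD

# Example: content first posted 1 June 2018 and re-broadcast 1 June 2019
# could ground a claim until roughly 31 May 2021, regardless of how often it was viewed.
print(limitation_deadline(date(2018, 6, 1), [date(2019, 6, 1)]))
```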
In Ontario, the court has a number of preliminary motions available to plaintiffs and defendants in defamation actions. These motions are designed to provide an efficient and effective way for parties to resolve disputes without having to go through a full trial.
For plaintiffs, the most common motion is an application for an interlocutory injunction. This motion seeks an order from the court that requires the defendant to cease publishing or broadcasting any defamatory material until the case is resolved. The court will consider factors such as whether there is a serious issue to be tried, whether damages would be an adequate remedy, and whether there is a balance of convenience between the parties.
Defendants may also make use of preliminary motions in defamation actions. For example, they may bring a motion for summary judgment if they believe that there is no genuine issue for trial and that they should be granted judgment as a matter of law. Alternatively, defendants may bring a motion to strike out pleadings if they believe that the plaintiff’s claim is frivolous or vexatious or otherwise fails to disclose a reasonable cause of action.
Finally, both plaintiffs and defendants may bring motions for security for costs if they believe that their opponent does not have sufficient funds or assets to pay any costs awarded against them at trial.
These preliminary motions can help streamline proceedings in defamation cases by allowing parties to quickly resolve issues before going through with a full trial. They can also help ensure access to justice by providing courts with tools to protect both plaintiffs’ reputations and defendants’ freedom of expression in an accelerated environment.
Three preliminary motions arise frequently: (1) motions to strike pleadings; (2) motions for summary judgment; and (3) motions for particulars.
1. Motions to Strike
A motion to strike is used in defamation actions when a pleading discloses no reasonable cause of action or defence, or is frivolous, vexatious or an abuse of process. The court will consider whether the facts pleaded, if proven, could entitle the party to the relief claimed. If not, the court will grant the motion and strike the pleading, with or without leave to amend.
2. Motions for Summary Judgment
Motions for summary judgment are used in defamation actions when there is no genuine issue requiring a trial. The court will consider whether the record allows it to make the necessary findings of fact and apply the law fairly and justly without a trial. If so, the court may grant judgment in favour of one party without a full trial.
3. Motions for Particulars
A motion for particulars is used in defamation actions when the defendant believes they cannot adequately respond to the allegations because the pleading is too vague or ambiguous. The court will consider whether the pleading contains enough detail to permit a meaningful response. If not, it may order the plaintiff to provide additional particulars before the action proceeds.
The Final Report focuses in particular on three motions of special importance in defamation actions:
- anti-SLAPP motions;
- motions for interlocutory injunctions; and
- Norwich motions.
The motions allow the parties to narrow down the issues in dispute and focus on the key elements of their case. This can help to reduce costs, speed up proceedings and ensure that both parties have a fair opportunity to present their case. Additionally, these motions can help to ensure that only relevant evidence is presented in court, which can help to avoid unnecessary delays and confusion.
The purpose of the legislation is to protect individuals from being sued for expressing their opinions or engaging in public participation. It does this by allowing defendants to bring a motion to dismiss a lawsuit at an early stage. To succeed, the defendant must show that the proceeding arises from expression relating to a matter of public interest; the burden then shifts to the plaintiff to show that the claim has substantial merit, that the defendant has no valid defence, and that the harm suffered outweighs the public interest in protecting the expression. If the motion is successful, the lawsuit is dismissed and the defendant is generally entitled to recover the legal costs of defending it. This provides an effective deterrent against SLAPP suits, which are often used as a tool to silence critics or opponents.
The legislation has been successful in achieving its goal of protecting freedom of expression while still allowing plaintiffs to pursue legitimate claims for defamation. Since its introduction, there have been numerous cases where anti-SLAPP motions have been granted and lawsuits dismissed at an early stage. This has resulted in fewer costly and time-consuming trials, as well as providing greater protection for those who engage in public discourse on matters of public interest.
The two-part test for granting interlocutory takedown orders should include:
1. A threshold requirement that the plaintiff demonstrate a strong prima facie case of defamation; and
2. A balancing of the public interest in taking down the expression against the public interest in preserving the defendant’s freedom of expression.
In assessing the public interest, courts should consider factors such as:
• The seriousness of the alleged harm to reputation;
• The likelihood of success on the merits;
• The extent to which the expression is likely to cause irreparable harm if not taken down;
• The extent to which taking down or blocking access to the expression would be effective in preventing or mitigating harm; and
• The impact on freedom of expression, including whether there are other less restrictive means available for addressing any potential harm.
We recommend that the court should be able to order a Norwich party to provide contact information for the anonymous publisher, such as an email address or other means of communication. This would allow the plaintiff to communicate with the anonymous publisher and potentially resolve the dispute without further litigation.
We also recommend that courts should consider ordering a Norwich party to provide additional information about the anonymous publisher, such as their IP address or other identifying information, if it is necessary for the plaintiff to pursue their claim. This would ensure that plaintiffs are not deprived of their right to seek justice simply because they cannot identify an anonymous defendant.
The LCO recommends that the court system should be more accessible to those seeking to bring online defamation claims. The LCO suggests that the court system should provide a streamlined process for these cases, including simplified pleadings and procedures, as well as expedited hearings and decisions. The LCO also recommends that the court system should provide an alternative dispute resolution (ADR) option for online defamation cases. This would allow parties to resolve their disputes without having to go through the full court process.
The LCO further recommends that the court system should recognize jurisdiction over multi-jurisdictional defamation cases, and should apply choice of law principles to determine which jurisdiction’s law will apply in such cases. The LCO also suggests that corporations should have standing to sue for online defamation, provided they can demonstrate a sufficient connection between the alleged defamatory statement and their business interests.
Finally, the LCO recommends several procedural reforms designed to reduce costs and delay in online defamation cases. These include allowing jury trials in appropriate cases; expanding small claims courts’ jurisdiction over these disputes; providing guidance on evidence requirements; and eliminating unnecessary provisions of the Libel and Slander Act (LSA).
The Supreme Court of Canada’s majority analysis in Haaretz.com v. Goldhar established a two-stage test for determining jurisdiction in multi-state defamation actions. The first stage requires the court to determine whether there is a “real and substantial connection” between the subject matter of the litigation and the forum state, Ontario in this case. If such a connection exists, then the court must move to the second stage, which involves considering any rebuttal factors that may suggest that another forum is more appropriate.
In addition to the rebuttal factors outlined by the Supreme Court, we recommend that courts consider whether the publication was targeted at an Ontario audience when determining jurisdiction in multi-state defamation actions. This factor would provide additional guidance to courts when assessing whether Ontario is an appropriate forum for adjudicating a particular dispute.
On the choice of law issue, we agree with Abella and Wagner JJ.’s conclusion that the “most substantial harm” test achieves the best balance between protection of reputation and freedom of expression. Under this test, courts apply the law of the jurisdiction in which the plaintiff has suffered, or stands to suffer, the most substantial harm to their reputation from the defamatory statement or publication. This approach allows for greater flexibility than other tests while still adequately protecting both reputation and freedom of expression.
The LCO recommends that all corporations should be able to sue for defamation in Ontario, regardless of size. This would ensure that small businesses are not disadvantaged when it comes to seeking redress for defamatory statements. The LCO also recommends that the rules of court should be amended to provide a simplified procedure for corporations to bring defamation claims. This would reduce the cost and complexity associated with bringing such claims, making it easier for small businesses to access justice.
In Chapter Seven, we recommend that the Ontario government enact legislation to impose a duty on intermediary platforms to take reasonable steps to facilitate access to justice in online defamation disputes. This duty should include a requirement for platforms to provide users with information about their rights and remedies, as well as an obligation to respond promptly and appropriately when notified of potential defamatory content. We also recommend that the legislation create a process for resolving online defamation disputes through mediation or arbitration, with the platform acting as an impartial facilitator.
In Chapter Eight, we propose a modern process for resolving online defamation disputes that is based on principles of fairness and efficiency. This process would involve three stages: (1) notification; (2) negotiation; and (3) adjudication. At each stage, the platform would be responsible for providing users with information about their rights and remedies, as well as facilitating communication between parties. The proposed process would also allow parties to seek legal advice at any stage of the dispute resolution process. Finally, we recommend that the proposed process be subject to independent oversight by an ombudsman or other third-party body.
The LCO’s Final Report proposes a new legal framework for intermediaries as publishers. This framework would be based on the principle of “responsible communication” and would require intermediaries to take proactive steps to address online defamation. Specifically, the LCO recommends that intermediaries should: (1) provide users with clear information about their rights and responsibilities when posting content; (2) develop effective complaint mechanisms; (3) respond promptly to complaints; and (4) take appropriate action in response to complaints. The LCO also recommends that the government should create an independent body to oversee the implementation of these measures.
The proposed legal framework for intermediaries as publishers is an important step forward in addressing online defamation. It provides a more balanced approach than existing common law principles, which are overly reliant on costly court proceedings. By requiring intermediaries to take proactive steps to address online defamation, this framework will help ensure that users are aware of their rights and responsibilities when posting content, and that disputes can be resolved quickly and effectively.
The LCO also recommends that the government consider introducing a new tort of “intermediary liability” to provide additional remedies for users who suffer harm as a result of content posted on intermediary platforms. This tort would allow users to seek damages from platforms for failing to take reasonable steps to prevent or remove defamatory content. The LCO believes that this approach would provide an effective balance between protecting freedom of expression and providing meaningful remedies for those harmed by online defamation.
1. The need to ensure that the LCO’s recommendations are evidence-based and reflect best practices;
2. The need to ensure that the LCO’s recommendations are practical, achievable and cost-effective;
3. The need to ensure that the LCO’s recommendations are consistent with the values of fairness, access, equity, inclusion and respect for diversity;
4. The need to ensure that the LCO’s recommendations are informed by a broad range of stakeholders, including those who have been traditionally underrepresented in decision-making processes;
5. The need to ensure that the LCO’s recommendations are responsive to changing circumstances and emerging trends; and
6. The need to ensure that the LCO’s recommendations are mindful of existing resources and capacity constraints.
In developing its recommendations, the LCO sought to balance:
- freedom of expression;
- protection of reputation;
- access to justice for both complainants and publishers; and
- technological innovation and corporate social responsibility.
The LCO’s proposed notice and takedown regime is designed to provide a more efficient, effective, and accessible mechanism for resolving online defamation disputes. It will allow individuals to have allegedly defamatory content removed from the internet quickly and easily, without having to resort to costly litigation. The regime will also give platforms an incentive to address online defamation proactively by providing them with immunity from liability for hosting defamatory content if they comply with the notice and takedown process. Finally, it will respect freedom of expression by requiring complainants to set out in their notice the basis on which they allege the content is defamatory before it can be removed.
Reforming the common law doctrine of publication to better reflect the realities of the online context could involve a number of different approaches. One approach would be to limit liability for publishers to those who are directly responsible for communicating a defamatory message, such as by creating or editing it. This would exclude those who merely repeat, republish, endorse or authorize it, or in some other way participate in its communication. Another approach would be to create a more nuanced standard of liability that takes into account factors such as the degree of involvement and control that an individual has over the content being communicated. This could include considerations such as whether they had knowledge of the content prior to its publication, whether they had any editorial control over it, and whether they took steps to verify its accuracy before publishing it. Finally, another approach would be to create a safe harbor provision that provides immunity from liability for certain types of online publishers, such as social media platforms and search engines. Such a provision could provide immunity from liability so long as these entities take reasonable steps to remove defamatory content when notified about it.
The traditional definition of publication needs to be re-examined in the internet era in order to ensure that freedom of expression is protected, technological innovation is encouraged, and corporate social responsibility is promoted. This can be done by creating a new legal framework that recognizes the unique nature of online intermediaries and provides them with appropriate protections from liability for third party content. Such a framework should also provide clear guidance on when an intermediary will be considered a publisher, and what responsibilities they have in terms of monitoring and removing illegal content.
This recommendation would provide greater clarity and certainty to the law of defamation, while also protecting intermediaries from liability for content they did not create or intend to publish. It would also ensure that those who do intend to publish are held accountable for their actions. This approach would balance the need to protect free speech with the need to protect individuals from harm caused by defamatory statements.
This would mean that intermediaries would be required to take down any content that is found to be defamatory, and the individual posting the content would be liable for any damages resulting from it.
In other cases, intermediaries may be unaware of the manipulation of third party content and, therefore, not liable as publishers. However, they may still attract liability as publishers if they are found to have knowingly facilitated or enabled the manipulation of that content.
This recommendation is based on the principle that a publisher should not be liable for republication by a third party unless they have taken some action to encourage or facilitate it. This would ensure that publishers are held accountable for their own publications, and not those of others. It would also provide greater clarity and certainty in the law, as well as reduce the potential for over-broad liability.
The notice and takedown process would require the platform to remove allegedly defamatory content within a certain time period, usually 48 hours. The platform would then notify the user who posted the content of the removal and provide them with an opportunity to respond. If the user does not respond, or if their response is deemed inadequate, then the platform must keep the content removed for a period of time.
The LCO also proposes that platforms should have a dispute resolution process in place to allow users to challenge takedowns. This could involve providing users with an opportunity to submit evidence or arguments in support of their position that the content is not defamatory. Platforms should also provide users with information about how they can seek legal advice or assistance from third parties such as online dispute resolution services.
Finally, platforms should have procedures in place for dealing with repeat offenders who repeatedly post defamatory content. These could include suspending or terminating accounts, or imposing other restrictions on access to their services.
ISPs and search engines may also be subject to other laws, such as copyright laws, that require them to take down certain content.
If the response is unsatisfactory, the complainant could then seek a court order requiring the platform to take down the content. The platform would be required to comply with this order and take down the content.
The notice of complaint would include information about the complainant’s rights, the process for filing a complaint, and the consequences of filing a false or frivolous complaint. It would also provide guidance on how to respond to a notice of complaint.
The notice and takedown process would be overseen by an independent body, comparable in role to the Australian Communications and Media Authority (ACMA) in Australia. This body would have the power to investigate complaints, issue notices, and mediate disputes.
The proposed system would also provide for a “counter-notice” process, whereby content publishers could challenge a notice if they believe it is unjustified. The counter-notice process would be similar to that used in the US Digital Millennium Copyright Act (DMCA).
Finally, the proposed system would include a “safe harbour” provision, which would protect intermediaries from liability for allegedly defamatory content they host or link to, so long as they comply with the notice and takedown process. This provision is designed to encourage online service providers to take proactive steps to remove or disable access to such content when notified of it.
The intermediary platform would be prohibited from assessing the legality of a defamation notice or a response, and takedown would not create any inference or otherwise impact a future judicial determination about whether content is defamatory. The intermediary platform must provide an opportunity for the user to respond to the notice before taking down the content. The user should also be provided with information on how to seek legal advice and/or dispute the notice. The intermediary platform must also provide clear guidance on how users can submit counter-notices if they believe their content was taken down in error.
The takedown process would involve the platform receiving a complaint from a complainant, verifying the complaint, and then taking down the content in question. The platform would also be required to notify the publisher of the takedown and provide an opportunity for them to respond or appeal. The platform would also be required to keep records of all takedowns and complaints.
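Purely as an illustration of how a hosting platform might operationalize the workflow and record-keeping obligations described above, the following sketch models a complaint record and its state changes. It is not part of the LCO’s recommendations; the class, field and status names are hypothetical, and the 48-hour figure is simply the removal window mentioned earlier in this summary.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta
from enum import Enum, auto
from typing import List


class ComplaintStatus(Enum):
    RECEIVED = auto()            # complaint submitted by the complainant
    CONTENT_REMOVED = auto()     # content taken down or disabled
    PUBLISHER_NOTIFIED = auto()  # publisher informed and invited to respond
    RESPONSE_RECEIVED = auto()   # publisher has responded or appealed
    CLOSED = auto()


@dataclass
class TakedownRecord:
    """One record per complaint, supporting the record-keeping obligation."""
    complaint_id: str
    content_url: str
    complainant_contact: str
    received_at: datetime
    status: ComplaintStatus = ComplaintStatus.RECEIVED
    history: List[str] = field(default_factory=list)

    def log(self, event: str) -> None:
        # Each step is logged so the platform can later show what it did and when.
        self.history.append(f"{datetime.utcnow().isoformat()} {event}")


def process_complaint(record: TakedownRecord, removal_window_hours: int = 48) -> datetime:
    """Apply the takedown steps without assessing whether the content is defamatory."""
    removal_deadline = record.received_at + timedelta(hours=removal_window_hours)
    record.status = ComplaintStatus.CONTENT_REMOVED
    record.log("content removed or access disabled")
    record.status = ComplaintStatus.PUBLISHER_NOTIFIED
    record.log("publisher notified of the takedown and invited to respond or appeal")
    return removal_deadline  # latest time by which removal must occur
```

The point of the sketch is simply that the platform’s obligations are procedural: it records, removes, notifies and keeps a history, without making any judgment about whether the content is in fact defamatory.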
The amount of damages would be determined by the court based on factors such as the size of the platform, its financial resources, and the extent to which it failed to comply with its notice and takedown obligations. The damages could also be increased if the platform had a history of failing to comply with its notice and takedown obligations.
The LCO also recommends that the notice and takedown obligations should be limited to allegedly defamatory content, as defined in the legislation. The legislation should provide a clear definition of the content to which the regime applies and should include specific examples. It should also provide guidance on how to determine whether a particular piece of content falls within the regime.
The court process is often too slow and expensive to be a viable option for many people who have been defamed online. A takedown remedy allows individuals to quickly remove the offending content from the internet, thereby mitigating any further damage that may be caused by the defamation.
Takedown remedies can also help reduce the number of low-value defamation complaints that are filed in courts. By providing an alternative avenue for resolving such disputes, it can help reduce the burden on courts and encourage parties to resolve their disputes without resorting to costly litigation. Additionally, takedown remedies can provide a more efficient way of dealing with high-volume/low-value defamation complaints, as they allow for quick resolution without having to go through lengthy court proceedings.
The LCO’s notice and takedown proposal has been designed to be flexible enough to accommodate different legal systems, cultures and values. We have taken into account the fact that different countries may have different approaches to intermediary liability, defamation law and platform governance. For example, some countries may have more stringent requirements for notice and takedown than others. We have also considered the need for platforms to be able to respond quickly and effectively to complaints of online defamation in order to protect their users from harm.
At the same time, we recognize that there is a need for greater international harmonization of laws governing online content. This is particularly true when it comes to intermediary liability regimes, which can vary significantly from country to country. To this end, we believe that our proposal could serve as a model for other jurisdictions looking to develop their own notice and takedown regimes.
Finally, we believe that our proposal could help foster greater dialogue between stakeholders in different countries on how best to address online defamation in an increasingly globalized world. By providing a framework for discussion on these issues, we hope that our proposal will contribute towards finding solutions that are both effective and respectful of freedom of expression rights.
The LCO’s proposed notice and takedown regime is intended to provide a more efficient and effective way for individuals to address alleged defamatory content online. It would require intermediaries to take down or block access to allegedly defamatory content upon receipt of a valid notice from an affected individual. The regime would also provide for a process for the intermediary to challenge the validity of the notice, as well as a process for the affected individual to challenge any decision by the intermediary not to take down or block access to the content.
The LCO recommends that the government explore ODR as a means of further improving access to justice in online defamation disputes. The LCO suggests that the government consider the following:
1. Establishing an independent, impartial and transparent ODR system for resolving online defamation disputes;
2. Developing guidelines for how ODR should be used in online defamation disputes;
3. Ensuring that any ODR system is accessible to all parties involved in an online defamation dispute;
4. Exploring ways to ensure that any resolution reached through ODR is legally binding on all parties; and
5. Investigating ways to make ODR more cost-effective and efficient than traditional court proceedings.
The LCO also recommends that the government consider whether any form of compensation should be available to those who successfully resolve their disputes through ODR, such as reimbursement of legal costs or other forms of damages. Finally, the LCO suggests that the government explore ways to incentivize parties to use ODR rather than traditional court proceedings, such as providing access to free or reduced-cost legal advice or representation for those who choose to use ODR instead of litigation.
The LCO has, however, identified a number of features that any ODR system should have in order to be effective. These include: (1) an independent and impartial dispute resolution body; (2) clear rules and procedures for the adjudication of disputes; (3) access to legal advice and representation; (4) a transparent process for decision-making; (5) enforceable decisions; and (6) appropriate remedies. The LCO also recommends that any ODR system should be designed to ensure that it is accessible, affordable, efficient and timely.
VII. Project Organization
The scope of the project includes:
• the current state of defamation law in Ontario and its application to online speech;
• the legal responsibilities of internet intermediaries;
• access to justice for both complainants and publishers; and
• the relationship between defamation law and other forms of regulation, such as self-regulation and industry codes of conduct.
We considered the potential for these alternative forms of regulation to supplement or replace traditional legal remedies, including defamation law; however, we did not undertake a detailed examination of their efficacy or effectiveness. The project also addressed the following issues:
1. The scope of the tort of defamation and whether it should be limited or expanded;
2. The defences available to a defendant in a defamation action;
3. The remedies available to a plaintiff in a defamation action, including damages and other forms of relief;
4. The role of the court in determining the truth or falsity of statements made in a defamation action;
5. The impact of new technologies on the law of defamation;
6. The impact of freedom of expression considerations on the law of defamation; and
7. Other issues related to the development and reform of the law of defamation.
- The law of defamation in Ontario today and its limitations;
- How the legal, technological, and social landscape of the early 21st century influences and challenges “traditional” defamation law;
- A consideration of the legal elements of defamation in light of “internet speech”;
- Access to justice in defamation matters;
- Privacy and its relationship to defamation;
- Internet intermediary liability; and,
- Alternative dispute resolution.
This report provides a summary of the feedback we received from stakeholders in response to our Consultation Paper. We have considered all of the comments and suggestions that were submitted, and have used them to inform our recommendations. We have also taken into account relevant case law, legislation, and policy documents.
The report is divided into three sections: (1) an overview of the consultation process; (2) a summary of the feedback we received; and (3) our recommendations for reform.
In the first section, we provide an overview of the consultation process, including details on how we conducted it and who participated. In the second section, we summarize the feedback we received from stakeholders in response to our Consultation Paper. This includes both general comments as well as specific suggestions for reform. Finally, in the third section, we present our recommendations for reform based on this feedback.
We hope that this report will be useful to policymakers and other stakeholders as they consider potential reforms to existing laws and policies related to this issue.
The LCO consulted broadly over the course of the project. Consultees included:
- Defamation and technology law experts, practitioners, judges, government representatives and legal organizations throughout Ontario;
- Experts, practitioners and government representatives from other Canadian provinces, the United States, England, Ireland, Scotland, European Union and Australia;
- Individuals directly affected by defamation law, including complainants involved in defamation disputes, young people, traditional and new media organizations and internet-based companies.
The consultation process was conducted in two stages. The first stage involved a series of public consultations, including interviews, focus groups, written submissions, conferences and speaking engagements. The second stage involved the formation of a working group of defamation law experts to review the feedback from the public consultations and develop recommendations for reform.
The LCO commissioned a series of expert issue papers to support its analysis:
- Emily B. Laidlaw, Are We Asking Too Much From Defamation Law? Reputation Systems, ADR, Industry Regulation and other Extra-Judicial Possibilities for Protecting Reputation in the Internet Age.
- David Mangan, The Relationship between Defamation, Breach of Privacy and Other Legal Claims Involving Offensive Internet Content.
- Karen Eltis of the Faculty of Law, University of Ottawa, Is “Truthtelling” Decontextualized Online Still Reasonable? Restoring Context to Defamation Analysis in the Digital Age.
- Emily B. Laidlaw and Hilary Young, Internet Intermediary Liability in Defamation: Proposals for Statutory Reform.
- Jane Bailey and Valerie Steeves, Co-Leaders of the eQuality Project, University of Ottawa, Defamation Law in the Age of the Internet: Young People’s Perspectives.
The issue papers are available on the LCO’s project website (https://www.lco-cdo.org/en/defamation-law). This report is informed by each of these papers and we are indebted to the authors for their in-depth analysis. We recommend reading these papers directly for further context underlying this Final Report.
The LCO also organized and took part in a number of public events during the project, including:
- A panel at RightsCon 2018 on intermediary responsibility for defamatory online content, consisting of experts from Canada, United States, Belgium, France and Australia.
- An international conference on defamation law reform in partnership with Professors Jamie Cameron and Hilary Young. The day involved five panel discussions, 21 speakers and approximately 150 registrants. The conference agenda is available at our website.11 The conference resulted in a special issue of the Osgoode Hall Law Journal.12
VIII. Advisory Committees, Funding and Support
- Dan Burnett, Owen Bird Law Corporation
- Jamie Cameron, Osgoode Hall Law School
- Peter Downard, Fasken Martineau DuMoulin
- Kathy English, The Toronto Star
- David Fewer, Samuelson-Glushko Canadian Internet Policy & Public Interest Clinic
- John D. Gregory, Retired General Counsel, Ministry of the Attorney General
- Emily Laidlaw, University of Calgary, Faculty of Law
- Brian MacLeod Rogers, Ontario lawyer
- The Honourable Wendy Matheson, Superior Court of Justice of Ontario
- Roger McConchie, British Columbia lawyer
- Tom McKinlay, General Counsel, Crown Law Office – Civil, Ministry of the Attorney General
- Julian Porter, Q.C., Ontario lawyer
- David Potts, Ontario lawyer
- The Honourable Paul Schabas, Superior Court of Justice of Ontario
- Andrew Scott, London School of Economics
- Joanne St. Lewis, University of Ottawa, Faculty of Law
- Hilary Young, University of New Brunswick, Faculty of Law
As described above, the LCO also convened a working group of defamation law experts to review the feedback from the public consultations and to assist in developing recommendations for reform. The members of the working group were:
- John Gregory, Retired General Counsel, Ministry of the Attorney General
- Brian MacLeod Rogers, Ontario lawyer
- Sean Moreman, Senior Legal Counsel, Canadian Broadcasting Corporation
- David Potts, Ontario lawyer
- Brian Radnoff, Dickinson Wright LLP
IX. Acknowledgements
The LCO would like to thank the following individuals and organizations for their support of this project:
– The Law Foundation of Ontario, which provided generous funding for the project.
– The Social Sciences and Humanities Research Council of Canada, which provided additional funding for the international conference.
– The University of Toronto Faculty of Law, which hosted the international conference.
– The Canadian Bar Association, Ontario Branch, which provided financial support for the project.
– All of the speakers at the international conference who shared their expertise and insights with us.
– All of the participants in our consultations who shared their experiences and perspectives with us.
X. Next Steps and How to Get Involved
There are several ways to learn more about this project and to share your views. You can:
- Learn about the project on our project website13;
- Contact us to ask about the project; or,
- Provide written submissions or comments on the Final Report and recommendations.
You can contact the LCO at:
Law Commission of Ontario
Osgoode Hall Law School, York University
2032 Ignat Kaneff Building
4700 Keele Street
Toronto, ON M3J 1P3
Tel: (416) 650-8406
Toll-Free: 1 (866) 950-8406
Email: [email protected]
Web: www.lco-cdo.org
Twitter: @LCO_CDO
- Libel & Slander Act, RSO 1990, c L12 [LSA].
- https://www.lco-cdo.org/en/defamation-law
- Hill v Church of Scientology, [1995] 2 SCR 1130, para 123.
- Canadian Charter of Rights and Freedoms, Part 1 of the Constitution Act 1982, being Schedule B to the Canada Act 1982 (UK), c 11 [Charter], ss 2(b).
- Matthew Collins, Collins on Defamation, (Oxford: Oxford University Press, 2014), Preface, ix.
- WIC Radio Ltd v Simpson, 2008 SCC 40; Grant v Torstar Corp, 2009 SCC 61; Crookes v Newton, 2011 SCC 47.
- Protection of Public Participation Act, SO 2015, c 23 [PPPA], enacting ss. 137.1 – 137.5 of the Courts of Justice Act, RSO 1990, c C43.
- Justice Judith Gibson, “Adapting Defamation Law Reform to Online Publication”, Discussion Paper, Faculty of Law, University of New South Wales (21 March 2018) 26.
- Haaretz.com v Goldhar, 2018 SCC 28.
- https://www.lco-cdo.org/en/our-current-projects/defamation-law-in-the-in…
- https://www.lco-cdo.org/en/our-current-projects/defamation-law-in-the-i…– go-from-here/
- https://digitalcommons.osgoode.yorku.ca/ohlj/vol56/iss1/
- https://www.lco-cdo.org/en/our-current-projects/defamation-law-in-the-i…
APPENDIX A – LIST OF RECOMMENDATIONS
Chapter II – The Foundation for Defamation Law Reform
2. The new Defamation Act should provide a clear definition of what constitutes defamation and the elements necessary to prove it.
3. The new Defamation Act should provide for a range of remedies, including damages, injunctions, and other equitable relief.
4. The new Defamation Act should establish a presumption that any statement made in good faith is not defamatory.
5. The new Defamation Act should provide for an expedited process for resolving disputes over alleged defamation, including provisions for early resolution or mediation of disputes before they reach the courts.
6. The new Defamation Act should include provisions to protect freedom of expression and public interest journalism from frivolous or vexatious claims of defamation.
7. The new Defamation Act should include provisions to ensure that those who have been defamed are able to obtain effective redress without having to incur excessive costs or delays in doing so.
8. The new Defamation Act should include provisions to ensure that those accused of defamation are able to defend themselves without having to incur excessive costs or delays in doing so.
9. The new Defamation Act should include provisions allowing for the recovery of legal costs by successful parties in appropriate circumstances, as well as measures designed to discourage frivolous or vexatious claims and defences from being brought before the courts unnecessarily.
The common law has been the primary source of defamation law for centuries, and it should continue to play a role in its development. Common law provides flexibility and allows for the law to evolve with changing social norms and values. It also allows for judges to interpret the law in light of new facts or circumstances that may arise. This is especially important when dealing with complex issues such as defamation, which can involve a variety of different factors. Common law also allows for more nuanced decisions that take into account the particular circumstances of each case, rather than relying on rigid rules or formulas.
However, there are certain areas where codification could be beneficial. For example, some jurisdictions have adopted specific statutes that provide guidance on how damages should be calculated in defamation cases. This helps ensure consistency across cases and reduces the risk of arbitrary awards being made by courts. Additionally, codifying certain elements of defamation law could help reduce confusion among litigants and lawyers about what constitutes actionable conduct or speech.
Overall, while common law should remain the primary source of defamation law, there are certain areas where codification could be beneficial in order to ensure clarity and consistency across cases.
Defamation law is designed to protect an individual’s reputation from false statements that are made about them. It seeks to provide a remedy for those who have been wronged by another person’s words or actions. Defamation law also provides a deterrent against future defamation by punishing those who make false statements and providing compensation for the harm caused.
Privacy law, on the other hand, is designed to protect an individual’s right to privacy from intrusion or interference. It seeks to ensure that individuals can control how their personal information is used and shared, and that they are not subject to unreasonable surveillance or monitoring. Privacy law also provides remedies for those whose privacy has been violated, such as damages or injunctions against further violations.
Chapter III – Substantive Elements of Defamation Law
The Defamation Act should establish a single tort of defamation, eliminating the distinction between libel and slander. This would mean that all forms of defamation, regardless of whether they are in written or spoken form, would be treated equally under the law. The new Act should also repeal sections 16, 17 and 18 of the Libel and Slander Act (LSA), which currently provide different remedies for libel and slander. This would ensure that all victims of defamation have access to the same remedies regardless of the form in which it was expressed. Additionally, this would simplify the legal process by removing any confusion over which type of remedy is available for a particular case.
In determining whether online content conveys a defamatory meaning, courts must consider the overall context of the communication and the degree of sophistication of online readers. Relevant factors include how widely the statement was distributed, who was likely to have seen it, and the kind of language and platform used.
The common law presumption of damage should continue to be an element of the tort of defamation. This is because it provides a necessary protection for individuals who have been defamed, and it ensures that those who have suffered harm as a result of false statements are able to seek redress. The presumption also serves as a deterrent to potential defamers, as they know that if they make false statements about someone, they may be held liable for damages. Without this presumption, many people would not be able to seek justice for the harm caused by false statements.
Furthermore, Ontario should not adopt a serious harm threshold in relation to defamation. This is because such a threshold would create an arbitrary line between what constitutes “serious” harm and what does not. It could also lead to situations where people are unable to seek justice for relatively minor harms caused by false statements, which would be unfair and unjust. Additionally, such a threshold could potentially lead to more false statements being made without fear of repercussions, as people may assume that their statement will not cause “serious” enough harm to warrant legal action.
This standard requires that the plaintiff prove that the defendant made a false statement of fact about the plaintiff, that was published to a third party, and that caused harm to the plaintiff’s reputation. The plaintiff must also prove that the defendant acted with either negligence or malice. Negligence is established if the defendant did not take reasonable care in making sure their statement was true before publishing it. Malice is established if the defendant knew their statement was false or acted with reckless disregard for its truthfulness.
Yes, the common law presumption of falsity should continue to be an element of the tort of defamation. Under this presumption, the plaintiff is not required to prove that the statement complained of was false; rather, the defendant may defend the claim by proving that the statement was substantially true. Without this presumption, plaintiffs would face the often difficult task of proving a negative, making it much harder for individuals to vindicate their reputations.
Justification is an important defence and should be carried forward into the new Defamation Act. Section 22 of the LSA provides that, where the words complained of contain two or more distinct charges against the plaintiff, a defence of justification does not fail merely because the truth of every charge is not proved, so long as the unproven charges do not materially injure the plaintiff’s reputation having regard to the truth of the remaining charges. Including an equivalent provision in the new Defamation Act would ensure that defendants who establish the substantial truth of their statements are not held liable on the basis of immaterial inaccuracies.
This provision should provide that a person who publishes a fair and accurate report of proceedings in any court or tribunal, or of any statement made in the course of such proceedings, is not liable for defamation unless it can be shown that the publication was actuated by malice.
The defence of opinion should be included in the new Defamation Act as it provides a balance between protecting freedom of expression and protecting individuals from harm caused by defamatory statements. The defence should provide that the defendant must prove that the publication is on a matter of public interest, is based on fact and is recognizable as opinion. This will ensure that only opinions which are based on facts and are related to matters of public interest can be defended. Furthermore, the defence should be defeated where the plaintiff establishes that the defendant acted with express malice. This will ensure that maliciously motivated opinions cannot be defended under this provision.
However, while this defence should be included in the new Defamation Act, it should not replace or abolish the common law defence of fair comment. Fair comment is an important protection for freedom of expression and allows for criticism and debate on matters of public interest without fear of legal repercussions. Therefore, it should remain available as an additional defence alongside the proposed defence of opinion.
Yes, courts should adopt an analysis consistent with the former common law defence of fair comment when applying the new statutory defence of opinion. However, the common law requirement of objective honest belief should no longer be part of the defence. Under the new statutory defence, the defendant need not show that the opinion is one that could honestly be held; rather, the expression must be recognizable as opinion, relate to a matter of public interest, and be based on facts that are true or protected by privilege.
Yes, the new Defamation Act should contain a provision equivalent to section 23 of the LSA. Section 23 provides that, where the words complained of consist partly of allegations of fact and partly of expressions of opinion, a defence of fair comment does not fail merely because the truth of every allegation of fact is not proved, provided the opinion is fair comment having regard to the facts that are proved. Carrying this rule forward would ensure that the new statutory defence of opinion is not defeated by minor factual inaccuracies.
Yes, section 24 of the LSA should be repealed. The provision is a legacy of the older statutory scheme and serves little purpose under a modernized Defamation Act; repealing it would simplify the statute and leave the matters it addresses to be governed by the new Act and the common law.
In general, courts should consider the following factors when assessing whether a communication is responsible:
1. The accuracy of the information communicated;
2. The context in which the communication was made;
3. The extent to which the publisher has taken reasonable steps to verify the accuracy of the information;
4. Whether there was any malice or recklessness in making the communication;
5. Whether there was an intent to harm or defame another person or entity; and
6. Whether there were any mitigating circumstances that may have affected the publisher’s conduct.
18. The court may also order any person having control over the defamatory publication to take down or otherwise restrict its accessibility, if the court finds that the publication is likely to cause serious harm to the plaintiff’s reputation.
19. The court may also order any person having control over the defamatory publication to take down or otherwise restrict its accessibility, if it is satisfied that there is a real risk of further harm being caused by continued availability of the publication.
20. The court may also order any person having control over the defamatory publication to take down or otherwise restrict its accessibility, if it is satisfied that there are no other reasonable means available for preventing or mitigating such harm.
The Defamation Act should provide that a court may order the defendant to publish a summary of the judgment in a defamation action where it has given judgment for the plaintiff. The summary should include details of the court’s findings and any damages awarded, as well as an explanation of why the court found in favour of the plaintiff. The publication should be made in a manner that is likely to bring it to the attention of those who have been exposed to the defamatory material. This will ensure that those affected by the defamation are aware of the outcome and can take steps to protect their reputation if necessary.
Chapter IV – A New Notice Regime and Limitation of Claims
1. A complainant must provide a written notice to the publisher of the alleged defamatory material, setting out the nature of the complaint and the relief sought.
2. The publisher must respond to the notice within 14 days, either by providing a response or by indicating that it will not be responding.
3. If the publisher does not respond within 14 days, then the complainant may proceed with legal action without further notice.
4. If the publisher responds to the notice, then it must provide a detailed response outlining its position on each element of the complaint and any defences it may have.
5. The publisher must also provide evidence in support of its position and any defences it may have.
6. The complainant must then consider this response before deciding whether to proceed with legal action or seek an alternative resolution such as mediation or arbitration.
b) Statement of Claim – A complainant must file a statement of claim in court within six weeks of the date on which the notice was served. The statement of claim must include details of the publication, the defamatory meaning alleged, and any other relevant information.
c) Evidence – A complainant must provide evidence to support their claim that the publication is defamatory. This may include witness statements, documents, or other forms of evidence.
d) Costs – A complainant may be liable for costs if they are unsuccessful in their claim.
c) Personal Service – Service of a defamation notice by personal service shall be effective service where the defendant can be located and served with the notice. Personal service shall include, but not be limited to, handing the notice directly to the defendant or leaving it at their residence or place of business.
(d) Notice of Complaint – Any person who believes that third party content accessible in Ontario is defamatory of them may submit a notice of complaint to the intermediary platform hosting the content. The notice must include:
i. The name, address and contact information of the complainant;
ii. Identification of the specific statements alleged to be defamatory and an explanation of why the complainant considers them defamatory;
iii. The URL or other specific location on the intermediary platform where the content is located; and
iv. A statement by the complainant that they have a good faith belief that the content is defamatory of them and that the information in the notice is accurate.
e) Defamation Damages – The court may award damages for defamation, including aggravated damages, exemplary damages and costs.
f) Response to Notice – The recipient of the notice shall have an opportunity to respond to the complaint within a reasonable period of time. The response should include any facts or legal arguments that support the recipient’s position.
Yes, this is correct. Intermediary platforms are not permitted to assess the merits of a notice of complaint. Their role is simply to pass the notice on to the publisher of the content in accordance with the regime, leaving any determination of whether the content is defamatory to the parties or, ultimately, the courts.
The amount of the administrative fee that an intermediary platform may charge for passing on a notice would be established by regulation. The fee should be modest and should reflect the reasonable administrative cost of processing and forwarding the notice.
ii) Notify Publisher – An intermediary platform receiving a notice of complaint meeting the content requirements shall notify the publisher of the complaint and provide them with a copy of the notice.
iii) Remove Content – An intermediary platform receiving a notice of complaint meeting the content requirements shall remove or disable access to the content identified in the notice.
(k) No obligation to monitor – There should be no obligation on intermediaries to monitor content or take proactive steps to identify potentially defamatory content.
The publisher should respond to the complainant in a timely manner and keep them informed of any progress.
The court may also take into account the publisher’s efforts to resolve the complaint when determining whether or not to award punitive damages. If the publisher has taken reasonable steps to address the complaint, then it is less likely that punitive damages will be awarded.
The publisher should take prompt and reasonable action to address any issues that arise. This may include taking steps to correct any errors or omissions, or providing additional information or clarification. The publisher should also consider the impact of any delay in taking remedial action on the affected parties.
a) A single publication rule should apply to all claims of defamation, regardless of the medium in which the statement was published.
b) The single publication rule should provide that the limitation period for a defamation claim runs from the date of first publication of the expression.
c) The single publication rule should not apply to a subsequent republication or reposting of the same statement where the manner or extent of the republication is materially different from the original publication.
d) Any damages awarded for a successful claim of defamation should be limited to those suffered by the claimant within one year from the date of first publication.
b) No, a single cause of action for defamation does not exist in relation to the publication of an expression and all republications of the expression by different publishers. Each republication may give rise to a separate cause of action for defamation.
c) The limitation period for a defamation action is two years from the date of the first publication of the expression.
Yes, the general two-year limitation period in the Limitations Act, 2002 should govern all defamation actions. The Limitations Act, 2002 establishes a basic two-year limitation period for most civil claims in Ontario, and there is no principled reason to subject defamation actions to the shorter, publication-specific limitation and notice periods currently found in the LSA.
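To make the interaction between the single publication rule and the general limitation period concrete, here is a minimal sketch of the computation described above: time runs for two years from the date of first publication, regardless of how long the content remains accessible online. It is illustrative only; it approximates two years as 730 days and ignores discoverability, tolling and other refinements in the Limitations Act, 2002.

```python
from datetime import date, timedelta

# General limitation period under the Limitations Act, 2002 (two years,
# approximated here as 730 days for illustration).
LIMITATION_PERIOD = timedelta(days=730)


def claim_in_time(first_published: date, claim_issued: date) -> bool:
    """Single publication rule: the clock runs from first publication only.

    Continued availability of the same content online does not restart the
    clock; a materially different republication would be treated as a new
    publication with its own limitation period.
    """
    return claim_issued <= first_published + LIMITATION_PERIOD


# Example: content first posted 1 March 2021 is out of time for a claim
# issued 15 June 2023, even if the post is still online.
print(claim_in_time(date(2021, 3, 1), date(2023, 6, 15)))  # False
```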
Chapter V – Preliminary Court Motions
The retention of information should be subject to applicable privacy legislation. The intermediary platform should also have a policy in place that sets out how it will respond to requests for information from third parties, including plaintiffs seeking Norwich orders. This policy should include details on how long records identifying an anonymous publisher will be retained, and what steps will be taken to ensure that the data is kept secure.
Chapter VI – Jurisdiction, Corporations and the Court Process
Yes, section 15 of the LSA should be repealed and the common law should govern. Repealing this provision would remove an outdated statutory rule and allow the issues it addresses to be dealt with under the more flexible principles of the common law.
No, section 19 of the LSA should not be repealed in favour of the common law. The provision continues to serve a useful purpose in defamation proceedings, and repealing it would create uncertainty about the rules that would apply in its place.
Chapter VII – New Legal Responsibilities for Intermediary Platforms
A publisher is defined as an individual or entity that intentionally communicates a specific expression to the public. This includes any form of communication, such as print, broadcast, digital, or other media. The new Defamation Act should provide that a defamation action may only be brought against a publisher of the expression complained of.
The new Defamation Act should provide that a publisher of a defamatory expression is liable for republication of the expression by a third party only if the publisher knew or had reason to know that the republication was likely to occur. The publisher should also be required to take reasonable steps to prevent such republication, including but not limited to removing or disabling access to the original expression.
No, section 2 of the LSA should not be repealed in isolation. Its subject matter should instead be addressed as part of the broader move to a single statutory tort of defamation, so that the treatment of newspaper and broadcast publications remains consistent with the new Defamation Act.
Chapter VIII – Notice and Takedown: A Quicker, Modern Process for Online Defamation Disputes
1. Intermediary platforms shall be required to take down content that is found to be defamatory by a court of competent jurisdiction in Ontario.
2. The takedown obligation should apply to all types of content, including text, images, audio and video.
3. The intermediary platform must take down the content within 24 hours of receiving notice from the court or other authorized body.
4. The intermediary platform must also notify the user who posted the content that it has been taken down and provide them with an opportunity to challenge the takedown decision before a court of competent jurisdiction in Ontario.
5. If the user challenges the takedown decision and is successful, then the intermediary platform must restore the content within 24 hours of receiving notice from the court or other authorized body.
6. The intermediary platform must also provide users with an easy-to-use mechanism for reporting potentially defamatory content so that it can be reviewed and taken down if necessary in a timely manner.
b) Default Takedown – If the intermediary platform does not receive a response from the publisher within the two-day deadline, it shall take down or disable access to the content identified in the notice. The platform is not required, or permitted, to assess the merits of the complaint, although it may contact either party to confirm receipt of the notice or to obtain clarification about the content identified.
c) Resolution – Where the publisher responds to the notice within the deadline, the content remains available and the intermediary platform shall inform the complainant of the response. Any remaining dispute is then for the parties to resolve between themselves or through the courts.
c) Transparency – The intermediary platform shall provide the complainant with information about the process and the outcome of the complaint.
(d) No Liability – Intermediary platforms shall not be liable for any damages or losses incurred by a user as a result of the resolution of a complaint.
e) Appeal – The publisher shall have the right to appeal the takedown decision of the intermediary platform.
f) Notice to the User – Intermediary platforms shall provide notice to the user who posted the allegedly defamatory content, informing them of the complaint and giving them an opportunity to respond.
This policy is designed to ensure that publishers are given the opportunity to repost content that was taken down due to a complaint. The intermediary platform must provide notice of the takedown to both the publisher and complainant, and if the publisher requests putback, the intermediary platform must repost the content if there is evidence that the publisher failed to receive the notice or unintentionally missed the deadline and it is technologically reasonable to do so. This policy helps protect publishers from having their content removed without their knowledge or consent.
The amount of the administrative fee shall be determined by the relevant regulatory body. The fee should be reasonable and proportionate to the services provided, taking into account the costs associated with providing such services.
i) Injunctive Relief – Courts may grant injunctive relief to prevent or restrain the continued publication of defamatory content. This includes an order that the defendant cease and desist from further publication, as well as an order requiring the defendant to take affirmative steps to remove or disable access to the defamatory material.
(ii) Applicable to All Intermediaries – All intermediaries, including internet service providers, search engines and other intermediaries, shall be required to comply with the takedown obligation. This includes taking down content that has been found to be defamatory, upon receiving a valid notice of the order from the complainant or other authorized party.
(k) Notice and Takedown Process – Intermediary platforms hosting user content available in Ontario shall have a notice and takedown process that is consistent with the requirements of the legislation. The process shall include:
(i) A mechanism for users to submit complaints about allegedly defamatory content;
(ii) A requirement that the intermediary platform respond to complaints within a reasonable time frame;
(iii) A requirement that the intermediary platform remove or disable access to allegedly defamatory content upon receipt of a valid complaint; and
(iv) A requirement that the intermediary platform notify affected users of any removal or disabling of access to their content.
(l) Counter-Notification – A publisher who has received a notice of complaint may file a counter-notice with the intermediary platform stating that the content is not defamatory or that they are otherwise entitled to post it. The intermediary platform shall promptly notify the complainant of the counter-notice and restore the content within 10 business days unless the complainant brings a court action seeking an injunction against its restoration.
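As a rough, purely illustrative sketch of the counter-notice timeline described above (the function names are hypothetical, and statutory holidays are ignored), a platform might compute the restoration deadline as follows.

```python
from datetime import date, timedelta


def add_business_days(start: date, business_days: int) -> date:
    """Count forward the given number of weekdays, skipping Saturdays and Sundays."""
    current = start
    remaining = business_days
    while remaining > 0:
        current += timedelta(days=1)
        if current.weekday() < 5:  # Monday=0 ... Friday=4
            remaining -= 1
    return current


def restoration_deadline(counter_notice_received: date) -> date:
    """Content is to be restored within 10 business days of a counter-notice,
    unless the complainant commences a court action seeking an injunction."""
    return add_business_days(counter_notice_received, 10)


# Example: a counter-notice received on Friday, 3 March 2023 would have to be
# acted on by Friday, 17 March 2023 (absent a court action).
print(restoration_deadline(date(2023, 3, 3)))  # 2023-03-17
```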
2) Penalties – Any person who violates the provisions of the legislation shall be subject to a fine of up to $10,000 or imprisonment for up to one year, or both.
Chapter IX – Online Dispute Resolution
The Ontario government recognizes the importance of access to justice and is committed to exploring innovative approaches to dispute resolution. The Ministry of the Attorney General has been actively engaged in research and consultation on ODR, including a review of international models, and is currently developing an ODR strategy for Ontario. This strategy will consider the potential role of ODR in resolving online defamation disputes, as well as other types of civil disputes. The Ministry will also consider the potential for social media councils or other regulatory models to play a similar role in informally resolving online defamation disputes.
The Ministry will continue to consult with stakeholders, including members of the legal profession, academics, technology experts, consumer advocates and other interested parties, as it develops its ODR strategy. The Ministry’s goal is to ensure that Ontarians have access to effective dispute resolution services that are affordable and accessible.
Background Paper
The Law Commission of Ontario (LCO) is an independent, non-partisan organization that provides research and advice on law reform. The LCO’s Background Paper on Defamation Law in the Internet Age was released in October 2020. The paper examines the current state of defamation law in Canada, with a particular focus on how it applies to online speech. It looks at the challenges posed by the internet and social media, including issues such as anonymity, speed of communication, and global reach. It also considers potential reforms to address these challenges, including changes to existing laws or new legislation. The paper provides an overview of the legal landscape and identifies areas where further research is needed.
The project is also exploring the potential for reform in areas such as defences, remedies and costs. The LCO will be engaging with stakeholders, including members of the public, to ensure that the project reflects a broad range of perspectives.