By Stephen Dickens, Partner, and Megan Armstrong, Law Graduate
In a recent landmark decision, Justice Rothman of the Supreme Court of New South Wales found that media companies can be liable for defamatory comments posted on their public Facebook pages by members of the public. As it stands, the decision in Voller has serious and far-reaching implications for businesses that use Facebook and other social media platforms to share content and promote public engagement for commercial purposes.
Former juvenile detention centre detainee Dylan Voller brought defamation proceedings against the media companies Fairfax Media, Australian News Channel (Sky News) and Nationwide News over comments posted by third-party users on the companies’ Facebook pages in response to news articles.
What set these proceedings apart from previous defamation actions involving social media publications is that Mr Voller did not claim the articles themselves were defamatory, nor was it alleged that the media companies were negligent or reckless in failing to delete the comments in question. Rather, Mr Voller claimed that particular comments posted by third party members of the public were defamatory, and that the media companies are liable in defamation as the publisher of those comments.
Defamation law in Australia recognises that it is the publication of a comment, and not its composition, that gives rise to liability in defamation. The central issue before Justice Rothman was therefore whether the media companies were the ‘primary publisher’ of the comments, a question his Honour answered in the affirmative. In deciding that question, a number of factors were considered relevant, namely: the degree of control held by the host of a public Facebook page; the commercial interests of the media companies; and whether the initial posts contained content likely to attract defamatory comments.
Degree of Control
The Defendants argued strongly that it was physically impossible to monitor every comment made by third-party users on their public Facebook posts, and that reviewing and managing long threads of conversation could consume many hours. His Honour held, however, that volume cannot create its own shield, and that there ‘was nothing physically preventing the Defendants from discovering the contemptuous and disparaging nature of it’. Indeed, his Honour found that ‘it is possible to hide, in advance, all, or substantially all, comments’, or to delete them. He also found that the media companies could adapt Facebook’s management tools to vet all comments prior to publication by compiling a list of prohibited words, such as all pronouns and other common parts of speech, so as to hide substantially all comments pending manual review of each. His Honour did not consider such vetting an unrealistic expectation.
In its reasoning, the Court distinguished the degree of control held by hosts of public Facebook pages from the control held by hosts of discussion webpages or a Google search facility, which have no capacity to vet postings in advance of their placement.
Ultimately, it is this perceived ability to control defamatory comments which is the key to holding the hosts of public Facebook pages liable as ‘publishers’ of such comments.
Commercial Interests & Content Likely to Attract Defamatory Material
The Court’s decision was influenced by the primary purpose of each media company’s public Facebook pages being to optimise readership and advertising revenue.
The Court found that the exchange of controversial ideas on a public forum, such as Facebook, is a mechanism by which this purpose is achieved.
Further, given that the initial posts which attracted the defamatory comments related to articles reporting on Mr Voller’s mistreatment at the Don Dale Youth Detention Centre, it was open to the media companies to assess their posts as being likely to attract defamatory comments.
In his reasoning, Justice Rothman made clear that any business which encourages discussion by the public on its Facebook page for its own commercial advantage “cannot escape the likely consequences of its actions by turning a blind eye to it”.
Effect of Voller
The decision in Voller is subject to appeal and therefore may not be the final word on the matter.
However, as it stands, Voller is a win for individuals who find themselves the subject of untoward public scrutiny, but the high standard it sets means that businesses using Facebook to engage with the public for commercial purposes now find themselves in a precarious position.
To avoid liability for defamatory posts made by a third party, businesses with Facebook pages must do more than merely monitor comment threads and hide defamatory comments once detected.
Instead, as primary publishers, they would now be expected to vet or filter defamatory comments before they are released to the general readership. The reality is that while many businesses would find it difficult to undertake such an extensive task, the NSW Supreme Court has said it does not consider it to be an unrealistic expectation that businesses do so.
Some businesses might also argue that those responsible for vetting comments may find it difficult to properly assess triviality, truth or other defences possibly available to a third party commenter, with the result that they may end up deleting, hiding or silencing many communications which are in fact lawful. This, though, really misses the point, as the primary focus must be on the actual comments or posts made, and the publisher developing a system that proactively seeks to manage, and if necessary, hide or delete them, so that the publisher avoids liability. The third party’s position is a matter for it, and involves a very different analysis.
While Voller considers liability only in relation to comments made on Facebook, there is no reason why the same principles would not extend to defamatory comments on other social media platforms, such as Instagram, Twitter and YouTube. Additionally, although the Court considered the question of extended liability in the context of public pages hosted for commercial purposes, further cases will no doubt test how widely that category ought to be construed, and whether it should extend to not-for-profits or government agencies which host public social media pages in connection with fundraising or other purposes in the course of their operations.
Although Justice Rothman indicated that his decision had little to do with freedom of speech, its effect arguably disturbs the current balance between society’s interest in the free exchange of information and ideas and the protection of an individual’s reputation. The long-term effect of Voller could be that the ability of third parties to comment publicly and freely in social media forums is seriously constrained, perhaps even lost. This is undoubtedly an area of some tension, which the appeal will hopefully go some way towards resolving.
Pending an appeal to the New South Wales Court of Appeal, Justice Rothman has most certainly sounded alarm bells for media companies, and indeed anyone else, who hosts a public Facebook page or other social media platform for commercial purposes and who promotes or seeks public or third-party posts or engagement on those pages.
Those who do so must now take reasonable steps to manage and vet all posts made, and to hide or delete those considered to ‘cross the line’. A failure to act reasonably, acting recklessly, or simply turning a blind eye to what they promote could expose them to liability in defamation for comments posted by others. The key to avoiding liability is exercising appropriate due diligence, but exactly what that requires is a question left largely unanswered at the moment.
At this juncture, though, any business seeking to mitigate the risk of being held liable for defamatory comments made by a third party on its public Facebook page should at least consider taking the following steps:
- Before publishing any post, assess the content of the post to determine whether it is likely to attract controversial discussion. If it is more probable than not that it will attract controversial and defamatory material, it may be best to think twice before publishing the post;
- Utilise Facebook’s management tools to vet comments by implementing a system that captures or flags a range of adverse words, profanities or phrases to be hidden, blocked or deleted before any comment is published;
- Err on the side of caution when assessing communications that could potentially be defamatory; and
- Establish a clear set of rules as to who should be entitled to post, and what can be posted by those entering and using the public page.