Internet defamation: who is legally responsible for online comments?

As the legal profession struggles to catch up with the age of social media, the question of who is responsible for defamatory remarks on websites is constantly evolving, says Hugh McCarthy.

WITH social media titans such as Twitter, Facebook, and Google playing an increasingly prominent role in how we communicate, the question of their liability for defamation is under closer scrutiny.

Never before have people been able to convey information and opinions or indeed pictures and videos with such ease.

The reality that reputations can be destroyed by just a few simple clicks is brought into sharp focus as the first wave of Twitter defamation cases reaches the Irish and British courts.

Earlier this year, the High Court ordered Dublin-based Twitter to remove a false profile displaying "grossly defamatory and offensive sexually related pictures and tweets" of a young Abu Dhabi-based Irish schoolteacher. This follows the much-publicised April 2013 UK High Court decision ordering Sally Bercow, the wife of the Speaker of the House of Commons, to pay £15,000 (€18,294) in damages to Lord McAlpine.

Bercow had incorrectly tweeted that the Conservative peer was the politician at the heart of child sex abuse allegations.

Lest we need any reminding, these cases reinforce the now-established legal principle that defamatory statements made via the internet can and do have legal consequences. The Defamation Act 2009 expressly confirms this.

While there has long been a trickle of internet defamation cases, the arrival of the social media age is certain to generate a torrent of such actions. Against this backdrop, a particularly vexed question is whether, and to what extent, online service providers (OSPs) can themselves be liable for hosting defamatory comments.

This arises not just for social media platforms such as Facebook and Twitter, but also for the operators of news websites, chatrooms, and blogs, where the OSPs allow their users to comment or upload material.

Material created by such users is known as user-generated content (UGC), as distinct from material created and published by the OSP itself. UGC ranges from tweets, Facebook statuses, and comments to various uploads and significantly includes comments made by readers on blogs and online news articles. The question for OSPs is whether they can be liable for UGC that they themselves have not created.

OSP liability for hosting defamatory comments is governed at EU level primarily by the so-called ‘notice and takedown’ mechanism under the E-Commerce Directive. In principle, OSPs such as Google will avoid liability if they react quickly to a complaint of defamatory content by removing it.

This immunity from defamation liability is described as a "safe harbour". Because member states are prohibited from compelling OSPs to proactively monitor all material they host, the extent of OSPs’ duty is limited to reacting to complaints.

However, recent judicial decisions have blurred the boundaries of the ‘notice and takedown’ safe harbour, and there is now evidence of a trend across Europe whereby courts are imposing greater responsibility on OSPs for the UGC they host.

Most recent is the October 2013 decision of the European Court of Human Rights in Delfi AS v Estonia. The case arose when an Estonian ferry company decided to alter its island routes and, in doing so, destroyed areas where ice-roads to these islands were planned.

Consequently, prospective users of the ice-roads were forced to use the ferry service at substantial cost. Estonia’s largest online news portal, Delfi, published an article on the episode, to which numerous defamatory comments were anonymously posted by Delfi readers.

The ferry company sued, not on the basis of the article itself but rather over the defamatory nature of the readers’ comments on the article. The case ultimately reached the European Court of Human Rights, which held that finding Delfi liable in defamation for the comments did not breach its right to freedom of expression.

The most striking feature of the Delfi case is that Delfi had taken multiple measures to combat defamatory user comments on its articles, and operated an effective ‘notice and takedown’ system. As soon as Delfi was notified of the specific defamatory comments, it removed them.

Despite Delfi’s approach, the court took the view that, given the highly controversial and provocative nature of the news article itself, it was reasonably foreseeable that readers would post defamatory comments. An influential factor in the decision was that Delfi facilitated, if not encouraged, anonymous commenting on its news items.

Accordingly, the court found Delfi was obliged to predict and proactively prevent the publication of such comments. This goes beyond the ‘notice and takedown’ principle, which contemplates OSPs merely reacting to complaints and removing content.

The Delfi reasoning is therefore at odds with the EU prohibition on compelling OSPs to actively monitor the content they host. An appeal is likely.

Another notable aspect of the Delfi decision is the court’s willingness to impose damages on well-capitalised OSPs in place of the actual authors of the defamatory comments. As a matter of litigation strategy, plaintiffs routinely pursue the defendant who is the best "mark for damages", but financial muscle alone is an unconvincing legal justification for imposing such liability on OSPs.

In effect, the Delfi decision as it stands exposes OSPs to liability in defamation where they facilitate users posting comments. The decision is especially alarming for online news sites, which increasingly rely on user comments to bolster advertising revenue.

Placed in its proper context, the Delfi decision forms part of a wider European trend whereby courts are increasingly holding OSPs responsible for hosting defamatory UGC.

In February 2013, the Court of Appeal of England and Wales ruled in Tamiz v Google that the host of a blog platform (Google’s Blogger) could, in principle, be liable for defamatory comments that it hosted where Google had received a complaint but then failed to remove the comments.

The court considered Google’s Blogger platform to be similar to a "gigantic notice-board" upon which Google allowed people to post messages. Accordingly, once aware of the content, Google could then be liable for it.

In what may seem bizarre in the era of near-ubiquitous 3G internet access, the court relied on a 1937 precedent (Byrne v Deane) in which the secretary of a golf club was found to have ‘acquiesced’ in the publication of a defamatory notice on the club notice board. The key fact in that case was that the secretary had been aware of the notice, yet had failed to remove it; the court concluded that he had thereby acquiesced in the publication and could be liable for it. Whilst it is debatable whether such archaic precedents should apply in the digital era, the Tamiz court had no hesitation in applying this old law to the new reality.

Nevertheless, the Tamiz decision leaves several issues unresolved. First is the question of what level of "notice" is sufficient to fix an OSP with liability for failing to remove defamatory comments posted by its users.

Flowing from this is the very practical consideration of whether the automated ‘report abuse’ buttons (typically located beside Facebook and Twitter posts) carry any weight in defamation proceedings against OSPs.

This question has yet to be definitively resolved, but the indication from the UK courts is that, at a minimum, a formal letter of complaint is required to trigger the OSP’s duty to remove allegedly defamatory material.

In practical terms, given the daily volume of social media communications, it would place an unworkable burden on the likes of Facebook if they were legally obliged to respond every time a user hit the ‘report abuse’ button.

Accordingly, OSPs will of course seek a higher standard of notice, and possibly a court order, before they react.

A recent Irish case on this point is the ongoing McKeogh v Facebook litigation. The dispute over whether internet hosting sites have immunity from defamation litigation is to be thrashed out before the Supreme Court.

The issue has a significance "well beyond" the case in which it is raised, that of student Eoin McKeogh, who sued over a YouTube video clip falsely accusing him of evading a taxi fare, the Supreme Court was told earlier this month.

Facebook agreed that the case raises important issues concerning interpretation of the E-Commerce Directive 2000/31/EC, and the Irish regulations implementing that directive.

Those issues centre on whether internet hosting sites may be sued over defamatory material posted on them. Other issues concern whether they have a responsibility to monitor their sites for such material.

In May 2013, Mr McKeogh was granted an interlocutory order requiring that steps be taken by Google, Facebook, and YouTube to permanently remove the video. Mr Justice Michael Peart made that order on foot of his earlier finding that the video was defamatory, as Mr McKeogh was not the person in it.

The judge gave the internet companies a month for their experts, and experts for Mr McKeogh, to come up with reports on how to remove it permanently on a worldwide basis.

The companies sought a stay on the interlocutory order pending their appeal against that order.

On a separate point, an earlier UK decision (Kaschke v Hilton) also highlights the perils to editors of news websites and blogs when they actively moderate and edit user comments. Although it seems counter-intuitive to the policy goal of reducing defamatory online content, editors can actually expose themselves to liability where they edit user comments but fail to remove or edit specific defamatory material.

Faced with this dilemma, there is anecdotal evidence that editors, as a matter of policy, opt not to edit reader comments for fear of exposing themselves to liability in this way. This uncertainty contrasts with the clarity of the dedicated statutory regime in the US (section 230 of the Communications Decency Act), which offers unconditional immunity to OSPs from liability in defamation in respect of UGC.

American OSPs can therefore actively monitor UGC without jeopardising their immunity. There is much to be admired in the simplicity of the American approach. Viewed through a wider optic, the predominantly judicially-crafted European approach to OSP liability for hosting defamatory content lacks coherence and certainty.

The eminent American jurist Oliver Wendell Holmes Jr once said: "The young man knows the rules, but the old man knows the exceptions." In Europe, it seems, OSPs can be sure of neither.

Hugh McCarthy is a graduate of UCC and Oxford University, where he conducted research in the Oxford Intellectual Property Centre. This is an extract from a paper written by the author.
