Liability for Third-party Content


INTRODUCTION
DEFAMATION IN THE UK
ANALYSIS
CONCLUSION
APPENDIX 5.1 THE US
APPENDIX 5.2 EUROPEAN UNION
APPENDIX 5.3 GERMANY
APPENDIX 5.4 SWEDEN
APPENDIX 5.5 SINGAPORE
APPENDIX 5.6 INDIA
APPENDIX 5.7 BERMUDA

INTRODUCTION

Ali runs a website in which he moderates a discussion forum. Wong posts on the forum some material that defames Mohammad. Should Ali also be held liable? Does it matter if he only moderates the forum by deleting off-topic or irrelevant posts?

This is the issue of liability for third-party content. Such content can come from messages posted on bulletin boards or websites hosted by an internet service provider (ISP) or content provider. Some ISPs and internet hosts have been held liable for such content even though they did not originate it themselves.

The types of substantive law more likely to be infringed by using online facilities include the following:1

1 Rosa Julià-Barceló, “Liability for On-Line Intermediaries: A European Perspective,” European Intellectual Property Law Review 20(12) (1998): 454-63.

  • Copyright material. The infringing act may occur when certain files containing copyright material, such as text, pictures or sounds, are posted on a webpage from which they may be downloaded all over the world. The general counsel of Adobe estimated that the number of webpages containing pirated software rose from 100,000 in 1997 to 900,000 in 1998 to two million in 1999. In one case, a copy of the software Photoshop was downloaded 100,000 times over a six-month period, with an estimated street value of US$60 million.2
  • Illegal and harmful content. The infringing act may occur when material containing pornography, racism or terrorist propaganda is disseminated via internet facilities.
  • Private and defamatory material. Private material, such as pictures taken in intimate situations, can be posted on webpages, bulletin boards, chat rooms, etc., and made available to users, thereby infringing rights of privacy, including those contained in European data protection laws. The same may occur with defamatory material.
  • Misrepresentation. This may occur when false or incorrect information disseminated using online facilities causes damage to a third party.
  • Others. An intermediary could also be held liable for the infringement of other substantive laws such as patents, trademarks and unfair trade practices.

The issue of liability is most salient in the areas of defamation and copyright. And of the two, copyright is thornier because first, a

2 Batur Oktay and Greg Wrenn, “A Look Back at the Notice-Takedown Provisions of the U.S. Digital Millennium Copyright Act One Year after Enactment,” (lecture, Workshop on Service Provider Liability, World Intellectual Property Organization, Geneva, Switzerland, December 1999). http://www.wipo.org/eng/meetings/1999/osp/doc/osp_lia2.doc (accessed February 13, 2000).

violation of copyright is also a criminal offence and second, it involves international interests. In 1996, during the World Intellectual Property Organization Diplomatic Conference in Geneva, draft provisions to impose liability on ISPs and other network operators were extensively debated before they were deleted from the final drafts of the treaties.3

This chapter uses defamation to clarify the issues before applying the analysis to copyright. Except for the criminal nature of copyright, the principles to be applied are the same.

DEFAMATION IN THE UK

Laurence Godfrey v. Demon Internet4 provides an interesting discussion of defamation on the internet. This was the first defamation case involving an ISP to be tried in an English court, which means that the judgment and, more importantly, the legal arguments used to reach it are publicly open to scrutiny. Why is an English, as opposed to an American, case important? Because the English legal system has been adopted in a number of Asian countries and is a useful counterpoise to the US system, which the court discussed in the case.

In the Godfrey case, the ISP lost. No new law was created and the loss of free speech, if any, is much less than first meets the eye. Nevertheless, the case was controversial.

3 William Foster, “Copyright: Internet Service Provider Rights and Responsibilities,” Internet Society Conference 1997, Kuala Lumpur, Malaysia, June 24-27, 1997. http://www.isoc.org/inet97/proceedings/B1/B1_2.HTM (accessed December 30, 2004).

4 (1999) 4 EMLR 542.

The Facts

On January 13, 1997, someone posted a message described by the judge as “squalid, obscene and defamatory” on the soc.culture.thai Usenet discussion group hosted by Demon Internet.5 The message was posted by an anonymous impostor of Dr. Laurence Godfrey, a lecturer in physics, mathematics and computer science based in London.

On January 17, 1997, when Dr. Godfrey became aware of the posting, he sent the managing director of Demon Internet a fax denying his authorship, stating that the post was a forgery, and asking that the message be deleted because it was defamatory. Although Demon Internet could have deleted the message, it took no action until January 27, 1997, when the material on the server expired and was routinely purged.

When faced with a defamation notice, a legitimate course of action for ISPs is to seek legal advice to determine whether the statement is defamatory. In this case, however, it was the purported author who was disclaiming authorship.

Dr. Godfrey sued Demon for libel arguing that as the company hosted the Usenet group, it was in effect the publisher of the defamatory material.

The Law

The UK was the first European country to deal with the issue of online defamation. Its Defamation Act 1996 extended the “innocent dissemination” defense to ISPs. This is a somewhat technical defense but without it, a newspaper vendor would be spreading libel and would be held as culpable as the originator of the libel. The defense works for an “innocent disseminator,” such as a newspaper vendor,

5 Just what the message was is not revealed; English law presumes that hurtful messages should not be repeated. This is not the case in US courts where, as will be shown below, the key parts of the defamatory messages are sometimes spelt out.

who would not be able to read every news item in a newspaper before selling it.

The technical details, in brief, are stated in the Defamation Act 1996, Section 1(1) “Responsibility for Publication”:

In defamation proceedings a person has a defence if he shows that—

  (a) he was not the author, editor or publisher of the statement complained of,
  (b) he took reasonable care in relation to its publication, and
  (c) he did not know, and had no reason to believe, that what he did caused or contributed to the publication of a defamatory statement.

Points (a), (b) and (c) must all be proved for the defense to succeed.

The judge in Godfrey ruled that because Demon Internet had not posted the defamatory message, it was not its “author, editor or publisher” and therefore satisfied point (a). But after January 17, when the matter had been brought to the attention of Demon, it could not succeed with its claims under (b) and (c).

The judge cited a high-level UK government report that had studied the issue of defamation. In considering the defense under (b) and (c), the report stated:

The defence of innocent dissemination has never provided an absolute immunity for distributors, however mechanical their contribution. It does not protect those who knew that the material they were handling was defamatory, or who ought to have known of its nature. Those safeguards are preserved, so that the defence is not available to a defendant who knew that his act involved or contributed to publication defamatory of the plaintiff. It is available only if, having taken all reasonable care, the defendant had no reason to suspect that his act had that effect.6

6 Lord Chancellor's Department, “Consultation Document on Defamation,” July 1996, para. 2.4.

The trial judge said that Demon Internet was not a mere conduit because it had control over the content. He said: “They chose to receive the ‘soc.culture.thai’ postings, to store them, to make them available to accessors and to obliterate them.”

The defense cited several American cases, three of which the court said were in line with English law. In Anderson v. New York Telephone Co.,7 the telephone company was held to be a passive conduit in the transmission of defamatory statements. The court held that the rule created in that case was the same as that in England, although, as will be discussed below, that conclusion is questionable. The English court also agreed with the US case of Cubby v. CompuServe,8 especially when the US court said: “[T]he appropriate standard of liability to be applied to CompuServe is whether it knew or had reason to know of the allegedly defamatory…statements.” Because CompuServe had no editorial control, it was held to be merely a distributor and, therefore, not liable for the defamation. Following that line of reasoning, the US ISP Prodigy was held liable in Stratton Oakmont v. Prodigy9 because it had “uniquely arrogated to itself the role of determining what is proper for its members to post and read on its bulletin boards.” The court held that Prodigy was a publisher rather than a distributor.

It should be noted that the outcome in the Prodigy case was widely, though perhaps not universally, perceived as unfair. After all, it was only because Prodigy deleted postings for their offensiveness and bad taste that it was found to be an editor. The outcome, in effect, encouraged ISPs to ignore the contents of their bulletin boards in order to attain the status of a distributor instead of a publisher. The US Congress, recognizing the unfairness, passed what was called a “Good Samaritan law”10 to reverse the rule laid down by the Prodigy

7 [1974] 35 NY 2d 746.

8 776 F.Supp. 135 (SDNY 1991).

9 [1995] Misc. Lexis 229; 23 Media Law Rep. 1794.

10 It is so called because in the US, some people who have helped those injured in accidents have been sued, successfully in some cases, for not helping them properly. Without such a law exempting liability, no one would dare touch an injured person.

case. This is Section 230 of the Communications Decency Act (47 USC),11 which was relied upon in two US cases that the Godfrey court had decided were not to be followed.

First, in Zeran v. America Online (AOL),12 the court concluded that AOL was immune from the suit because of Section 230. Zeran had sued AOL for defamation in not deleting defamatory messages about him after he had notified them. It began on April 25, 1995, six days after the bombing of the Oklahoma City federal building. Someone posted a message on a bulletin board on AOL, announcing the availability of “Naughty Oklahoma T-Shirts,” bearing slogans like “Rack'em, Stack'em and Pack'em—Oklahoma 1995” and “Visit Oklahoma—it's a Blast.” The message, posted by someone using the name “Ken ZZ03,” gave a number for phone orders. The number belonged to Zeran, who was not an AOL member, and who maintained that he had nothing to do with the posting. On the day of the posting, Zeran immediately began to receive threatening phone calls. He could not change his phone number because he used it for his business. Two more postings appeared a few days later under the names “Ken ZZ033” and “Ken Z033,” prompting more calls. Zeran notified AOL of the postings and asked the company to delete them and take steps to prevent his phone number from appearing in future postings. AOL declined to help and the postings remained for more than a week. The true identity of “Ken ZZ03” remains unknown because the person gave AOL false information when signing up for a trial account and AOL did not verify the information.

Meanwhile, a radio station in Oklahoma City picked up the postings and broadcast them. Again, Zeran received a deluge of abusive phone calls, at the rate of about one every two minutes.

11 The US Supreme Court in Reno v. ACLU 117 S. Ct. 2329 (1997) did not strike out the entire Act. If there is a clash between a law and the Constitution, only the contradicting section of the law is struck out. In that case, the offending section was found to be in conflict with the First Amendment. It should also be noted that the Act is long and that in a quirk of the US legal system, the title of the Act does not always tell you what the contents of the Act are.

12 [1997] 129 F3d 327.

The court found that Section 230 immunized the ISP from liability for third-party postings. The relevant part of Section 230 states: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”13

Zeran appealed but the appellate court rejected his appeal and observed: “By its plain language, Section 230 creates a federal immunity to any cause of action that would make service providers liable for information originating with a third-party user of the service. Specifically, Section 230 precludes courts from entertaining claims that would place a computer service provider in a publisher's role. Thus, lawsuits seeking to hold a service provider liable for its exercise of a publisher's traditional editorial functions—such as deciding whether to publish, withdraw, postpone or alter content—are barred.”

The chief judge in his opinion said that Zeran's recourse ultimately lay with the original anonymous poster, not AOL. He said, “Congress made a policy choice, however, not to deter harmful online speech through the separate route of imposing tort liability on companies that serve as intermediaries for other parties' potentially injurious messages.”

The Zeran case shows the special rights that Section 230 confers on the ISP, in contrast to the radio station. When Zeran called the station, it retracted the statement and later ran an apology in a newspaper. And when he sued the station, it could not claim any special immunity. Nevertheless, the court found for the radio station for several reasons. First, Zeran suffered no harm to his reputation. Second, the radio disk jockey had tried to email him and had no knowledge that the information she was broadcasting was false. (This is different from the position under English law where it is possible to

13 47 USC s230(c)(1). See Appendix 5.1 for the full section.

unintentionally defame someone.) Third, although he suffered some mental anxiety, it did not interfere with his work. So even though the outcome for the radio station was the same as that for AOL, the court relied on the facts of the case when it came to the radio station, whereas the court in the case involving AOL relied on the law to excuse AOL.

There was a lot of sympathy for Zeran. As some lawyers observed, Zeran unquestionably suffered harm but he had no remedy.14 If there is any saving grace, it is that, in the trial against the radio station, where there was no immunity provision, the court found that Zeran was not actually defamed because his reputation did not suffer.15 With the immunity provision, it would have been immaterial even if his reputation had suffered. The ISP would not be liable even if it knew that the statement was false. The Zeran rule means that ISPs in the US have no responsibility whatsoever for third-party content. The case is troubling because it may have gone too far in reversing the rule in the Stratton Oakmont v. Prodigy case.

The other case that the Godfrey court deemed not relevant was Lunney v. Prodigy.16 The Lunney court did not even look at Section 230. The case suggests that if an ISP behaves like a telephone company, under existing case law it would be automatically immune from liability for content it carried. In that case, someone opened several accounts with Prodigy under variants of the name of Alexander Lunney, a 15-year-old boy scout. Using the accounts, the anonymous prankster, on September 9, 1994, posted offensive material on an electronic bulletin board and sent an offensive and threatening email to Lunney's scoutmaster. The scoutmaster contacted the police, who concluded that Lunney was not the perpetrator.

14 Carl Kaplan, “Another Legal Defeat for Victim of Online Hoax,” New York Times, November 24, 2000.

15 This is different from English law. No actual damage is necessary; it is sufficient if the words are defamatory.

16 [1999] NY Int 0165.

Less than a week later, in a letter dated September 14, 1994, Prodigy notified Lunney that it was terminating one of his accounts because he had sent “abusive, obscene, and sexually explicit material” through its service. Lunney replied in a letter dated September 23, 1994 that an impostor had opened the account and sent the message. A month later, in a letter dated October 27, 1994, Prodigy apologized to Lunney and informed him that several other accounts that had been opened under his name had been terminated. On December 22, 1994, Lunney sued Prodigy.

Lunney lost at trial and in two appellate courts. The Court of Appeals, following the New York Supreme Court, likened email to a telephone service and, relying on the New York State common law rule created in Anderson v. New York Telephone Co., held Prodigy not liable for defamation. In Anderson, the telephone company employees had dialed the number and listened to the defamatory messages being played over the telephone through a recording device but they did not take any action. Nevertheless, the company was not held liable. The court in Anderson noted that the telephone company was a passive carrier of the messages. It was unlike the telegraph, in which someone had to write down the messages relayed and therefore could be said to be a direct participant. The court in the Lunney case held that an ISP functioned more like a phone company. This immunity or privilege of a phone company is qualified by the exceptions of bad faith or malice.

The postings on the bulletin board were more problematic because of the different ways they were maintained. But even where the defendant retains more active editorial control, the court held that it would be unreasonable to expect the defendant to patrol the myriad of messages placed on its bulletin boards. The Court of Appeals declined to use Section 230 of the Communications Decency Act. It said, “Given the extraordinarily rapid growth of this technology and its developments, it is plainly unwise to lurch prematurely into emerging issues….”

How should the Lunney case be read? All court cases can be read along two avenues: based on the facts or on the law. On the facts, it would appear that Lunney, as a minor, had very little reputation to be dented. And he suffered little, if any, financial loss. Prodigy did resolve the matter relatively quickly. From that point of view, the case was fairly decided.17 In contrast, in both the Zeran and Godfrey cases, the defamatory postings were left on for more than a week. So on the facts of the case, the Lunney decision is probably right.

Do ISPs have any duty to ensure that innocent people are not defamed or hurt? In principle, it would be unreasonable to expect ISPs to patrol the message boards, even their own. But don't they owe a duty while gaining income, directly or indirectly, from those who are posting defamatory messages? Intuitively, there should be some responsibility if one is selling goods and services: a producer of goods or services who is under no duty of care will simply produce goods or services that are dangerous and harmful.

Treating an ISP as a telephone service is a tricky analogy. All ISPs have some web presence. To grant an ISP immunity from liability simply because it is an ISP would give its websites preferential treatment over other content sites. The issue must lie in the particular function of the ISP in the matter complained of.

These issues were discussed in two more cases that applied the Section 230 immunity for ISPs; in both, the ISP happened to be AOL. In Doe v. America Online,18 the mother of an eleven-year-old boy sued AOL after her son was molested. AOL had failed to remove the account of the child pornographer in question. The court held that AOL was not

17 It may be worth noting that there may be a relationship between the plaintiff and the lawyers for the case. The law firm that represented Lunney was Lunney & Murtagh, and one of the lawyers identified in the appellate court's opinion was J. Robert Lunney. This may explain the tenacity of the case in going through two appeals.

18 718 So. 2d 385 (Fla. Dist. Ct. App. 1998).

liable to the plaintiff for failing to monitor the marketing of child pornography. Given that Section 230 is part of the Communications Decency Act, this is an ironic outcome unlikely to have been foreseen by the US Congress.

In Blumenthal v. Drudge and AOL,19 the court ruled that AOL could not be sued because Section 230 protected it. This was despite the fact that AOL had hired Drudge at US$3,000 a month and he was not receiving income elsewhere. In effect, Drudge appeared to be more of an employee than an independent contractor. The court was clearly uncomfortable with the outcome. It said:

If it were writing on a clean slate, this Court would agree with the plaintiffs. AOL has certain editorial rights with respect to the content provided by Drudge and disseminated by AOL, including the right to require changes in content and to remove it; and it has affirmatively promoted Drudge as a new source of unverified instant gossip on AOL. Yet it takes no responsibility for any damage he may cause. AOL is not a passive conduit like the telephone company, a common carrier with no control and therefore no responsibility for what is said over the telephone wires. Because it has the right to exercise editorial control over those with whom it contracts and whose words it disseminates, it would seem only fair to hold AOL to the liability standards applied to a publisher or, at least, like a book store owner or library, to the liability standards applied to a distributor. But Congress has made a different policy choice by providing immunity even where the interactive service provider has an active, even aggressive role in making available content prepared by others. In some sort of tacit quid pro quo arrangement with the service provider community, Congress has conferred immunity from tort liability as an incentive to Internet service providers to self-police the Internet for obscenity and other

19 US Dist. Ct., DC, Civil Action No. 97-1968.

offensive material, even where the self-policing is unsuccessful or not even attempted.20

In short, the court in Blumenthal concluded that AOL had taken advantage of all the benefits of the Communications Decency Act without accepting any of the burdens, by making rumours and gossip instantly accessible and then claiming immunity when a defamation suit inevitably resulted. The breadth of the scope of Section 230 has been criticized because “it ignores the dual interests of states in protecting the reputation of its domiciliaries and protecting its citizens from the dissemination of false information.”21

ANALYSIS

No country has followed the US approach of granting a blanket civil immunity to content hosts. Section 230 may be perceived as favoring the service provider at the expense of the individual. Far from encouraging any kind of regulation, self or otherwise, it in fact discourages anyone from being a Good Samaritan.22 There are attendant costs in being one but doing good deeds does not confer any additional benefit. In all the cases, although the offending materials were removed, in theory at least, there was no need to. That

20 Ibid. at p. 15.

21 B.J. Waldman, “A Unified Approach to Cyber-Libel: Defamation on the Internet, a Suggested Approach,” Richmond Journal of Law & Technology 9 (1999). http://www.richmond.edu/jolt/vbi2/note1.html (accessed November 29, 2001).

22 In a March 2001 case, the Florida Supreme Court in a 4-3 verdict granted AOL immunity against a civil suit alleging that child pornography was transacted on its site. The minority judges said that to immunize AOL “flies in the face of the very purpose of the Communications Decency Act” and service providers “can very profitably and with total immunity, knowingly allow their customers to operate through their Internet services.” Jackie Hallifax, “AOL Immune from Porn Lawsuit, Florida High Court Rules,” Nandotimes, 2001. http://www.nandotimes.com/technology/story/0,1643,500461435-500703332-503838001-0,00.html (accessed March 12, 2001).

is, the defamatory or copyright-violating material could be left on a US site permanently even after the court case.

On the other hand, Section 230 does not cover illegal materials, so service providers are exposed to criminal liability. The one area where there is immunity from criminal liability is copyright. The US Digital Millennium Copyright Act lays down guidelines on how service providers should act when offending material is brought to their notice. The procedures for taking down and restoring are detailed and obviously add to costs.

Such an outcome suggests that there is a need for the law to encourage ISPs and hosting service companies to take the appropriate action soon after it has been determined that the material is illegal or offensive.

Indeed, as the survey below will reveal, most countries do immunize ISPs and hosting companies from liability until they know of the presence of the offending material under their control. The precise mechanics of removal vary, but the material must be removed within a reasonable period.

European Union

The European Union's Directive on Electronic Commerce of May 2000 immunizes service providers where they play a passive role as a “mere conduit” of information. The service provider must not (1) originate the information, (2) select the receiver of the information, and (3) select or modify the information that is transmitted (Article 12). And upon obtaining knowledge of offending material, the service provider must act “expeditiously” to remove it or disable access to it (Articles 13 and 14). Service providers have no general obligation to monitor content although there could be monitoring in specific cases (Article 15).23

23 See Appendix 5.2 for the recital and relevant articles.

France

France has passed a law clarifying the responsibilities of website hosts. The law was prompted by a case in which nude images of a model were posted by a subscriber on an ISP's website without the model's consent. Even though it was unaware of the posting, the ISP was found liable for privacy violations.24 The Liberty of Communication Act (Loi sur la liberté de communication) establishes a general principle that hosts are not responsible for third-party content. However, they must keep proper records that will allow the author of the content to be identified. Under the law:25

  • ISPs may be made liable if they do not delete content when told to do so by a judge, or if they have failed to undertake the “appropriate diligences” when informed by a third party that they are hosting allegedly illegal content, or content that may cause prejudice to the third party (Article 43-6-2).
  • Access providers and host providers are required to keep track of data, allowing a content provider to be identified. This information may only be provided to a judge (Article 43-6-3).
  • Content providers are required to identify themselves to the public by putting their details on the website. If the provision of content is not a professional activity, the content provider

24 Lefebure v. Lacambre, Tribunal de Grande Instance de Paris, Ref. 55181/98, No. 1/JP (6/9/98). Under French law, an ISP is responsible for the content hosts, and may be liable for violations of privacy. The offending site was shut down, damages of FFr 50,000 were awarded, and the ISP was charged, under the threat of a fine of FFr 100,000 per day for non-compliance, to implement measures “to ensure the impossibility of diffusion of the photos in question from any site offered by the service.” (Perkins Coie LLP, Internet Case Digest. http://www.perkinscoie.com/casedigest/icd_results.cfm?keyword1=international&topic=International (accessed February 10, 2001)). See also “Les hébergeurs du Net sous surveillance,” Le Figaro, July 7, 2000, 10.

25 Translation from Imaginons un Réseau Internet Solidaire (first published June 23, 2000). Declaration of Internet Actors. http://www.iris.sgdg.org/actions/loi-comm/declaration.html (accessed February 10, 2001).

is allowed to restrict identification to the host provider (Article 43-6-4).

Germany26

Germany's Information and Communication Services Act (Informations- und Kommunikationsdienste Gesetz, IuKDG, August 1, 1997),27 also known as the Multimedia Act, divides the various internet players into content service providers, access service providers and hosting service providers. A service provider is only liable for content that it is aware of. So a content provider is likely to be the most liable for content and an access provider the least. Still, an access provider may be ordered to block access if it is “technically able and can reasonably be expected” to do so (Article 1, Section 5(2)).28

It should be noted, however, that although the law sounds reasonable, it was this same law that was used to convict Felix Somm, the head of CompuServe in Germany, for carrying illegal content in its newsgroups. The court ruled that the company should have known that some of the 200-odd discussion groups that it carried must have had illegal content, although it may not have known which particular ones and at what point. To compound matters, CompuServe initially did block some of the groups, suggesting that it was “technically able” to do so. Of course, while blocking was technically possible, doing so for Germany affected the company's operations throughout Europe.

From a legal interpretation standpoint, criminal statutes should be interpreted in such a way as to favor the accused. From a policy

26 For a detailed analysis of the German provisions, see Sieber (1999a). Professor Sieber was on the defense team of Felix Somm, the managing director of CompuServe Europe, when the company was prosecuted by the Bavarian state police in 1998 for failing to block access to child pornography.

27 http://www.iid.de/iukdg/gesetz/iukdge.html (accessed February 11, 2001).

28 See Appendix 5.3 for the relevant article.

standpoint, the words “know” and “technically able” need to be interpreted more liberally so as to favor the operation of the network. In short, the verdict of the court of first instance deserved the wide reprobation it received, as well as its reversal at the Court of Appeal.29

Sweden

In May 1998, the Swedish Parliament passed the Act on Responsibility for Electronic Bulletin Boards (Lag (1998:112) om ansvar för elektroniska anslagstavlor) (the “Act”). The Act obliges service providers to remove obviously illegal or copyright infringing material. In order to fulfill this obligation, the provider must supervise the activities of his subscribers in so far as can reasonably be expected in view of the size and the purpose of the service.30

The Act is unique in that it imposes upon service providers a duty to monitor content “to the extent that can reasonably be required, considering the scope and direction of the operation” (Article 4). Service providers must remove or deny access to obviously illegal postings such as copyright-infringing material, child pornography, racial agitation and other matters illegal under the Swedish Penal Code.

According to a Swedish academic in computing, “[I]f, however, checking every single message is too cumbersome, the provider can handle the supervision through an abuse board, to which users can complain about illegal messages. For areas where illegal contributions are common, the provider of the area, however, must check regularly and remove illegal content. Thus, it is enough to react to complaints in areas where illegal contributions are rare, but this is

29 Mary Lisbeth D'Amico, “Porn Ruling Protects German ISPs,” Industry Standard, 1999. http://www.e-gateway.net/infoarea/news/news.cfm?nid=221 (accessed February 11, 2001).

30 See Appendix 5.4 for the relevant articles.

not enough for areas where illegal contributions are common.”31 This means that in practice, large sites will work through complaints. It is not clear, however, if all sites must show some monitoring activity so as to demonstrate that such work is “too cumbersome.”
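The supervision rule described above can be restated as a simple policy: react to complaints where illegal contributions are rare, but check proactively where they are common. The sketch below is a hypothetical illustration of that distinction only; the function name and the threshold are assumptions for the example, not values drawn from the Act.

```python
# Hypothetical sketch of the Swedish supervision rule: complaint-driven
# handling suffices for low-risk areas, while areas where illegal
# contributions are common require regular proactive checking.
# The threshold is an illustrative assumption, not taken from the Act.

def supervision_mode(illegal_posts_per_week: int, threshold: int = 5) -> str:
    """Return the level of supervision a provider might adopt for an area."""
    if illegal_posts_per_week >= threshold:
        # Illegal contributions are common: the provider must check regularly.
        return "proactive-review"
    # Illegal contributions are rare: reacting to abuse-board complaints
    # is arguably sufficient.
    return "complaint-driven"

print(supervision_mode(1))   # complaint-driven
print(supervision_mode(12))  # proactive-review
```

Note that a real provider would also have to justify, per the Act's "reasonably be required" standard, how it classified each area in the first place.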

Australia

Australia's internet rules have become more complex under the interventionist Liberal Party. The Broadcasting Services Amendment (Online Services) Act 1999 immunizes ISPs from all State and Territory criminal laws, and any rule of common law or equity (but not Commonwealth statutes), that would subject ISPs to liability for content of whose nature they were not aware, or that would require the ISP to monitor, make inquiries about or keep records of internet content hosted or carried (Section 91). The Act also immunizes ISPs and content hosts from civil liability if they comply with an industry-developed code of conduct approved by the Minister (Section 88).

Singapore

Section 10 of Singapore's Electronic Transactions Act, passed in 1998, exempts a “network service provider” from civil and criminal liability, including liability for copyright infringement and defamation.32 The term “network service provider” is not defined in the Act; it is intended to be understood in terms of the functions performed. The section was drafted independently of the US Communications Decency Act, and its exemption from criminal liability makes it more liberal than the US position. As it stands, ISPs have no stated duties of monitoring or even of taking action after they have been put on notice.

31 Jacob Palme, “Swedish Law on Responsibilities for Internet Information Providers,” June 3, 1998. http://dsv.su.se/jpalme/society/swedish-bbs-act.html (accessed February 12, 2001).

32 See Appendix 5.5 for the text of Section 10.

Section 10 has been used as a starting point by India33 and Bermuda34 in drafting their own ecommerce laws. But these two countries have modified the clause so that service providers must act once they have been put on notice.

India

The Indian experience is instructive because the initial draft in 1998 was a word-for-word adoption of the Singapore version.35 Representatives of the Indian Music Industry and the National Association of Software and Service Companies then pressed for the equivalent of Section 10 of the Singapore Act to be deleted.36 The representation was partly successful: in its final form, as passed in May 2000, the section exempts only those service providers who did not know that the offence was committed or who had exercised “all due diligence” to prevent its commission (Section 79).37

Bermuda

Bermuda is the first country in the world to have a ministry of ecommerce. Perhaps because of this focus, it has elaborate rules on handling offending third-party content. The Electronic Transactions Act 1999 uses the more abstract term “intermediary” and defines it as “a person who, on behalf of another person, sends, receives or stores that electronic record or provides other services with respect to that electronic record.”38 This choice of words highlights the

33 Ministry of Commerce and Industry, 1999, Section 48. http://commin.nic.in/doc/ecact11.htm#p48.

34 Ministry of Telecommunications and E-Commerce, 1999, 27.

35 Ministry of Commerce and Industry.

36 Indian Parliamentary Committee, Committee on Science and Technology, Environment and Forests, 79th Report on the Information Technology Bill, May 12, 2000. http://rajyasabha.nic.in/book2/reports/science/79report.html (accessed February 12, 2001).

37 See Appendix 5.6 for the section.

38 See Appendix 5.7.

difficulty of drafting detailed provisions that apply the law to specific, ever-changing commercial situations. The German approach of carving up the industry into content, hosting and access providers helps clarify the law for industry; on the other hand, it assumes that the technology has stabilized.

The Bermudan definition is less specific and is drawn along functional lines: an entity is an intermediary only if it is not the originator of the content. But the phrase “provides other services” seems to suggest that some modification of content may be allowed.39

The Electronic Transactions Act immunizes the intermediary from civil and criminal liability if it had no knowledge of the offending material. If the intermediary knew of the material, it is immune if it follows the procedure laid down in a code of conduct (Section 27). This legislation then empowers the Minister to compel industry to develop such a code, failing which a code would be drawn up by the government and imposed on industry.
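The immunity structure summarized above can be modeled as a short decision rule. The sketch below is illustrative only and not a statement of the law: the function and parameter names are assumptions, and it collapses the Act's separate "actual knowledge" and "awareness of facts or circumstances" limbs into a single flag for simplicity.

```python
# Illustrative model of the immunity structure in Section 27 of Bermuda's
# Electronic Transactions Act 1999, as summarized in the text: an
# intermediary that did not originate the record is immune if it lacks
# knowledge of the offending material, and remains immune after acquiring
# knowledge only if it follows the prescribed code-of-conduct procedure.
# Names are hypothetical; "has_knowledge" conflates the statute's separate
# knowledge and awareness limbs for brevity.

def intermediary_immune(is_originator: bool,
                        has_knowledge: bool,
                        followed_code_procedure: bool) -> bool:
    if is_originator:
        return False  # originators of the record fall outside the immunity
    if not has_knowledge:
        return True   # no knowledge or awareness of the offending material
    # Knowledge acquired: immunity turns on following the Section 28 procedure.
    return followed_code_procedure

print(intermediary_immune(False, False, False))  # True
print(intermediary_immune(False, True, False))   # False
print(intermediary_immune(False, True, True))    # True
```

The third case captures the distinctive Bermudan feature noted in the text: knowledge alone does not strip immunity if the prescribed procedure is followed.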

CONCLUSION

It is instructive to note that a number of countries have placed the immunity provision in their ecommerce laws. This suggests that the provision is essential in order to conduct and encourage ecommerce. The immunity clauses that favor industry the most are to be found in the US and Singapore. In both regimes, immunity exists even if the service provider knew of the offending material. Up to the time of

39 On a more technical note, it also uses the word “person” which is not defined in the Electronic Transactions Act and so should be taken to mean someone who is not a minor.

writing, there have been no Singapore cases that have passed through the courts. In the US, the most recent decisions by the courts suggest that the immunity provision may have gone too far in favoring the service provider. In practice, the US law discourages self-regulation and imposes no responsibility at all on the service provider.

For the rest of the world, the common immunity requirement is that the service provider be an independent third party without knowledge of the offending material; once the offending material is brought to its attention, the immunity begins to break down and the provider is required to take the necessary steps to correct the matter. Such a position, requiring reasonable action by ISPs, is intuitively more appealing and arguably more logical.

A number of laws require that the offending material should be removed “expeditiously” (the EU), with “all due diligence” (India) and “as soon as practicable” (Bermuda). Just what these phrases mean in practice is not determined. This is where industry input would be helpful. The Bermudan provision that gives the option of creating a code40 is interesting because it suggests scope for self-regulation in the industry.

Indeed, industry can play a role in regulating itself. The court cases in France (the model case) and Germany (the CompuServe case) suggest that uninformed courts can make decisions that gravely impact the workings of the net.

The precise mechanics of such a self-regulatory code vary, but they must balance the interests of industry with those of the person who may be affected by the offending material. Such a code is vital as instances of such material have increased. Yahoo handles several thousand copyright notices each quarter. The workload has increased

40 The actual code of conduct (available at http://bibaproject.northrock.bm/documents/legislation/code.pdf [accessed February 13, 2001]), however, is a disappointment. The code deals with generalities and “reasonable” behavior but does not deal with specifics, which is the major concern of industry.

such that the company, in which one person used to handle offending material, has trained three additional staff. They have further support from more than fifteen full-time customer service employees, two full-time lawyers and one paralegal, all of whom handle a range of complaints including alleged copyright infringement as well as allegations of defamation, trademark infringement and other claims.41

Offline laws are already in place to impose civil or criminal liability for content that contains defamation and copyright infringement. An industry code, by automatically minimizing, if not eliminating, liability through certain courses of action, can reduce costs as well. There is scope, therefore, for a self-regulatory code on the matter of third-party postings. It is suggested that a code be drafted on the issue of reasonableness of action for compliance, on the procedures for taking down postings and perhaps a procedure for putting them back as well.
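The take-down and put-back procedures that the text suggests a code should standardize can be sketched as a small state machine. Everything below is a hypothetical illustration of what such a code might specify: the states, function names, and the counter-notice trigger for put-back are assumptions, not provisions of any of the statutes discussed.

```python
# A minimal sketch of a take-down / put-back procedure that a
# self-regulatory code might standardize. States and transitions are
# hypothetical; real codes would add deadlines ("expeditiously",
# "as soon as practicable") and notification duties.

from dataclasses import dataclass

@dataclass
class Posting:
    state: str = "live"  # live -> taken_down -> restored

def receive_notice(p: Posting) -> Posting:
    """On a complaint, remove the posting (the take-down step)."""
    if p.state == "live":
        p.state = "taken_down"
    return p

def receive_counter_notice(p: Posting) -> Posting:
    """If the poster contests the notice, restore it (the put-back step)."""
    if p.state == "taken_down":
        p.state = "restored"
    return p

post = Posting()
receive_notice(post)
print(post.state)  # taken_down
receive_counter_notice(post)
print(post.state)  # restored
```

Even this toy model makes visible the two questions the text says a code must answer: what triggers a take-down, and under what conditions a posting returns.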

41 Oktay and Wrenn, “A Look Back at the Notice.”

APPENDIX 5.1 THE US

Telecommunications Act of 1996 (Pub. L. 104-104, Title V, February 8, 1996) Section 230. Protection for Private Blocking and Screening of Offensive Material42

(a) Findings

The Congress finds the following:

  1. The rapidly developing array of Internet and other interactive computer services available to individual Americans represent an extraordinary advance in the availability of educational and informational resources to our citizens.
  2. These services offer users a great degree of control over the information that they receive, as well as the potential for even greater control in the future as technology develops.
  3. The Internet and other interactive computer services offer a forum for a true diversity of political discourse, unique opportunities for cultural development, and myriad avenues for intellectual activity.
  4. The Internet and other interactive computer services have flourished, to the benefit of all Americans, with a minimum of government regulation.
  5. Increasingly Americans are relying on interactive media for a variety of political, educational, cultural, and entertainment services.

(b) Policy

It is the policy of the United States

  1. to promote the continued development of the Internet and other interactive computer services and other interactive media;

42 http://www.usinfo.state.gov/usa/infousa/laws/majorlaw/s652titl.htm (accessed February 9, 2001).

  2. to preserve the vibrant and competitive free market that presently exists for the Internet and other interactive computer services, unfettered by Federal or State regulation;
  3. to encourage the development of technologies which maximize user control over what information is received by individuals, families, and schools who use the Internet and other interactive computer services;
  4. to remove disincentives for the development and utilization of blocking and filtering technologies that empower parents to restrict their children's access to objectionable or inappropriate online material; and
  5. to ensure vigorous enforcement of Federal criminal laws to deter and punish trafficking in obscenity, stalking, and harassment by means of computer.

(c) Protection for “Good Samaritan” blocking and screening of offensive material

  1. Treatment of publisher or speaker
    No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.
  2. Civil liability
    No provider or user of an interactive computer service shall be held liable on account of —
  • any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or
  • any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in paragraph (1).

(d) Effect on other laws

  1. No effect on criminal law
    Nothing in this section shall be construed to impair the enforcement of section 223 or 231 of this title, chapter 71 (relating to obscenity) or 110 (relating to sexual exploitation of children) of title 18, or any other Federal criminal statute.
  2. No effect on intellectual property law
    Nothing in this section shall be construed to limit or expand any law pertaining to intellectual property.
  3. State law
    Nothing in this section shall be construed to prevent any State from enforcing any State law that is consistent with this section. No cause of action may be brought and no liability may be imposed under any State or local law that is inconsistent with this section.
  4. No effect on communications privacy law
    Nothing in this section shall be construed to limit the application of the Electronic Communications Privacy Act of 1986 or any of the amendments made by such Act, or any similar State law.

(e) Definitions

As used in this section:

  1. Internet
    The term “Internet” means the international computer network of both Federal and non-Federal interoperable packet switched data networks.
  2. Interactive computer service
    The term “interactive computer service” means any information service, system, or access software provider that provides or enables computer access by multiple users to a computer server, including specifically a service or system that provides access to the Internet and such systems operated or services offered by libraries or educational institutions.
  3. Information content provider
    The term “information content provider” means any person or entity that is responsible, in whole or in part, for the creation or development of information provided through the Internet or any other interactive computer service.
  4. Access software provider
    The term “access software provider” means a provider of software (including client or server software), or enabling tools that do any one or more of the following:
  • filter, screen, allow, or disallow content;
  • pick, choose, analyze, or digest content; or
  • transmit, receive, display, forward, cache, search, subset, organize, reorganize, or translate content.

APPENDIX 5.2 EUROPEAN UNION

Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the internal market43 (Directive on Electronic Commerce)

Recitals

(41) This Directive strikes a balance between interests at stake and establishes principles upon which industry agreements and standards can be based.

(42) The exemptions from liability established in this Directive cover only cases where the activity of the information society service provider is limited to the technical process of operating and giving access to a communication network over which information made available by third parties is transmitted or temporarily stored, for the sole purpose of making the transmission more efficient; this activity is of a mere technical, automatic and passive nature, which implies that the information society service provider has neither knowledge of nor control over the information which is transmitted or stored.

(43) A service provider can benefit from the exemptions for ‘mere conduit’ and for ‘caching’ when he is in no way involved with the information transmitted; this requires among other things that he does not modify the information that he transmits; this requirement does not cover manipulations of a technical nature which take place in the course of the transmission as they do not alter the integrity of the information contained in the transmission.

43 http://europa.eu.int/comm/internal_market/en/media/eleccomm/com31en.pdf (accessed February 11, 2001).

(44) A service provider who deliberately collaborates with one of the recipients of his service in order to undertake illegal acts goes beyond the activities of ‘mere conduit’ or ‘caching’ and as a result cannot benefit from the liability exemptions established for these activities.

(45) The limitations of the liability of intermediary service providers established in this Directive do not affect the possibility of injunctions of different kinds; such injunctions can in particular consist of orders by courts or administrative authorities requiring the termination or prevention of any infringement, including the removal of illegal information or the disabling of access to it.

(46) In order to benefit from a limitation of liability, the provider of an information society service, consisting of the storage of information, upon obtaining actual knowledge or awareness of illegal activities has to act expeditiously to remove or to disable access to the information concerned; the removal or disabling of access has to be undertaken in the observance of the principle of freedom of expression and of procedures established for this purpose at national level; this Directive does not affect Member States' possibility of establishing specific requirements which must be fulfilled expeditiously prior to the removal or disabling of information.

(47) Member States are prevented from imposing a monitoring obligation on service providers only with respect to obligations of a general nature; this does not concern monitoring obligations in a specific case and, in particular, does not affect orders by national authorities in accordance with national legislation.

(48) This Directive does not affect the possibility for Member States of requiring service providers, who host information provided by recipients of their service, to apply duties of care, which can

reasonably be expected from them and which are specified by national law, in order to detect and prevent certain types of illegal activities.

(49) Member States and the Commission are to encourage the drawing-up of codes of conduct; this is not to impair the voluntary nature of such codes and the possibility for interested parties of deciding freely whether to adhere to such codes.

Section 4: Liability of intermediary service providers

Article 12

‘Mere conduit’

  1. Where an information society service is provided that consists of the transmission in a communication network of information provided by a recipient of the service, or the provision of access to a communication network, Member States shall ensure that the service provider is not liable for the information transmitted, on condition that the provider:
    • does not initiate the transmission;
    • does not select the receiver of the transmission; and
    • does not select or modify the information contained in the transmission.
  2. The acts of transmission and of provision of access referred to in paragraph 1 include the automatic, intermediate and transient storage of the information transmitted in so far as this takes place for the sole purpose of carrying out the transmission in the communication network, and provided that the information is not stored for any period longer than is reasonably necessary for the transmission.
  3. This Article shall not affect the possibility for a court or administrative authority, in accordance with Member States' legal systems, of requiring the service provider to terminate or prevent an infringement.

Article 13

‘Caching’

  1. Where an information society service is provided that consists of the transmission in a communication network of information provided by a recipient of the service, Member States shall ensure that the service provider is not liable for the automatic, intermediate and temporary storage of that information, performed for the sole purpose of making more efficient the information's onward transmission to other recipients of the service upon their request, on condition that:
    • the provider does not modify the information;
    • the provider complies with conditions on access to the information;
    • the provider complies with rules regarding the updating of the information, specified in a manner widely recognized and used by industry;
    • the provider does not interfere with the lawful use of technology, widely recognized and used by industry, to obtain data on the use of information; and
    • the provider acts expeditiously to remove or to disable access to the information it has stored upon obtaining actual knowledge of the fact that the information at the initial source of the transmission has been removed from the network, or access to it has been disabled, or that a court or an administrative authority has ordered such removal or disablement.
  2. This Article shall not affect the possibility for a court or administrative authority, in accordance with Member States' legal systems, of requiring the service provider to terminate or prevent an infringement.

Article 14

Hosting

  1. Where an information society service is provided that consists of the storage of information provided by a recipient of the service, Member States shall ensure that the service provider is not liable for the information stored at the request of a recipient of the service, on condition that:
    • the provider does not have actual knowledge of illegal activity or information and, as regards claims for damages, is not aware of facts or circumstances from which the illegal activity or information is apparent; or
    • the provider, upon obtaining such knowledge or awareness, acts expeditiously to remove or to disable access to the information.
  2. Paragraph 1 shall not apply when the recipient of the service is acting under the authority or the control of the provider.
  3. This Article shall not affect the possibility for a court or administrative authority, in accordance with Member States' legal systems, of requiring the service provider to terminate or prevent an infringement, nor does it affect the possibility for Member States of establishing procedures governing the removal or disabling of access to information.

Article 15

No general obligation to monitor

  1. Member States shall not impose a general obligation on providers, when providing the services covered by Articles 12, 13 and 14, to monitor the information which they transmit or store, nor a general obligation actively to seek facts or circumstances indicating illegal activity.
  2. Member States may establish obligations for information society service providers promptly to inform the competent public

authorities of alleged illegal activities undertaken or information provided by recipients of their service or obligations to communicate to the competent authorities, at their request, information enabling the identification of recipients of their service with whom they have storage agreements.

APPENDIX 5.3 GERMANY

Information and Communication Services Act44 (Informations- und Kommunikationsdienste-Gesetz - IuKDG), August 1, 1997

Article 1 Section 5: Responsibility

  1. Providers shall be responsible in accordance with general laws for their own content, which they make available for use.
  2. Providers shall not be responsible for any third-party content which they make available for use unless they have knowledge of such content and are technically able and can reasonably be expected to block the use of such content.
  3. Providers shall not be responsible for any third-party content to which they only provide access. The automatic and temporary storage of third-party content due to user request shall be considered as providing access.
  4. The obligations in accordance with general laws to block the use of illegal content shall remain unaffected if the provider obtains knowledge of such content while complying with telecommunications secrecy under §85 of the Telecommunications Act (Telekommunikationsgesetz) and if blocking is technically feasible and can reasonably be expected.

44 http://www.iid.de/iukdg/gesetz/iukdge.html (accessed February 11, 2001).

APPENDIX 5.4 SWEDEN

Act on Responsibility for Electronic Bulletin Boards45 (Lag (1998:112) om ansvar för elektroniska anslagstavlor), passed May 1998

Section 4

Supervision of the service

The supplier of electronic bulletin board shall, in order to be able to fulfill the obligations according to Section 5, supervise the service to an extent which is reasonable considering the extent and objective of the service.

Section 5

Obligation to remove certain messages

If a user submits a message to an electronic bulletin board, the supplier must remove the message, or in other ways make it inaccessible, if

  1. the message content is obviously such as is referred to in the Penal Code, Chapter 16, Section 5, about instigation of rebellion, Chapter 16, Section 8 about racial agitation, Chapter 16, Section 10a about child pornography, Chapter 16, Section 10b about illegal description of violence, or
  2. it is obvious that the user has, by submitting the message, infringed on the copyright or other right protected by Chapter 5 in the law about copyright to literary and artistic work.

In order to be able to fulfill the obligation according to the first and second clause above, the supplier is allowed to check the content of message in the service.

These obligations and rights also apply to those who have been given the task, by the supplier, to supervise the service.

45 Jacob Palme, “Swedish Law on Responsibilities for Internet Information Providers,” June 3, 1998. Translation: http://dsv.su.se/jpalme/society/swedish-bbs-act.html (accessed February 12, 2001).

APPENDIX 5.5 SINGAPORE

Electronic Transactions Act 199846

Liability of Network Service Providers

10 (1) A network service provider shall not be subject to any civil or criminal liability under any rule of law in respect of third-party material in the form of electronic records to which he merely provides access if such liability is founded on —

  • the making, publication, dissemination or distribution of such materials or any statement made in such material; or
  • the infringement of any rights subsisting in or in relation to such material.

(2) Nothing in this section shall affect —

  • any obligation founded on contract;
  • the obligation of a network service provider as such under a licensing or other regulatory regime established under any written law; or
  • any obligation imposed under any written law or by a court to remove, block or deny access to any material.

(3) For the purposes of this section —

  • “provides access”, in relation to third-party material, means the provision of the necessary technical means by which third-party material may be accessed and includes the automatic and temporary storage of the third-party material for the purpose of providing access;
  • “third-party”, in relation to a network service provider, means a person over whom the provider has no effective control.

46 http://www.cca.gov.sg/eta/index.html (accessed February 10, 2001) (changed).

APPENDIX 5.6 INDIA

Information Technology Act 200047

79. Network service providers not to be liable in certain cases

For the removal of doubts, it is hereby declared that no person providing any service as a network service provider shall be liable under this Act, rules or regulations made thereunder for any third-party information or data made available by him if he proves that the offence or contravention was committed without his knowledge or that he had exercised all due diligence to prevent the commission of such offence or contravention.

Explanation. — For the purposes of this section,

  • “network service provider” means an intermediary;
  • “third-party information” means any information dealt with by a network service provider in his capacity as an intermediary;

47 http://www.mit.gov.in/itbillonline/itbill2000.htm (accessed February 10, 2001).

APPENDIX 5.7 BERMUDA

Electronic Commerce Act48

Liability of Intermediaries

27 (1) An intermediary is not subject to any civil or criminal liability in respect of any information contained in an electronic record in respect of which the intermediary provides services, if the intermediary was not the originator of that electronic record and—

  • has no actual knowledge that the information gives rise to civil or criminal liability;
  • is not aware of any facts or circumstances from which the likelihood of civil or criminal liability in respect of the information ought reasonably to have been known; or
  • follows the procedure set out in section 28 if the intermediary —
    • acquires knowledge that the information gives rise to civil or criminal liability; or
    • becomes aware of facts or circumstances from which the likelihood of civil or criminal liability in respect of the information ought reasonably to have been known.

(2) An intermediary is not required to monitor any information contained in an electronic record in respect of which the intermediary provides services in order to establish knowledge of, or to become aware of, facts or circumstances to determine whether or not the information gives rise to civil or criminal liability.

48 http://www.mtec.bm/PDFs/ecommerce/ecommerce_act.pdf (accessed February 10, 2001) (changed).

(3) Nothing in this section relieves an intermediary from complying with any court order, injunction, writ, Ministerial direction, regulatory requirement, or contractual obligation in respect of an electronic record.

Procedure for Dealing with Unlawful, Defamatory, etc. Information

28 (1) If an intermediary has actual knowledge that the information in an electronic record gives rise to civil or criminal liability, as soon as practicable the intermediary shall —

  • remove the information from any information processing system within the intermediary's control and cease to provide or offer to provide services in respect of that information; and
  • notify the Minister or appropriate law enforcement agency of the relevant facts and of the identity of the person for whom the intermediary was supplying services in respect of the information, if the identity of that person is known to the intermediary.

(2) If an intermediary is aware of facts or circumstances from which the likelihood of civil or criminal liability in respect of the information in an electronic record ought reasonably to have been known as soon as practicable the intermediary shall —

  • follow the relevant procedure set out in a code of conduct approved or standard appointed under section 29 if such code or standard applies to the intermediary; or
  • notify the Minister.

(3) If the Minister is notified in respect of any information under subsection (2), the Minister may direct the intermediary to —

  • remove the electronic record from any information processing system within the control of the intermediary;
  • cease to provide services to the person to whom the intermediary was supplying services in respect of that electronic record; and
  • cease to provide services in respect of that electronic record.

(4) An intermediary is not liable, whether in contract, tort, under statute or pursuant to any other right, to any person, including any person on whose behalf the intermediary provides services in respect of information in an electronic record, for any action the intermediary takes in good faith in exercise of the powers conferred by, or as directed by the Minister under, this section.

Codes of Conduct and Standards for Intermediaries and E-Commerce Service Providers

29 (1) If a code of conduct is approved or a standard is appointed by the Minister under this section to apply to intermediaries or e-commerce service providers, those intermediaries or e-commerce service providers must comply with the code of conduct or standard.

(2) An intermediary or e-commerce service provider who fails to comply with an approved code of conduct or appointed standard, shall in the first instance be given a written warning by the Minister and the Minister may direct that person to cease and desist or otherwise to correct his practices, and, if that person fails to do so within such period as may be specified in the direction, he shall be guilty of an offence and be liable on summary conviction to a fine of $5,000 for each day on which the contravention continues.

(3) If the Minister is satisfied that a body or organization represents intermediaries or e-commerce service providers, the Minister may, by notice given to the body or organization, request that body or organization to —

  • develop a code of conduct that applies to intermediaries or e-commerce service providers and that deals with one or more specified matters relating to the provision of services by those intermediaries or e-commerce service providers; and
  • provide a copy of that code of conduct to the Minister within such time as may be specified in the request.

(4) If the Minister is satisfied with the code of conduct provided under subsection (3), the Minister shall approve the code of conduct by notice published in the Gazette and thereupon the code of conduct applies to intermediaries or e-commerce service providers as may be specified in the notice.

(5) If the Minister is satisfied that —

  • no body or organization represents intermediaries or e-commerce service providers; or
  • a body or organization to which notice is given under subsection (3) has not complied with the request of the Minister under that subsection,

the Minister may, by notice published in the Gazette, appoint a standard that applies to intermediaries or e-commerce service providers.

(6) If the Minister has approved a code of conduct or appointed a standard that applies to intermediaries or e-commerce service providers and —

  • the Minister receives notice from a body or organization representing intermediaries or e-commerce service providers of proposals to amend the code of conduct or standard; or
  • the Minister no longer considers that the code of conduct or standard is appropriate,

the Minister may, by notice published in the Gazette, revoke or amend any existing code of conduct or standard.

(7) A code of conduct approved or standard appointed under this section may relate to one or more of the following matters —

  • the types of services and of customers that are permitted to be provided services by intermediaries;
  • the types of information permitted to be contained in electronic records for which services are provided by intermediaries;
  • the contractual application of relevant codes of conduct or standards to customers of intermediaries and e-commerce service providers;
  • information to be disclosed by intermediaries and e-commerce service providers including name, address, email address and contact and registration details;
  • the use of a quality accreditation mark associated with Bermuda;
  • the actions to be taken in the event of customers of intermediaries or e-commerce service providers sending bulk, unsolicited electronic records;
  • business activities prohibited by the Tenth Schedule to the Companies Act 1981;
  • publication of material that contravenes the Obscene Publications Act 1973 or the Criminal Code Act 1907;
  • procedures for dealing with complaints;
  • procedures for dispute resolution, including dispute resolution by electronic means;
  • such other matters as the Minister may require.

(8) References in this section to intermediaries and e-commerce service providers include reference to a particular class of intermediary and a particular class of e-commerce service provider respectively.