
Privacy Regulation on the Internet1

INTRODUCTION
THE ISSUES
MODES OF REGULATION
RECOMMENDATIONS

INTRODUCTION

There is no doubt that privacy concerns are uppermost in the minds of users and potential users transacting over the internet. Study after study has borne this out.2 A poll conducted in 1998 found that those who were not likely to access the net cited privacy concerns more than costs, difficulty of use, security of financial transactions or control over unwanted messages as the reason for their non-access.3 The fear is understandable.

1 A version of this chapter appeared in Peng Hwa Ang, “The Role of Self-Regulation of Privacy on the Internet,” Journal of Interactive Advertising 1(2) (2001). http://www.jiad.org/vol1/no2/ang/ (accessed January 2, 2002).

2 For a fairly up-to-date list of surveys, see the “Privacy” sub-directory of NUA Internet Surveys at http://www.nua.ie/surveys/ and TRUSTe's “How Does Online Privacy Impact Your Bottom Line?” available at http://www.truste.org/webpublishers/pub_bottom.html.

3 Alan Westin and Danielle Maurici, “E-Commerce and Privacy: What Net Users Want. Privacy and American Business and Price Waterhouse,” in None of Your Business: World Data Flows, Electronic Commerce, and the European Privacy Directive, eds., P. Swire and Robert Litan, 80 (Washington, DC: The Brookings Institution, 1998).

That same year, the US Federal Trade Commission found that while “almost all Web sites (ninety-two percent of the comprehensive random sample) were collecting great amounts of personal information from consumers, few (fourteen percent) disclosed anything at all about their information practices.”4

Two years later, in 2000, a study by the UCLA Center for Communication Policy found that sixty-four percent of internet users “strongly agreed” or “agreed” that logging onto the internet puts their privacy at risk. Of all the internet issues explored by the study entitled “Surveying the Digital Future,” it was said, “Privacy raises the greatest concern.”5

The concern is universal. A 2000 poll in South Korea found that more than ninety-five percent of net users were concerned about “possible leakage of their personal information.”6

In short, privacy concerns do have some impact on the use of the net. There is, therefore, an incentive for internet sites to adopt privacy standards that would ease the concerns of visitors. However, if owners of websites are truly aware of such concerns and the need for standards, why has it been so difficult for standards to emerge? Therein lies a tale fraught with tension. Tensions exist not just between consumers and business but even among those who agree on the need for privacy regulation yet differ as to its mode and extent.

This chapter presents the issues and gives an overview of where privacy rules are headed. The focus will be on privacy as defined in the US and the EU. The reason is that the EU data protection directive has been the dominant driving force behind privacy legislation.

4 US Federal Trade Commission, “Privacy Online. A Report to Congress,” 1998. http://www.ftc.gov/reports/privacy3/privacyonlinetestimony.pdf (accessed January 31, 2001).

5 Jeff Cole, “Surveying the Digital Future,” The UCLA Internet Report, UCLA Center for Communication Policy, 2000, p. 32. http://ccp.ucla.edu/UCLA-Internet-Report-2000.pdf (accessed December 31, 2004).

6 J.H. Lee, “95.7 Percent of Netizens Fear Personal Information Leakage,” Korea Herald, January 22, 2001. http://www.koreaherald.co.kr/SITE/data/html_dir/2001/01/22/200101220038.asp (accessed January 3, 2001).

The US reaction is an example of the shape a non-legislative approach to privacy regulation may take. Given the present state of legal uncertainty and the lack of action by legislators, policy-makers and industry, the chapter concludes with some suggestions as to what site owners need to do.

THE ISSUES

What is Privacy?

Broadly speaking, an individual's information privacy is the right to determine when, how and to what extent information about the person is communicated to others.7 Like virtually all other rights, the right to privacy is not absolute. A person with total privacy would have to be so isolated as to cease to exist socially. Any contact with the world, from a magazine subscription to a telephone line to a bank account, exposes him or her to possible privacy violation. The gain from the loss of some privacy is convenience. In theory, on the internet, a website that is customized to the user's preferences is easier to surf. In the offline world, a user registered with a marketing company would receive marketing material of particular interest and use. In official dealings, a citizen would find it easier to fill in forms and have documents verified.

However, in practice, the scenario is less than ideal. A person without any privacy protection would be wholly vulnerable to being harassed by the world at large; companies simply send out a flood of often untargeted mail, leaving the customer to wade through all kinds of commercial junk before coming to truly useful mail.

7 Alan Westin, Privacy and Freedom (New York: Atheneum, 1967).

He or she would receive numerous phone calls and emails. It would be arduous to live in the modern world without some form of privacy.

Given this spectrum of what could reasonably be expected from privacy, the question is then: At which point should the right of privacy be set? It is at this crucial point that a workable definition becomes difficult to obtain. Alan Westin, a noted scholar of privacy, is reported to have said that it is impossible to define privacy because privacy issues are fundamentally matters of values, interests and power.8

The two major continents that have initiated legislation on privacy are Europe and North America. The most comprehensive privacy rules have been instituted in the EU. Europeans saw for themselves during World War II how governments can use data against their citizens. It was partly the abuse of such data that enabled the Nazis to send six million Jews to the death camps. In contrast, privacy rules in the US are not comprehensive in scope. The ensuing discussion delves into the differences in approach, differences that affect the rest of the world.

American v. European Notions of Privacy

There are several major differences between the American and the European approaches. The first is that in the EU, the protection of personal data and privacy is clearly stated under Article 8 of the European Convention on Human Rights (1950) (the “European Convention”):

  1. Everyone has the right to respect for his private and family life, his home and his correspondence.

8 R. Gellman, “Does Privacy Law Work?” in Technology and Privacy: The New Landscape, eds., P. Agre and M. Rotenberg, 193-218 (Cambridge, MA: MIT Press, 1997).

  2. There shall be no interference by a public authority with the exercise of this right except such as is in accordance with the law and is necessary in a democratic society in the interests of national security, public safety or the economic well-being of the country, for the prevention of disorder or crime, for the protection of health or morals, or for the protection of the rights and freedoms of others.

To be sure, Article 12 of the Universal Declaration of Human Rights (1948) (the “Universal Declaration”), which precedes the European Convention, has similar words:

No one shall be subjected to arbitrary interference with his privacy, family, home or correspondence, nor to attacks upon his honor and reputation. Everyone has the right to the protection of the law against such interference or attacks.

In fact, similar phraseology in the two clauses suggests that the European Convention took its inspiration from the Universal Declaration. However, the legal force of the European Convention is much stronger. The Universal Declaration might be seen as an “idealistic ideal”—there is no enforcement mechanism beyond diplomatic and international pressure; the European Convention sets out a “realistic ideal” by backing the articles with the enforcement mechanism of the European Court of Human Rights.

Under the European Convention, classification as a fundamental human right entails obligations over and above those of local legislation. A law that violates the European Convention can be struck down, and the government forced to enact new law to reflect the European Convention or to pay damages to the injured victim. And a fundamental human right cannot be negotiated away.

The US does value privacy but the right to privacy has been read by the courts as implied. There is no mention of the word “privacy” or the phrase “protection of privacy” in the US Constitution. US Supreme Court Justice Louis Brandeis, in a dissenting judgment, characterized privacy as “the most comprehensive of rights, and the right most valued by civilized men.”9 Justice Brandeis was a dissenting voice in that case, Olmstead v. US, but he was proven right when Olmstead was overturned in Katz v. US.10 And in 1965, Griswold v. Connecticut11 recognized a limited right to privacy as a constitutional guarantee. The Court found the right to privacy implied in the Constitution, specifically in the Fourth, Ninth and Tenth Amendments.

The Fourth Amendment states that “(t)he right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no warrants shall issue, but upon probable cause, supported by oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.” The Ninth Amendment states that “(t)he enumeration in the Constitution, of certain rights, shall not be construed to deny or disparage others retained by the people.” Without this amendment, under a rule of legal interpretation, the listing of rights could be read to imply that the enumerated list constituted all the rights. The Tenth Amendment states that “(p)owers not delegated to the United States by the Constitution, nor prohibited by it to the states, are reserved to the states respectively or to the people.”

So while both the European and US approaches do enshrine privacy as a fundamental human right, the US treatment of privacy as an implied right creates ambiguity, which contributes to the problem of regulating privacy. The European approach creates stronger privacy protection for another reason: guaranteeing a fundamental human right requires the force of law. In the US, there are privacy laws but these are “fragmented, incomplete and discontinuous.”12

9 Olmstead v. US 277 US 478 (1928).

10 389 US 347 (1967).

11 381 US 479 (1965).

12 Gellman, “Does Privacy Law Work?”

Among the Acts in the US that affect privacy are:

  • Bank Secrecy Act
  • Cable TV Privacy Act
  • Children's Online Privacy Protection Act
  • Consumer Credit Reporting Reform Act
  • Driver's Privacy Protection Act
  • Electronic Communications Privacy Act
  • Electronic Funds Transfer Act
  • Electronic Signatures in Global and National Commerce Act
  • Fair Credit Reporting Act
  • Family Educational Rights and Privacy Act
  • Financial Services Modernization Act
  • Freedom of Information Act
  • Privacy Act
  • Right to Financial Privacy Act
  • Social Security Number Confidentiality Act
  • Telecommunications Act
  • Telemarketing and Consumer Fraud and Abuse Prevention Act
  • Video Privacy Protection Act

The list demonstrates just how fragmented the laws are: the website Privacy Exchange,13 which is touted as an authority on privacy, did not list all of the Acts above. Technically, therefore, the right to privacy in the US is weaker than that in Europe, although more recent data suggest that this may not be the case.14

The laws that have been passed in the US are sectoral, generally aimed at the public sector, or else targeted at preventing specific abuses, usually after they have occurred. The Driver's Privacy Protection Act, for example, was passed after an actress was stalked and killed when the details of her driver's license were disclosed.

13 http://www.privacyexchange.org

14 Consumers International, “Privacy@Net: An International Comparative Study of Consumer Privacy on the Internet.” http://www.consumersinternational.org/news/pressreleases/fprivreport.pdf (accessed January 31, 2001).

Apart from a few sectoral Acts, the private sector is governed by voluntary self-regulatory codes.

Such industry self-regulation is, by definition, not comprehensive. The regulations, after all, affect only the industry that adopts them. Such a sector-specific approach, when applied to an all-embracing concept such as privacy, inevitably creates gaps in the law, that is, areas to which privacy rules do not apply. This is an unsatisfactory result of self-regulation.

It should be noted that the presence of law does not exclude self-regulation. Indeed, European industry has many self-regulatory privacy codes.15 But to be meaningful, these codes must cover areas not accounted for by law or set a higher degree of protection.

MODES OF REGULATION

So which mode of regulation is better? There is a gulf between government regulation and self-regulation. The argument for direct government legislation is that it increases consumer confidence and, therefore, increases commerce. But those supporting less regulation argue that regulation interferes with the workings of the free market. Experience shows that some rules do increase commerce without such a negative effect. So which rules do enhance business? The general conclusion is that “clear definition and assignment of rights, efficient rules for making and enforcing contracts, and good laws that reduce the need for consumers to protect themselves will all help commerce. Privacy laws are seen as falling into the third category of reducing the need for consumer self-protection.”16

15 J. Mogg, “Data Protection and Privacy. Internal Market and Financial Services, the European Commission” (lecture, European-American Business Council, March 18, 1998). http://www.eurunion.org/news/speeches/1998/980318.htm (accessed January 31, 2001).


What is the right mix of approaches—market mechanism, technology, industry self-regulation or government regulation—that would guarantee maximum privacy?17 There are shortcomings in each of the approaches.

Market Mechanism and Technology

The market mechanism approach is premised on the notion of privacy as a negotiable item, such as property, rather than an inalienable right. A willing consumer trades away some aspect of privacy as part of the transaction for goods or services. Not everyone agrees with this, and for good reason: if privacy is a fundamental human right, it is not the consumer but those who collect and use the data who bear the responsibility for maintaining privacy.

Nevertheless, a system has been built around the notion of privacy as a negotiable right: the Platform for Privacy Preferences, or P3P. It is backed by the Online Privacy Alliance, a consortium of almost fifty US organizations including the US White House, Microsoft, America Online and the Center for Democracy and Technology.18

P3P uses a protocol developed by the World Wide Web Consortium (W3C) called the Platform for Internet Content Selection (PICS).19

16 P. Swire and Robert Litan, None of Your Business: World Data Flows, Electronic Commerce, and the European Privacy Directive, 86 (Washington, DC: The Brookings Institution, 1998).

17 P. Swire, “Privacy and Self-Regulation in the Information Age,” 1997, US Department of Commerce. http://www.osu.edu/units/law/swire.htm (accessed January 31, 2001).

18 W3C, “P3P 1.0: A New Standard in Online Privacy,” June 16, 2003. http://www.w3.org/P3P/brochure.html (accessed January 11, 2005).

19 S. Garfinkel, “Can a Labeling System Protect Your Privacy?” Salon, July 11, 2000. http://www.salon.com/tech/col/garf/2000/07/11/p3p/index.html (accessed January 31, 2001).

In its conception, PICS was designed to carry labels that describe a site's content to users. A pornography site, for example, would carry a label declaring that it had adult material, and users who had set their browsers to screen out pornography would be blocked from it.

Under the PICS protocol, the website owner would set the privacy level of the site. The user would likewise indicate a privacy preference in the web browser. If the level of privacy on the website matches that set in the browser, the user accesses the site transparently. But if the privacy on the site is set at a lower level than the user prefers, a window pops up and asks if the user is prepared to accept some loss of privacy in exchange for the service. In theory, this is an appealing concept as it bypasses a thorny problem regarding privacy, that is, the question of definition, which varies between individuals. P3P would, therefore, allow the buyer and seller to negotiate an agreeable level of privacy.
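In code, the matching step might look like the minimal sketch below. It illustrates only the negotiation logic just described; the level names and functions are assumptions made for the example, not the actual P3P vocabulary or API.

```python
# Illustrative sketch of P3P-style preference matching; the level names
# are hypothetical, not the real P3P vocabulary.

LEVELS = ["none", "minimal", "standard", "strict"]  # weakest to strongest

def site_meets_preference(site_policy: str, user_preference: str) -> bool:
    """True if the site promises at least as much privacy as the user wants."""
    return LEVELS.index(site_policy) >= LEVELS.index(user_preference)

def visit(site_policy: str, user_preference: str) -> str:
    if site_meets_preference(site_policy, user_preference):
        return "access site transparently"
    # Mismatch: the browser interrupts and lets the user decide
    # whether to trade some privacy for the service.
    return "prompt user: accept weaker privacy for this service?"

print(visit("standard", "minimal"))  # transparent access
print(visit("minimal", "strict"))    # pop-up prompt
```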

There has been much criticism of P3P as a means of securing privacy protection for the consumer,20 chiefly that it does not provide for sanctions. It is, in fact, a self-management system rather than a self-regulatory system; that is, the system assumes that the company will abide by the rules it has negotiated with the customer. It is a basic tenet in developing a code of practice that it should not pretend to deliver more than it does, because when the truth emerges, the consumer is left in a spiteful mood.21 It is here that P3P falls gravely short. Although it does not aim to mislead, its very nature suggests to the user that privacy will be protected on a site that uses the technology when in fact the protection is very skimpy.

20 R. Clarke, “Platform for Privacy Preferences: A Critique,” Privacy Law & Policy Reporter 5(3) (1998): 46-48. http://www.anu.edu.au/people/Roger.Clarke/DV/P3PCrit.html (accessed January 31, 2001); K. Coyle, “P3P: Pretty Poor Privacy? A Social Analysis of the Platform for Privacy Preferences,” June 1999. http://www.kcoyle.net/p3p.html (accessed January 31, 2001); Electronic Privacy Information Center, “Pretty Poor Privacy: An Assessment of P3P and Internet Privacy,” 2000. http://www.epic.org/reports/prettypoorprivacy.html (accessed January 31, 2001).

21 A. Falk, E. Fehr and U. Fischbacher, “Informal Sanctions” (Working Paper Series, Paper 35, Institute for Empirical Research in Economics, University of Zurich, September 2000). http://papers.ssrn.com/sol3/papers.cfm?abstract_id=245568 (accessed January 31, 2001).

Indeed, this is the major criticism of P3P by the EU: it could mislead even its users, in this case website owners, into believing that they have discharged their legal obligation of privacy protection.22

However, even a supporter of P3P, the Center for Democracy and Technology, has stated in a report that “P3P cannot protect the privacy of users in jurisdictions with insufficient data privacy laws” and “cannot ensure that companies follow privacy policies.”23 What P3P then hopes to achieve is accountability through transparency. The technology makes it easier to locate and compare privacy policies. The ease of comparison would then drive the industry towards standardization.

The argument parallels that of car reviews: as reviewers raise the same issues of acceleration, safety and space, car manufacturers feel compelled to improve in those areas in order to get a good review for their products. Unlike cars, however, it is difficult to tell whether a site owner has complied with its self-made rules. Just as violators are not visibly sanctioned, rule-abiders are not visibly rewarded either.

This lack of reward may explain the low number of websites using P3P. According to the P3P website, the number of sites using it as at May 17, 2004 was 876.24 Regardless of how its start date is calculated, that is a very low acceptance rate.

What would happen if there were a critical mass of sites using P3P? The author is of the view that there would still be uncertainty as to the quality of privacy protection.

22 P.J. Hustinx, “Platform for Privacy Preferences (P3P) and the Open Profiling Standard (OPS): Draft Opinion of the Working Party,” June 16, 1998. http://www.europa.eu.int/comm/internal_market/en/media/dataprot/wpdocs/wp11en.htm (accessed January 31, 2001).

23 D. Mulligan, “P3P and Privacy: An Update for the Privacy Community,” March 28, 2000, Center for Democracy and Technology. http://www.cdt.org/privacy/pet/p3pprivacy.html (accessed January 31, 2001).

24 W3C, “Web Sites Using P3P,” W3C, May 17, 2004. http://www.w3.org/P3P/compliant_sites (accessed December 31, 2004).

Where there is quality uncertainty, Akerlof's “market for lemons”25 could emerge, with lemons driving out high-quality producers and, at worst, destroying the market.

Here is how Akerlof's analysis would apply to privacy on the internet. A website's privacy practices are marked by asymmetric information: consumers cannot judge their quality. Because consumers cannot observe the quality of the service, they will not accept paying a price higher than “normal.” Producers of high-quality services, such as a good standard of privacy protection, will, therefore, not be willing to sell that service at the “normal” price. Akerlof concludes that asymmetric information reduces (1) the volume of transactions and (2) the average quality of goods and services exchanged. Such a situation, in which high-quality goods and services are penalized, is known as “adverse selection.”

As a result of asymmetric information, information providers are not able to signal quality differences to consumers. Merchants of low-quality goods and services are not punished by the market because the market is unaware that these are low-quality products. The market price will reflect only the average quality of the product and therefore attract only average-quality sellers. This leads to a consumer perception of a drop in average quality, which leads to a further reduction of market value and another round of lowering of average quality. The process is cumulative and, in the extreme, destroys the market altogether. In sum, for the market mechanism, or indeed any regulatory mechanism, to be credible, there must be some visible sanctioning of wrongdoers.
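The unraveling can be made concrete with a stylized version of Akerlof's argument; the uniform distribution and the iteration below are illustrative assumptions, not figures from Akerlof's paper. Suppose the quality $q$ of a site's privacy protection is uniformly distributed on $[0,1]$ and known only to the seller. A buyer who cannot observe quality will pay at most the average quality on offer. At any price $p$, only sellers with $q \le p$ find it worthwhile to stay in the market, so the expected quality on offer is

$$\mathbb{E}[\,q \mid q \le p\,] = \frac{p}{2},$$

and rational buyers revise their offer downward to $p/2$. Iterating $p_{t+1} = p_t/2$ drives the price, and with it the quality traded, towards zero: the market unravels.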

As long as self-regulation is accepted as a possible means of protecting privacy, there will be continuing efforts to use technology to address the issue, especially in the online world.

25 George A. Akerlof, “The Market for ‘Lemons’: Quality Uncertainty and the Market Mechanism,” Quarterly Journal of Economics 84 (1970): 488-500.

26 http://www.privada.net

Some of these efforts are quite ingenious. Two now-defunct companies that attempted to minimize the privacy issue through anonymity and disguise were Privada and Zero Knowledge Systems. Privada26 provided a system through which users could conduct all their internet transactions, including email, web browsing, online chat and ecommerce, anonymously. Zero Knowledge Systems27 used sophisticated encryption software called “Freedom” to disguise the identity of an internet user behind up to five pseudonyms. Both methods assumed that the companies providing the software would themselves be ethically rigorous in protecting their users’ privacy.

In 2001, a Dutch-led consortium announced that it had been awarded a three-year €3.2 million (about US$3 million then) contract from the European Union and the Netherlands’ Ministry of Economic Affairs to create a Privacy Incorporated Software Agent (PISA) that would meet the requirements of the EU data protection directives. PISA aims to protect the personal information of users when they use intelligent agents that gather information and transact over the internet on their behalf.28 This is one step removed from someone directly giving away information to another party. At the conclusion of the project, a prototype of PISA's privacy enhancing technology (PET) architecture and a handbook were made available at the website http://www.pet-pisa.nl/.

Such well-intentioned efforts to create products for the global internet community almost invariably require publicity and outreach campaigns to inform and educate the community about the product. To be sure, there are exceptions: software such as Napster, Kazaa and Skype has been downloaded and used by millions with hardly any advertising. But PISA has kept a very low profile and does not count among them.

27 http://www.freedom.net

28 TNO-FEL, “Fast and Safe Internet Work with PISA,” January 17, 2001. http://www.tno.nl/instit/fel/pisa/press_release_start_pisa_17012001.html (accessed January 31, 2001).

The 26th International Conference on Privacy and Personal Data Protection, held in September 2004, did not mention PISA. This may be due in part to PISA asking questions ahead of the rest of the internet community, which is still focused on personal data itself rather than on the representation of personal data in an intelligent agent.

The use of technology for privacy protection will continue. The Ontario Information and Privacy Commission is leading a Privacy Enhancing Technologies Testing and Evaluation Project (PETTEP) to develop standards along the lines of those of the ISO (International Organization for Standardization).29 It will not be possible, however, to rely on PET alone for privacy protection. At best, such technologies will play a role supplementary to the law.30

The use of technology to aid privacy, however, is a ding-dong battle. In early 2001, there were reports of “web bugs,” small graphic files that can send information about a website visitor to a separate server.31 Undoubtedly, both new rules and new technology will emerge to disclose or defeat these web bugs.
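To make the mechanism concrete, here is a minimal sketch of how a web bug works: a page embeds an invisible one-pixel image hosted on a tracker's server, and the mere request for that image delivers the visitor's details. The framework (Flask), the endpoint name and the query parameter are illustrative assumptions, not taken from the reports cited above.

```python
# Hypothetical sketch of a web-bug tracker. A page embeds:
#   <img src="http://tracker.example/pixel.gif?page=checkout" width="1" height="1">
# and the request for the image leaks the visitor's details to the tracker.
from flask import Flask, Response, request

app = Flask(__name__)

# A 1x1 transparent GIF, the classic invisible tracking image.
PIXEL = (b"GIF89a\x01\x00\x01\x00\x80\x00\x00\x00\x00\x00\xff\xff\xff"
         b"!\xf9\x04\x01\x00\x00\x00\x00,\x00\x00\x00\x00\x01\x00\x01"
         b"\x00\x00\x02\x02D\x01\x00;")

@app.route("/pixel.gif")
def pixel() -> Response:
    # The request alone reveals the visitor's IP address, browser,
    # the page that embedded the bug, and any cookie the tracker has set.
    app.logger.info("visit: ip=%s ua=%s page=%s",
                    request.remote_addr,
                    request.headers.get("User-Agent"),
                    request.args.get("page"))
    return Response(PIXEL, mimetype="image/gif")

if __name__ == "__main__":
    app.run(port=8080)  # embed the <img> tag on any page to see the log line
```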

Industry Self-Regulation v. Government Legislation

In the meantime, however, one is left with the two main modes of privacy protection. To be sure, there are advantages to self-regulation. For a fast-moving industry such as the internet, the less formal processes of self-regulation make regulation more flexible and, therefore, less likely to stifle innovation or excessively limit consumer choice.

29 Mike Gurski, “PETTEP History and Future: Making the ISO Connection,” Information & Privacy Commissioner/Ontario. n.d. http://www.ipc.on.ca/docs/PPPP048.ppt (accessed December 31, 2004).

30 H. Burkert, “Privacy-Enhancing Technologies: Typology, Critique, Vision,” in Technology and Privacy: The New Landscape, eds., P. Agre and M. Rotenberg, 136 (Cambridge, MA: MIT Press, 1997).

31 Richard Smith, “FAQ: Web Bugs,” n.d. http://www.privacyfoundation.org/education/webbug.html (accessed October 8, 2001).

Second, in such a technical business, it is industry that is best placed to guarantee quality, to ensure the efficacy of potential courses of action and to access the information it needs for action at the lowest cost.

Third, because industry bears the costs of regulation, it has incentives to keep enforcement and compliance costs down.

But there are major disadvantages to self-regulation. Self-regulation is perceived as never going against the interests of the (self-)regulator. Having been involved in a self-regulatory regime, in this case advertising, the author can state that this is not necessarily so: the author has voted in favor of business just as business representatives have voted against their own interests in favor of the consumer. Nevertheless, the perception is difficult to shake.

The most significant disadvantage is the lack of incentives to control and enforce standards. Self-regulation works in the medical and legal professions because practitioners enter a high-income, homogeneous group through compulsory membership.32

Self-regulation seems to work well as long as the group of agents exerting power is relatively small and cohesive. Various studies come to the same conclusion for different reasons: self-regulation is more difficult and less effective when it involves a large and heterogeneous group of agents.33 This heterogeneity may be the critical factor that makes self-regulation of internet privacy difficult if not impossible. The easiest solution to heterogeneity is to set the lowest standard, but that is problematic as the EU has already set a higher one.

32 Roger van den Berg, “Self-Regulation of the Medical and Legal Profession,” in Organized Interests and Self-Regulation: An Economic Approach, eds., Bernardo Bortolotti and Gianluca Fiorentini, 89-130 (Oxford: Oxford University Press, 1999).

33 Carlo Scarpa, “The Theory of Quality Regulation and Self-Regulation,” in Organized Interests and Self-Regulation: An Economic Approach, eds., Bernardo Bortolotti and Gianluca Fiorentini, 254 (Oxford: Oxford University Press, 1999).

34 The concern is that self-regulation leads to the formation of cartels.

Australia and the UK, which have relied heavily on self-regulation, are moving towards direct legislation. In part, this reflects the zeitgeist of greater competition, and self-regulation tends to be in conflict with competition.34

In practice, the compliance record has not been sterling. In 1998, the US FTC identified four widely accepted fair information practices as an essential part of any governmental or self-regulatory privacy regime. These are:

  1. notice, displaying a clear and conspicuous privacy policy;
  2. choice, allowing consumers to control the dissemination of information they provide to a site;
  3. access, opening up the consumer's personal information file for inspection; and
  4. security, protecting the information collected from consumers.35

In May 1999, a survey by Georgetown University found that, while almost sixty-six percent of the busiest sites (a random sample of 300 out of 7,500 sites as at January 1999) had a privacy policy, only 9.5 percent of those sites had standards that met the ones the FTC had set a year earlier.36 Based on the Georgetown report, the FTC in 1999 issued a statement allowing more time for self-regulation of privacy on the internet.37

In 2000, the FTC conducted several studies that found shortcomings in postings and standards. It did acknowledge improvements in the posting of privacy policies (eighty-eight percent, up from sixty-six percent in a previous survey) and in compliance with the four fair information practices (twenty percent, up from less than ten percent in the previous survey). But the FTC found that the rate of compliance was still poor.

35 US Federal Trade Commission, “Privacy Online. A Report to Congress.”

36 Mary Culnan, “Georgetown Internet Privacy Policy Survey,” 1999. http://www.msb.edu/faculty/culnanm/GIPPS/mmrpt.PDF (accessed January 31, 2001).

37 US Federal Trade Commission, “Privacy Online: A Report to Congress.”

Another survey by the FTC found that only eight percent of sites in a random sample, and forty-five percent of the busiest sites, had a privacy seal. Evidently displaying impatience with industry for dragging its feet on the issue (the second paragraph of the 2000 report begins: “The Federal Trade Commission has been studying online privacy issues since 1995”), the Commission urged Congress to pass legislation “to ensure adequate protection of consumer privacy online.”38 It should be noted that there were dissenting views in the 2000 report.

Failure of Seals

The three most prominent privacy seals on the web are TRUSTe, BBBOnline and WebTrust, all based in the US. As at November 2000, they had attracted only small numbers of members: 1,900, 680 and two respectively. Only a quarter of the top one hundred ecommerce sites subscribed to the seals. Among the notable holdouts were Amazon.com and BarnesandNoble.com.39

Apart from procedural difficulties in filing a complaint against a website, TRUSTe had not, in more than three years, removed a seal for a privacy violation.40 It has, however, ordered seals removed from sites that did not renew payment of their fees. An email query from the author on the number of members it had and how many had had their seals revoked went unanswered.

38 US Federal Trade Commission, “Privacy Online: Fair Information Practices in the Electronic Marketplace,” A Report to Congress, May 2000. http://www.ftc.gov/reports/privacy2000/privacy2000text.pdf (accessed January 31, 2001).

39 Edmund Sanders, “Privacy Certification Earning Seal of Disapproval,” Los Angeles Times, November 16, 2000. http://chicagotribune.com/tech/economy/article/0,2669,2-48015,PF.html (accessed January 31, 2001).

40 Christopher Hunter, “Recoding the Architecture of Cyberspace: Why Self-Regulation and Technology are Not Enough,” February 2000. http://www.asc.upenn.edu/usr/chunter/net_privacy_architecture.html (accessed January 31, 2001).

41 Ibid.

TRUSTe's difficulties have been highlighted in the cases of RealAudio and AOL. In AOL's case, a complaint was filed against the company for passing information to third parties. AOL's answer was that the TRUSTe seal applied only to the www.aol.com site, not the members.aol.com site.41

In the case of RealAudio, the company's software, RealJukebox, surreptitiously monitored and collected data about the listening habits and some other activities of its users.42 The company apologized but was never punished by TRUSTe because the surreptitious monitoring and collection of data was done by the software program; this was not covered by the terms of the TRUSTe seal, which covered only activities on RealAudio's website.

A self-regulatory regime without enforcement is dangerous. The assumption that a privacy notice is better than no notice is fallacious because consumers who are misled become disgruntled and angry.

A recent study based on a large number of game theory experiments concluded that two major motivating forces drive consumers to seek sanctions. The first is a feeling that the fairness principle has been violated (and just what is fair, and whether the fairness principle even applies, depends on the circumstance). The second is, surprisingly enough, spite. The consumer was not driven by “strategic sanctions that are imposed to create future material benefits.”43 This finding corroborates a survey on privacy by the Pew Research Center, which found that users were in a “punishing mood.”44

42 Sara Robinson, “CD Software Said to Gather Data on Users,” New York Times, November 1, 1999.

43 Falk et al., “Informal Sanctions.”

44 Pew Research Center, “Trust and Privacy Online: Why Americans Want to Rewrite the Rules,” 2000. http://www.pewinternet.org/reports/toc.asp?Report=19 (accessed January 31, 2001).

Government Legislation

Given the publicity cast on the failures of the seals and the US FTC's announcement, comprehensive privacy legislation in the US along the lines of that in the EU seemed inevitable. In March 2000, in what may be thought of as an interim measure, the EU and the US agreed on “safe harbor” provisions that presume companies subscribing to the safe harbor principles to be compliant with the EU data protection directive.45 The provisions mean that there are strict guidelines on the collection of data from EU (not American or other non-EU) customers. At the end of 2004, only 637 organizations had subscribed to the principles.46 But because some of the organizations on the list have other organizations under them, such as TRUSTe, many more US companies are in fact governed by the safe harbor provisions.

However, privacy concerns changed after September 11, 2001. The US passed the PATRIOT Act, and other countries passed similar legislation to facilitate surveillance of their citizens in the wake of fears of terrorism.47 The European Union's draft legislation was also modified to allow the retention of traffic data and to weaken the privacy protection of such data.48

A concern with government regulation is the cost of compliance. It is difficult to compare the costs of self-regulation with those of government legislation. The US Children's Online Privacy Protection Act, which came into effect in April 2000, has forced some sites to close the sections that cater to children because of compliance costs.49

45 Jeri Clausing, “Europe and US Reach Data Privacy Pact,” New York Times, March 15, 2000.

46 US Department of Commerce, “Safe Harbor List,” n.d. http://web.ita.doc.gov/safeharbor/shlist.nsf/webPages/safe+harbor+list!OpenDocument&Start=637 (accessed December 31, 2004).

47 Reporters sans Frontieres, “Let's Not Forget 10 September 2001,” Internet under Surveillance, 2004. http://www.rsf.org/article.php3?id_article=10760 (accessed December 31, 2004).

48 Henry Farrell, “Privacy in the Digital Age: States, Private Actors and Hybrid Arrangements,” in Governing Global Electronic Networks: International Perspectives on Power and Policy, eds., William Drake and Ernest Wilson (Cambridge, MA: MIT Press, forthcoming).


The author is of the view that these costs will be noted by the EU and, in fact, may be the very reason for the slow application of the data protection directive as well as for the creation of the safe harbor principles. The aim of privacy regulation is to protect the end user and thereby encourage greater activity on the net. If the costs of compliance are so onerous that site owners are forced to close part or all of a site, then that aim is defeated. The EU is, therefore, unlikely to enforce EU-only data protection rules strictly, as doing so may harm its own internet industry relative to that of the rest of the world.

As if the situation were not already complex enough, Consumers International released a report criticizing both US and EU websites for falling “woefully short of international standards on data protection.” It added, “Despite tight European Union (EU) legislation, sites within the EU are no better at providing decent information to their customers than sites based in the US. Indeed some of the best privacy policies are to be found on US sites.”50

RECOMMENDATIONS

In this state of flux, what should website owners do? Some form of regulation, preferably from industry, would be helpful. It is in the interest of neither business nor consumers to have cloaking software to bypass privacy concerns. It is not in the interest of business because information about the consumer remains unknown, and the costs of acquiring and serving the customer will increase and eventually be passed on to the consumer.

49 “New Privacy Law Costs Children's Sites,” USA Today, September 14, 2000. http://www.usatoday.com/life/cyber/tech/citi526.htm (accessed January 31, 2001).

50 Consumers International, “Privacy@Net,” 5.

It is, therefore, clearly in the interests of both parties to have privacy standards that both can agree on and, more importantly, that also punish recalcitrant violators. In the meantime, websites have little choice but to present at least the appearance of being responsible.

First, sites should collect the minimum data necessary. Going by the US FTC's fair information practice principles as well as sensible business practice, wherever any data need to be collected, a privacy policy notice must be posted in a prominent location. It is beyond the scope of this chapter to discuss the particulars of privacy policies; there are samples on other websites that the reader can refer to, as well as templates for formulating policies to suit a website owner's privacy preferences.

Next, users must have the choice to opt in rather than to opt out.51 eBay's attempt to get people to opt out reflects a misunderstanding of consumer preferences.52 Most consumers do want to receive material that a website may send and will eventually select the “Yes” button. But they also want the freedom of choosing to say “Yes” rather than first facing a default already set at “Yes.”
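In code, the opt-in principle amounts to defaulting every marketing flag to off and flipping it only on an explicit, affirmative choice. The sketch below is a hypothetical illustration; the class and field names are assumptions, not taken from any site mentioned above.

```python
# Hypothetical sketch of an opt-in default: nothing is sent until the
# user explicitly says "Yes". Field names are illustrative.
from dataclasses import dataclass

@dataclass
class MarketingPreferences:
    newsletter: bool = False      # opt-in: off unless the user ticks the box
    partner_offers: bool = False  # same default for third-party mailings

def handle_signup(form: dict) -> MarketingPreferences:
    # Only an explicit affirmative value flips the default; a missing
    # or ambiguous answer leaves the user opted out.
    return MarketingPreferences(
        newsletter=form.get("newsletter") == "yes",
        partner_offers=form.get("partner_offers") == "yes",
    )

prefs = handle_signup({"newsletter": "yes"})  # user ticked one box only
print(prefs)  # MarketingPreferences(newsletter=True, partner_offers=False)
```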

The FTC's fair information practice principles also require that sites allow users to see and correct their personal data. This may entail some software costs for small sites, some of which, in the author's experience, merely collect email addresses into a mailing list.

Finally, sites need to have sufficient security. When the US Social Security Administration put online a system for checking the benefits due to an individual, the system did not have sufficiently strong protection against unauthorized access.

51 S. Strover and Joe Straubhaar, “E-Government Services and Computer and Internet Use in Texas,” June 2000, Telecommunications and Information Policy Institute, University of Texas at Austin (for the Electronic Government Task Force). http://www.utexas.edu/research/tipi/reports/dir_final2.htm (accessed January 31, 2001).

52 Michelle Delio, “EBay E-mail Makes Users ‘Bidder’,” Wired News, January 9, 2001. http://www.wirednews.com/news/business/0,1367,41086,00.html (accessed January 29, 2001).

As a result, the facility had to be withdrawn.53

In the end, however, much depends on both a genuine desire to protect the consumer and plain common sense. The author had the unpleasant experience of incurring a wave of unsubscriptions from an e-newsletter when an untrained person used the “To” header instead of “Bcc” to send out that month's mailing. Those who cancelled were upset that their email addresses had been released inadvertently to everyone on the list.
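That mistake is easy to avoid in code. In the minimal sketch below (addresses and the SMTP server are placeholders), subscriber addresses are passed only as SMTP envelope recipients and never written into a visible header, which is what “Bcc” behavior amounts to.

```python
# Minimal sketch of sending a newsletter without exposing the list.
# Addresses and the SMTP server are placeholders for illustration.
import smtplib
from email.message import EmailMessage

subscribers = ["alice@example.com", "bob@example.com"]  # hypothetical list

msg = EmailMessage()
msg["From"] = "newsletter@example.org"
msg["To"] = "newsletter@example.org"  # only our own address is visible
msg["Subject"] = "Monthly newsletter"
msg.set_content("This month's issue ...")

with smtplib.SMTP("localhost") as server:
    # Envelope recipients are given here, not written into any header,
    # so no subscriber ever sees another subscriber's address.
    server.sendmail(msg["From"], subscribers, msg.as_string())
```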

Conclusion

This is an unsettling time for website owners in the area of privacy. It is clear that consumers want more assurance than an unenforceable seal. The self-regulatory movement in the US has not come through with flying colors. To be sure, there has been some progress in awareness of the need to protect the privacy of consumers, but not enough. To boost the confidence of consumers, much more needs to be done.

53 R. Pear, “Social Security Closes On-Line Site, Citing Risks to Privacy,” New York Times, April 10, 1997, A15.
