* The views expressed here are those of the author, and not necessarily of the Federal Trade Commission or any Commissioner.
I. INTRODUCTION
The Internet has transformed from a communications link among defense researchers to a network providing millions of users easy access to a wealth of information, goods, and services. Currently, it is estimated that more than 120 million Americans have access to the Internet, and that foreign users recently surpassed U.S. ones in numbers.(1) The exponential growth of the online consumer market has, in part, propelled the Internet's extraordinary expansion. As the Federal Trade Commission (FTC) noted in its July 1999 report to Congress, online commerce tripled from approximately $3 billion in sales in 1997 to approximately $9 billion in 1998.(2) Annual consumer sales are projected to skyrocket from $15 billion in 1999 to $184 billion in 2004.(3)
The Net is transforming not just our economy but also our society and our notions of privacy. While the Internet provides a goldmine of information, products, and services to consumers, the Internet also is a rich source of information about consumers. Internet sites collect substantial amounts of personal information, both directly through registration pages, survey forms, order forms, and online contests, and indirectly through software products such as "cookies" and other types of tracking software.(4) By following consumers' online activities, Internet site owners and other data collectors gather significant information about visitors' personal interests and preferences. Such consumer data have proven to be extremely valuable to online companies -- they enable online marketers to target products and services tailored specifically to the interests of individual consumers and permit companies to boost their revenues by selling the data or selling advertising space on their Internet sites.(5) An entire industry has emerged to market a variety of software products designed to assist Internet sites in collecting and analyzing visitor data and in serving targeted advertising.(6)
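As note 4 explains, a cookie is simply a unique identifier that a web site server stores on a visitor's computer and reads back on later visits. The sketch below is a minimal, hypothetical illustration of that mechanism; the handler class, the cookie name "visitor_id", and the port are assumptions chosen for the example, not details drawn from any company's actual system.

```python
# Minimal sketch of the cookie mechanism described in note 4 (hypothetical):
# the server assigns each new visitor a unique identifier via a Set-Cookie
# header, and the browser returns that identifier on later visits, letting
# the site recognize the repeat visitor and link visits together.
import uuid
from http import cookies
from http.server import BaseHTTPRequestHandler, HTTPServer

class TrackingHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        jar = cookies.SimpleCookie(self.headers.get("Cookie", ""))
        visitor_id = jar["visitor_id"].value if "visitor_id" in jar else None

        self.send_response(200)
        if visitor_id is None:
            # First visit: assign a unique identifier, readable only by this server.
            visitor_id = uuid.uuid4().hex
            self.send_header("Set-Cookie", f"visitor_id={visitor_id}; Path=/")
            body = b"Welcome, new visitor."
        else:
            # Repeat visit: the browser sent the identifier back, so the site
            # can tie this request to the visitor's earlier activity.
            body = f"Welcome back, visitor {visitor_id}.".encode()
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), TrackingHandler).serve_forever()
```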
Ultimately, the prevalence, ease, and relatively low cost of collecting, maintaining, and disseminating personal consumer information have a Janus-faced aspect. On the one hand, the ability to gather, process, and disseminate information on the Internet provides consumers with a wealth of benefits (e.g., Web sites can "remember" where a consumer has been and what type of products the consumer likes so that when the consumer returns to the site, s/he can be directed to additional products that are likely to interest him/her). On the other hand, darker voices are legitimately concerned that the manipulative use of information available on the Net may adversely affect privacy or citizenship.(7) Some uses of personal data can be intrusive, as when private information is widely circulated; or reckless, as when inaccurate information is widely shared with other people and companies; or predatory, as when the information is used to target victims for a scam or crime.
For over five years, the Federal Trade Commission has actively monitored developments in e-commerce, particularly those affecting consumer privacy. The FTC has supported industry self-regulation and taken enforcement actions as needed. The FTC also has endorsed certain legislative initiatives (e.g., the Children's Online Privacy Protection Act) to address specific online privacy concerns. As I will explain below, there is no simple choice between self-regulation and legislation as the anointed vehicle for protecting consumers' privacy. We already have both and will continue to need both in the future.
II. PRIVACY ON THE INTERNET -- THE EVOLVING LEGAL LANDSCAPE
In the United States, individual privacy, including online privacy, is protected through a combination of constitutional guarantees, federal and state statutes, regulations, and voluntary codes of conduct, all of which apply to the public and private sectors in different ways. Although the U.S. Constitution does not explicitly mention a right to privacy, the Supreme Court decades ago recognized a fundamental right to privacy or the right to be left alone.(8) It subsequently interpreted the Bill of Rights as creating, through a penumbra of various rights, "a right of personal privacy or a guarantee [that] certain zones of privacy [do] exist under the Constitution."(9) Viewed in hindsight, the federal courts have effectively acknowledged a right to privacy with respect to marital relations, procreation, contraception, family relationships, and child rearing and education.(10) In addition, a number of state constitutions specifically enumerate the right of citizens to be protected from privacy invasions.(11)
Aside from constitutional guarantees, the U.S. legislative approach to privacy has been traditionally sectoral, that is, privacy law has developed to address particular data types and users.(12) Historically, fear of the government's use of personal data was the primary concern. Certain statutes thus limit the use of personally identifiable data that the government maintains.(13) Other statutes limit the government's use of personal data maintained by industry.(14) And some statutes limit firms' use of personal data.(15) While no single law or regulation specifically recognizes a U.S. citizen's general right to informational privacy, certain laws as applied do afford a fair amount of such privacy to consumers.
Section 5 of the Federal Trade Commission Act(16) in particular can protect consumers' informational privacy whenever a company collects or disseminates personal data in an unfair or deceptive manner. For example, in August 1998, the Commission brought its first online privacy case against GeoCities.(17) In that case, the Commission was concerned that GeoCities, one of the Web's most frequently visited sites,(18) collected personal identifying information from its members, both adults and children, and misled them as to its use of that information. To become members, visitors had to fill out an online application that required disclosure of certain personal identifying information and requested optional information regarding education level, income, marital status, occupation, and interests. Through the registration process, GeoCities created a database rich with target markets for advertisers. The Commission alleged in its complaint that GeoCities falsely represented that the mandatory information that members provided would not be released to third parties without permission. In addition, GeoCities collected personal identifying information from children, for whom it promoted a GeoKidz Club(19) offering activities, contests, and games. The FTC charged that GeoCities misrepresented that it alone maintained this identifying information from children, when in fact a third party collected and maintained it.
Ultimately, GeoCities settled the case by agreeing to disclose prominently on its Web site just what information it is collecting, for what purpose, to whom it will be disclosed, and how consumers can inspect and, if desired, remove their personal information from the databases of third parties. The consent order also prohibits GeoCities from misrepresenting who is sponsoring the various activities offered on its Web site and who actually is collecting and maintaining personal information. Finally, to protect children, the order requires GeoCities to obtain parental consent before collecting information from those age 12 or younger, and to delete any such information already collected, unless GeoCities obtains affirmative parental consent to retain it.
The comprehensive GeoCities consent agreement helped establish some of the key elements of fair information practices that protect consumers' online privacy. Those elements include: 1) notice of the site's privacy practices; 2) consumer choice regarding the use of information collected; 3) consumer access to correct or remove personal information; 4) safeguarding the security of information; 5) parental control over the collection and use of information gathered from children; and 6) an enforcement mechanism to ensure compliance.(20) These are precisely the types of protections that the Commission has been urging Web site operators to provide voluntarily through self-regulation.(21) These same principles also served as the foundation for the Children's Online Privacy Protection Act (COPPA), enacted in 1998 (see infra).(22)
Another recent case illustrating Section 5's ability to protect informational privacy is the Commission's action against ReverseAuction.com.(23) ReverseAuction is an online site that features "Declining Price" Auctions (that is, the initial, opening price of an item drops the longer the item remains up for auction) and "Wanted" Auctions (that is, buyers who are looking for a particular item or service indicate how much they are willing to pay for the item or service, and sellers then try to outbid each other by offering lower prices). The Commission charged that the firm violated consumers' privacy by harvesting consumers' personal information from a competitor's site and then sending deceptive spam to those consumers, soliciting their business.
In essence, the complaint alleged that, in promoting its new site, ReverseAuction registered with eBay.com, a competitor auction site. ReverseAuction agreed to be bound by eBay's User Agreement and Privacy Policy, which protect consumers' privacy by prohibiting eBay users from gathering and using personal identifying information (such as names and email addresses) for unauthorized purposes such as spam. Notwithstanding that agreement, ReverseAuction harvested eBay users' personal identifying information and used the data to send them spam promoting ReverseAuction's own web site. The spam indicated that the recipient's eBay user ID would expire soon, when in fact it was in no danger of expiring. By using this deceptive tactic, ReverseAuction lured eBay users to its web site, where they could "re-register." The Commission also alleged that the spam led consumers to believe that eBay had provided their eBay user IDs and other information to ReverseAuction, or at least had authorized these practices. In reality, eBay had no idea that ReverseAuction was engaging in these activities. At the end of the day, those who "re-registered" their eBay user IDs at ReverseAuction's site were simply registering with and becoming members of ReverseAuction.com, with their eBay IDs now also serving as their ReverseAuction IDs.
The proposed settlement bars ReverseAuction from making any misrepresentations about complying with another company's user agreement, privacy policy, or other provisions that govern the collection, use, or disclosure of consumers' personal identifying information. In addition, ReverseAuction will be barred from making any misrepresentations about the features, terms, conditions, business practices, or privacy policy of any other company. Furthermore, the proposed settlement requires ReverseAuction to send an e-mail message to all consumers to whom it had sent spam, explaining that ReverseAuction had not intended to suggest that consumers' eBay user IDs would expire and stating that eBay did not know about and had not authorized any of ReverseAuction's actions. The e-mail will inform consumers that their names and eBay user IDs can be purged from ReverseAuction's database and their registration canceled. Finally, the proposed settlement requires ReverseAuction to post its own privacy policy on its Internet web site and maintain certain records to enable the FTC to monitor compliance with the proposed settlement.
This case illustrates both Section 5's broad authority and the Commission's commitment to protecting consumers' privacy online whenever that privacy is threatened. Here, the FTC was able to ensure that the privacy protections promised to eBay's users were not compromised when the deceptive tactics of a competitor auction site thwarted eBay's self-regulatory efforts to protect consumers' privacy. Without actions such as this one, consumers will lose confidence that their privacy choices will be honored. And, with the loss of consumer confidence one of the biggest threats to e-commerce in general, cases such as this one are essential for fostering the continuing growth of e-trade.
In addition to the Federal Trade Commission Act, a few other federal statutes, such as the Fair Credit Reporting Act (FCRA)(24) and Title V of the Gramm-Leach-Bliley Act,(25) provide a certain amount of informational privacy protection. While these statutes, particularly the FCRA, may have been conceived for an off-line world, they function to protect the privacy of both on- and off-line consumers. Indeed, it is becoming increasingly difficult to meaningfully protect privacy without addressing concerns in both the real and virtual worlds. This will become even more true as companies begin to merge online and offline consumer data and profiles.
First enacted in the 1970s, the FCRA regulates consumer reporting agencies, also known as credit bureaus, and establishes important protections for consumers with respect to the privacy of their sensitive financial information that credit bureaus hold. The FCRA allows credit bureaus to disclose consumer credit reports only to entities with specified "permissible purposes" (such as evaluating individuals for credit, insurance, employment, or other, similar purposes). Moreover, these disclosures can only occur under specified conditions (such as certification of need from a prospective employer or insurer). In these ways, the FCRA generally limits the disclosure of consumer reports primarily to instances where a consumer initiates a transaction, such as a loan or employment application. Of course, these processes can now occur completely online.
There are certain caveats associated with the FCRA. First, in contrast to credit bureaus with their rich, accurate, and up-to-date data collections, individual merchants, both on- and off-line, are free to distribute any information that they collect as part of their discrete transactions or experiences with consumers.(26) Second, the 1996 amendments to the FCRA include a provision that permits "affiliated" companies to share consumer reports, so long as consumers are notified and given the opportunity to prevent such sharing.(27) Sharing credit information among affiliated companies may well raise special concerns in the electronic banking or electronic payments context, where detailed and sometimes sensitive information about consumers is gathered.
Congress recently enacted financial privacy provisions in Title V of the Gramm-Leach-Bliley Act, which add to the legal protections available for consumers' financial information. Under Title V, banking institutions may share personal confidential financial information with their affiliates, but not with third parties such as online marketers, unless consumers are first notified and given the option to require the banking institution to keep all information private.(28) Even with respect to affiliated entities, Title V requires financial institutions to disclose their privacy policies to their customers, including any intent to share nonpublic personal information.
Finally, in the extremely sensitive area of medical records, the Department of Health and Human Services recently issued proposed regulations establishing the first-ever national standards to protect health information that is transmitted or maintained electronically.(29) Among other things, the proposal would require an individual's written consent to release medical information for purposes unrelated to treatment and payment. A notable loophole is that the proposed rules would not protect health information that is transmitted or maintained solely in traditional, paper records.
Specific federal protections for consumer privacy on the Internet are fairly limited.(30) Most notably, last year's Children's Online Privacy Protection Act (COPPA) requires that operators of Web sites directed to children under 13, or operators who knowingly collect personal information from children under 13, must: 1) provide parents notice of their information practices; 2) obtain prior, verifiable parental consent for the collection, use, and/or disclosure of personal information from children (with certain limited exceptions); 3) upon request, provide a parent with the ability to review the personal information collected from his/her child; 4) upon request, provide a parent with the opportunity to prevent the further use of personal information that has already been collected, or the future collection of personal information from that child; 5) limit collection of personal information for a child's online participation in a game, prize offer, or other activity to information that is reasonably necessary for the activity; and 6) establish and maintain reasonable procedures to protect the confidentiality, security, and integrity of the personal information collected.(31)
The Act directs the Commission to adopt regulations implementing these requirements, and on November 3, 1999, the Commission published its final rule incorporating and explaining COPPA's privacy protections.(32) The Rule also provides a safe harbor for operators who follow Commission-approved self-regulatory guidelines. Moreover, the Commission retains its authority under Section 5 of the FTC Act to investigate and take enforcement action against any child-related information practices that are deceptive or unfair.(33)
COPPA is a classic example of federal legislation aimed at protecting a particular privacy problem -- children's information privacy on the Internet. The lack of much additional federal legislation protecting Internet privacy stems from several sources. First, there is no consensus that one general approach solves all privacy problems. Some believe that firms should always seek consumers' consent before sharing their personal data (the opt-in advocates). Others believe that it is more efficient and beneficial for all involved if firms are allowed to share consumers' personal data so long as they notify consumers and enable them to prevent such sharing before it occurs (the opt-out advocates). Alternatively, it may prove most desirable to have opt-in (affirmative consent) rules for certain types of sensitive data sharing (e.g., medical information) and opt-out rules when less sensitive data is being shared or when there is a broad consensus that such sharing is useful for both consumers and firms.
Second, there is a general reluctance to create a plethora of national or state laws for an inherently global technological environment. There is a legitimate concern that an explosion of various sovereigns' laws regulating the Net would only create conflicts of laws rather than resolve issues of privacy invasion. In addition, there has been at least a tentative conclusion that existing laws such as Section 5 of the Federal Trade Commission Act, combined with self-regulation, may be adequate to protect personal informational privacy on the Net. But that tentative consensus may be falling apart, as businesses and consumers recognize the serious harm that hackers, fraud artists and sheer mistakes can wreak with personal financial and identifying information. Moreover, businesses are beginning to worry about the specter of fifty different state law standards as legislators attempt to respond to citizens' concerns about identity theft and privacy invasions.(34)
III. CURRENT FEDERAL POLICY ON INTERNET PRIVACY
The choice of either legislating privacy on the Net or fostering self-regulation is a false one. In fact, we already have both. However, there are legitimate reasons why federal policy regarding privacy on the Internet thus far has favored a self-regulating cyberspace marketplace.
Given the rapidly evolving nature of the Internet and computer technology, self-regulation is certainly the least intrusive and may be the most efficient means to ensure fair information practices.(35) Voluntary codes are by definition developed and adopted by those with the greatest expertise about and sensitivity to industry practices and conditions. And self-regulatory codes can be revised, when necessary, more promptly than legislative codes. This allows firms to respond quickly to the rapid evolution of the Internet and computer technology and to employ emerging technologies to protect consumers' privacy. Moreover, when regulation is voluntarily adopted, compliance tends to be broader, and enforcement more prompt, than when a legislature or agency imposes its mandate. And self-regulation wholly avoids many of the First Amendment issues associated with governmental regulation. Finally, where an industry can regulate itself, the government need not devote as many of its limited resources to the task.
Self-regulatory efforts, of course, may fail -- they may not be rigorously implemented or enforced, or they may lapse into a vehicle for exclusionary or collusive conduct among rivals. Businesses may be reluctant to disclose to consumers what they do with personal data, simply to avoid having to compete for customers based on how firms protect personal data. Government vigilance is therefore appropriate and necessary, especially where business rivals, who have an incentive to restrain competition, are involved in the process.
The Commission's numerous activities to monitor industry efforts to protect consumers' privacy include public workshops, Commission Task Forces, creation of an Advisory Group, and surveys of web sites' privacy practices. The Commission's most recent workshop, jointly sponsored by the U.S. Department of Commerce, focused on online profiling.(36) This workshop examined the use of "cookies" and other types of tracking software -- and even methods of combining this information with personal information collected directly from consumers or contained in other databases -- to create targeted, user-profile-based advertising campaigns. Some maintain that no personally identifiable information is collected by this profiling -- at most the profile is of a browser with no knowable street or e-mail address or name. But consumer groups and privacy advocates have raised concerns about these practices, primarily because many consumers are unaware that software is used to create online profiles about them. Privacy advocates believe that consumers should at least be given notice that such profiling is occurring and given the choice of whether an online profile can be created, maintained, and used. I confess to having some sympathy with that position.
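To make the practice at issue concrete, here is a deliberately simplified, hypothetical sketch of profiling: an ad network keys a browsing history to a cookie identifier, infers interest categories from the pages visited, and selects a targeted advertisement. Every page, category, and identifier in the sketch is invented for illustration and describes no actual network's system.

```python
# Hypothetical, simplified sketch of the profiling practice discussed at the
# workshop: a clickstream keyed to a cookie ID is turned into inferred
# interest categories, which then drive ad selection. All names below are
# invented for illustration only.
from collections import Counter
from typing import Dict, List

# Invented mapping from sample pages to interest categories.
PAGE_CATEGORIES: Dict[str, str] = {
    "/golf-clubs": "sports",
    "/mutual-funds": "finance",
    "/retirement-calculator": "finance",
    "/cruise-deals": "travel",
}

ADS_BY_CATEGORY: Dict[str, str] = {
    "sports": "Ad: discount golf balls",
    "finance": "Ad: online brokerage",
    "travel": "Ad: airfare sale",
}

def build_profile(clickstream: List[str]) -> Counter:
    """Tally the interest categories implied by the pages a browser visited."""
    return Counter(PAGE_CATEGORIES[p] for p in clickstream if p in PAGE_CATEGORIES)

def select_ad(profile: Counter) -> str:
    """Serve the ad matching the browser's dominant inferred interest."""
    if not profile:
        return "Ad: generic banner"
    top_category, _ = profile.most_common(1)[0]
    return ADS_BY_CATEGORY[top_category]

# Note that the profile is keyed only to a cookie identifier, not to a name --
# the point on which both sides of the workshop debate focused.
profiles: Dict[str, Counter] = {}
cookie_id = "a1b2c3"
profiles[cookie_id] = build_profile(["/mutual-funds", "/retirement-calculator", "/golf-clubs"])
print(select_ad(profiles[cookie_id]))  # -> Ad: online brokerage
```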
The Commission has also used Task Forces to grapple with various Internet privacy issues such as understanding the costs and benefits of implementing fair information practices online. The costs and benefits of granting consumer access to their online information and guaranteeing the security of personal information have been particularly contentious. Indeed, the benefits and costs are interrelated insofar as increased access to information, at least with today's technology, tends to undermine the security and integrity of the data.
To better address this dilemma, the Commission recently announced the establishment of the Advisory Committee on Online Access and Security.(37) The Advisory Committee is to examine what constitutes reasonable access to personal information, including whether the extent of access provided by Web sites should vary, depending upon the sensitivity of the personal information collected and/or the purpose for which such information is collected. It will consider whether consumers should be provided access to enhancements to personal information (e.g., inferences about their preferences or purchasing habits). Other issues are whether appropriate and feasible methods exist for verifying the identity of individuals seeking access and whether the difficulty and costs of retrieving consumers' data should be considered. Finally, should a reasonable fee be assessed for access and should limits be placed on the frequency of requests for access?
On the security side, the Committee will consider how to define appropriate standards for evaluating the measures taken by Web sites to protect the security of personal information. That is, what constitute reasonable steps to assure the integrity of this information and what measures should be undertaken to protect against its unauthorized use or disclosure? By May 2000, the Advisory Committee will prepare a written report presenting options for implementing these fair information practices along with the costs and benefits of each option.
Finally, the Commission continues to monitor through surveys the progress of firms' privacy efforts and to assess whether self-regulatory programs are in fact fulfilling their promise. The results of past surveys of commercial web sites suggest that online businesses are increasingly providing more notice of their information practices. The Commission's 1998 survey found that only 14% of the sites surveyed posted any disclosure regarding their information practices, and even fewer -- 2% -- posted a comprehensive privacy policy. The "most popular" web sites performed better -- 71% had posted an information practice disclosure or notice.(38) A year later, two other independent surveys found that 66% of 361 busy web sites surveyed posted at least one disclosure about their information practices, while 93% of the 100 top web sites did.(39) Unfortunately, these same surveys found that very few sites (10 to 22%) posted disclosures covering all five of the fair information practice principles.(40) Thus, major challenges remain for effective self-regulation.
In March 2000, Commission staff will conduct a new Internet survey to assess the progress that commercial web sites have made in implementing all of the fair information practices (notice, choice, access, security, and enforcement)(41) and to try to get beneath the surface to determine whether online privacy practices are adequate for enabling consumers to exercise choice about how their personal information is collected and shared.(42) Commission staff will analyze what information a domain collects; whether a privacy policy is posted and what it covers; whether a recognized seal is posted; and whether third party advertisers attempt to place cookies on the site's visitors' computers. Staff hopes to issue its findings in a report by mid-summer.
If self-regulation is to succeed, another critical issue is how to create incentives to encourage the development of privacy-enhancing technologies that will give consumers more control over how and when their personal data is collected and used. One such technology is the World Wide Web Consortium's Platform for Privacy Preferences (P3P).(43) P3P would enable web sites to present their privacy policies in such a way that consumers' computers could automatically "read" the policies and automatically "release" information to sites that conform to consumers' pre-programmed choices on acceptable privacy policies. The P3P protocol, however, is still in the drafting stage.
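To convey the basic idea, the sketch below models, in simplified form, the kind of matching P3P contemplates: a site publishes a machine-readable summary of its privacy policy, the user's software compares it with pre-programmed preferences, and personal information is released only if the policy conforms. The field names and matching rules are illustrative assumptions, not the actual P3P vocabulary, which remains in draft.

```python
# Schematic sketch of the policy/preference matching idea behind P3P.
# The fields below are invented for illustration; they are not the P3P spec.
from dataclasses import dataclass

@dataclass
class PrivacyPolicy:
    shares_with_third_parties: bool
    uses_data_for_marketing: bool
    allows_access_and_correction: bool

@dataclass
class UserPreferences:
    allow_third_party_sharing: bool = False
    allow_marketing_use: bool = False
    require_access: bool = True

def policy_acceptable(policy: PrivacyPolicy, prefs: UserPreferences) -> bool:
    """Return True only if the site's declared policy conforms to the user's choices."""
    if policy.shares_with_third_parties and not prefs.allow_third_party_sharing:
        return False
    if policy.uses_data_for_marketing and not prefs.allow_marketing_use:
        return False
    if prefs.require_access and not policy.allows_access_and_correction:
        return False
    return True

site_policy = PrivacyPolicy(shares_with_third_parties=False,
                            uses_data_for_marketing=False,
                            allows_access_and_correction=True)
prefs = UserPreferences()

if policy_acceptable(site_policy, prefs):
    print("Release registration data to this site automatically.")
else:
    print("Withhold data and alert the user.")
```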
IV. THE U.S. APPROACH TO PRIVACY VERSUS THE E.U. APPROACH
The U.S. approach to protecting consumer privacy online -- relying significantly on industry self-regulation with a minimum of legislative and administrative mandates -- differs from that of the European Union (E.U.), which relies more on legislative protections. In particular, the E.U. passed a Directive in 1995 that extensively regulates the buying and selling of personal data.(44) This Directive, which took effect on October 25, 1998, specifies common rules that firms must observe when collecting, holding, or transmitting personal data in their business or administrative activities. Most fundamental for firms is an obligation to collect data only for specified, legitimate purposes and to hold only data that are relevant, accurate, and up-to-date. European citizens, in turn, are guaranteed a bundle of rights -- a right of access to their personal data; a right to correct any data that are inaccurate; the right to refuse use of their data for activities such as direct marketing; and a right of recourse if unlawful processing occurs.
Significant for the U.S. is that the Directive prohibits the transfer of personal data to any country that does not provide "adequate" (meaning "comparable") protection. Each E.U. member country has been enacting its own laws to implement the Directive. It is still too early to know how stringent the various E.U. member states' laws and policies will be; how strictly they will be enforced; or how flexible their contemplated system of exemptions and special conditions for individual companies will be. Nevertheless, the Directive may impose substantial restrictions on U.S. subsidiaries who buy and sell personal data in the E.U., or on firms that acquire and transmit personal data to the United States.
The U.S. and the E.U. are currently in negotiations to determine how best to harmonize the different approaches taken to protect personal data. They have been working on developing "safe harbors" that establish a set of criteria which, if met, would allow U.S. companies to do business with European citizens or firms.(45) These "safe harbors" would require U.S. firms to provide: notice of their information practices; choice as to whether and how personal information may be disclosed to third parties; onward transfer of personal data to third parties, consistent with the notice and choice provided; security for personal data, whether at its creation, maintenance, use, or transmission; access by individuals to the information that a firm holds about them and the ability to correct, amend, or delete inaccurate information; and enforcement mechanisms to assure compliance with the foregoing principles and recourse for injured persons. Most recently, U.S. and E.U. delegates met in December to try to complete the negotiations on the safe harbors, but the meeting ended without any definitive conclusions.
V. CONCLUSION
Like the ever-evolving Internet, the legal landscape that applies to the Internet is in motion and will be for years to come. One way to cope with such an ever-changing scene is to allow self-regulation to develop and change along with it. But industry self-regulation may only gradually develop effective enforcement mechanisms and may ultimately provide inadequate protection against highly motivated hackers or fraud artists. Another way is to empower citizens with technology that helps protect their privacy and permits them to assert control over how their personal data are used. But self-help requires considerable consumer education and sophistication and may well fail to protect consumers against surreptitious privacy invasions or identity theft. Finally, legislation may establish protection against specific fraudulent abuses, or for specific groups, such as children, or may even create useful minimum criteria. But the legislative process can be slow and cumbersome and may lag behind or interfere with technological developments. Thus, neither pure self-regulation, nor consumer education and technological empowerment, nor legislation alone can be the answer. All are needed to ensure meaningful privacy protection on the evolving Net.
1. Cass R. Sunstein, Code Comfort, The New Republic, Jan. 10, 2000, at 37; Nielsen Media Research and NetRatings, Inc., The Nielsen/Netratings Reporter (Jan. 13, 1999) (reported at http://www.nielsen-netratings.com/press_releases/pr_000113.html ).
2. FTC, Self-Regulation and Privacy Online: A Report to Congress 1 (July, 1999) [hereinafter FTC's July, 1999 Privacy Report]; FTC, The FTC's First Five Years: Protecting Consumers Online 3 (Dec. 1999) [hereinafter The FTC's First Five Years].
3. Actual sales for 1999 are not yet available, but the projected figures of $12 to $18 billion are probably fairly accurate, given estimated holiday sales of $7 billion. Online Holiday Sales Hit $7 Billion, Consumer Satisfaction Rising (Jan. 18, 2000). <http://jup.com/company/pressrelease.jsp?doc=pr000113>. See also Forrester Research, Inc., Online Retail to Reach $184 Billion by 2004 as Post-Web Retail Era Enfolds (Sept. 28, 1999) (reported at http://www.forrester.com/ER/Press/Release/0,1769,164FF.html).
4. John Markoff, Bitter Debate Divides Two Experts (Dec. 30, 1999) <http://www.nytimes.com/library/tech/99/12/biztech/articles/30privacy.html>. Cookie technology allows a web site server to place information about a consumer's visits to the site on the consumer's computer in a text file readable only by that web site server. The cookie assigns each consumer's computer a unique identifier so that the consumer can be recognized in later visits to the site. Advertisers are now able to assign a cookie to the computers of users who visit sites in advertising networks and to follow those users from site to site by reading information stored in that cookie at each site.
5. FTC's July, 1999 Privacy Report at 2.
6. Id.; see also note 4, supra.
7. Glenn R. Simpson, E-Commerce Firms Start to Rethink Opposition to Privacy Regulation as Abuses, Anger Rise, Wall St. J., Jan. 6, 2000, at A-24; Cass R. Sunstein, Code Comfort, The New Republic, Jan. 10, 2000, at 37.
8. See Griswold v. Connecticut, 381 U.S. 479, 483-86 (1965).
9. Roe v. Wade, 410 U.S. 113, 152 (1973).
10. See, e.g., Loving v. Virginia, 388 U.S. 1, 12 (1967); Skinner v. Oklahoma, 316 U.S. 535 (1942); Eisenstadt v. Baird, 405 U.S. 438, 453-54 (1972); Prince v. Massachusetts, 321 U.S. 156, 166 (1944); Pierce v. Society of Sisters, 268 U.S. 510 (1925).
11. See, e.g., Cal. Const., art. I, § 1; Ariz Const., art. II, § 8; Ill. Const., art. I, § 6.
12. See, e.g., The Driver's Privacy Protection Act of 1994, 18 U.S.C. §§ 2721-2725 (1994 ed. and Supp. III). The DPPA regulates the disclosure and resale of personal information contained in records that state Departments of Motor Vehicles maintain. Recently, the state of South Carolina challenged the statute's constitutionality, arguing that it violated fundamental principles of federalism. The U.S. Supreme Court upheld the statute. Reno v. Condon, ---- U.S. ---- (2000), 2000 U.S. LEXIS 503 (Jan. 12, 2000).
13. For example, the Tax Reform Act of 1976, 26 U.S.C. § 6103 (1989 & Supp. 1996), protects the confidentiality of tax returns and return-related information and limits the dissemination of individual tax return data to other federal agencies. Another example is the Privacy Act of 1974, 5 U.S.C. § 552a (1994), which regulates the government's creation, collection, use, and dissemination of records which can identify an individual by name or other personal information.
14. See, e.g., the Electronic Communications Privacy Act, 18 U.S.C. §§ 2510-2522, § 2701 (1994), as amended, which limits the circumstances under which the federal and state governments may access oral, wire, and electronic communications.
15. Privacy statutes that regulate private industry include the Right to Financial Privacy Act, 12 U.S.C. §§ 3401-3422 (1994) (bank records); the Fair Credit Reporting Act, 15 U.S.C. §§ 1681-1681t (1994) (credit reports and credit bureaus); the Video Privacy Protection Act, 18 U.S.C. §§ 2710-2711 (1994) (barring video stores from disclosing customers' video choices); the Family Educational Rights and Privacy Act, 20 U.S.C. § 1232g (1994) (educational institutions' informational records); the Employee Polygraph Protection Act, 29 U.S.C.A. §§ 2001-2009 (West Supp. 1999) (limits employers' ability to use polygraphs); the Telephone Consumer Protection Act, 47 U.S.C. § 227 (1994) (limits on use of automatic dialing machines in telemarketing); and the Cable Communications Policy Act, 47 U.S.C. § 551(a) (1994) (cable television).
16. The Federal Trade Commission Act, 15 U.S.C. § 45(a) (1994 ed.).
17. GeoCities, Inc., Docket No. C-3849, 1999 FTC LEXIS 17 (FTC Feb. 5, 1999). Since the time of the settlement, GeoCities has become part of Yahoo.
18. GeoCities offers its members free and fee-based personal home pages, and links its members' home pages into a virtual community of themed neighborhoods.
19. The GeoKidz Club has been replaced by a kids' club called the Enchanted Forest.
20. These core information privacy principles have developed from studies, task forces, directives, and reports, including: U.S. Dep't of Health, Education, and Welfare, Records, Computers, and the Rights of Citizens (1973); The U.S. Privacy Protection Study Commission, Personal Privacy in an Information Society (1977); Organization for Economic Cooperation and Development, OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data (1980); U.S. Information Infrastructure Task Force, Information Policy Committee, Privacy Working Group, Privacy and the National Information Infrastructure: Principles for Providing and Using Personal Information (1995); U.S. Dep't of Commerce, Privacy and the NII: Safeguarding Telecommunications-Related Personal Information (1995); The European Union Directive on the Protection of Personal Data (1995); and the Canadian Standards Association, Model Code for the Protection of Personal Information: A National Standard of Canada (1996).
21. The FTC now recognizes five widely accepted principles as essential for effective self-regulatory (or legislative) programs to protect privacy. Notice/Awareness is the most basic principle. All web sites should disclose to consumers the site's information use and privacy protection policies, including: 1) what information is being collected; 2) who is collecting it; 3) how it will be used; 4) who might have, or will be given, access to the data; 5) what passive or non-obvious data collection methods are used by the site; 6) whether providing the requested information is mandatory or voluntary; and 7) how the data will be protected. Choice/Consent embodies the principle that web sites should seek consumers' consent regarding any uses of the information beyond those necessary to achieve the basic purpose of the data request. Access/Participation establishes the principle that consumers should be able to access data about themselves and to challenge the data's accuracy or completeness. Timely and inexpensive access, a means for consumers to verify the information recorded in the site's database, and a method to correct information or add objections to the file are essential for meaningful access. Integrity/Security reflects the principle that data collectors should ensure that the information they collect is secure and accurate. For example, the collector should use only reputable sources of data, should cross-check data where possible, and should take steps to secure the data against loss or unauthorized access. Enforcement/Redress recognizes the principle that an enforcement mechanism is vital to ensure compliance with all the other fair information practices and to provide recourse for injured parties. A self-regulatory program that seeks to assure enforcement and redress might incorporate such features as periodic compliance audits, neutral investigation of consumer complaints, a dispute resolution mechanism, and correction of misinformation or compensation for injured parties.
22. Children's Online Privacy Protection Act, Pub. L. No. 105-277, Title XIII, 112 Stat. 2681, 2681-728 to 2681-735 (1998) (codified at 15 U.S.C. §§ 6501-6506).
23. ReverseAuction.com, File No. 002-3046 (Jan. 6, 2000) (available at /os/2000/01/reversecmp.html and /os/2000/01/reverseconsent.html).
24. The Fair Credit Reporting Act (FCRA), 15 U.S.C. § 1681 et seq.
25. Gramm-Leach-Bliley Act, Pub. L. No. 106-102, Title V, Privacy, 113 Stat. 1338, 1436-1450 (1999) (to be codified at 15 U.S.C. §§ 6801-6809).
26. FCRA, 15 U.S.C. § 1681a(d) ("The term "consumer report" . . . does not include (A) any report containing information solely as to transactions or experiences between the consumer and the person making the report.").
27. 15 U.S.C. § 1681a(d)(2)(A)(iii). Note, moreover, that affiliates may freely share among themselves their individual transaction and experience data without providing any notice or opportunity to object to consumers. 15 U.S.C. § 1681a(d)(2)(A)(ii).
28. Gramm-Leach-Bliley Act, Pub. L. No. 106-102, 113 Stat. at 1437.
29. Dep't of Health & Human Servs., Standards for Privacy of Individually Identifiable Health Information, 64 Fed. Reg. 59,918 (Nov. 3, 1999) (proposed rule).
30. This situation may change during this Congressional session. The CQ Monitor News reported that "'the privacy dam may burst,'" because the argument that self-regulation is working to protect privacy just will not "'hold water anymore.'" Adam S. Marlin, Business Advocates Playing Catch-Up On Privacy Issues, CQ Monitor News, <http://onCongress1.cq.com/PSUser/psrecor...tternfiles/newsanalysis&NS_initial_frm=1> (quoting Marc Rotenberg, director of the Electronic Privacy Information Center). Currently, there are two nearly identical proposals to amend Title V of the Gramm-Leach-Bliley Act. They would require that customers opt in (that is, give explicit permission beforehand) to the release of any of their confidential financial information. See H.R. 3320, 106th Cong. (1999) (sponsored by Rep. Markey); S. 1903, 106th Cong. (1999) (sponsored by Sen. Shelby). In addition, the White House states that it will push Congress this session to pass legislation to protect medical records. Two Virginia Congressmen (Boucher and Goodlatte) have offered legislation (H.R. 1685) that would require Internet sites to post how they use any information that they gather. A bill (S. 809) sponsored by Senators Burns and Wyden would require all Web sites to post privacy information and give consumers the chance to limit disclosure of their personal information. The Burns-Wyden bill is closely modeled on the Children's Online Privacy Protection Act but would apply to adults.
31. Children's Online Privacy Protection Act, 15 U.S.C. §§ 6501-6506.
32. Children's Online Privacy Protection Rule, 64 Fed. Reg. 59,888 (Nov. 3, 1999) (to be codified at 16 C.F.R. Part 312). Notably, COPPA provides that "[a] violation of a regulation prescribed under subsection (a) [the FTC-promulgated rules] shall be treated as a violation of a rule defining an unfair or deceptive act or practice under section 18(a)(1)(B) of the Federal Trade Commission Act (15 U.S.C. § 57(a)(1)(B))." 15 U.S.C § 6502. By treating a COPPA rule violation as if it were a violation of a rule promulgated under section 18(a)(1)(B), the Commission can seek civil penalties immediately.
33. The FTC has done precisely this in Liberty Financial Cos., Docket No. C-3891 (Final Order Aug. 12, 1999), where it alleged, among other things, that the Young Investor web site falsely represented that personal information collected from children in a survey would be maintained anonymously when in fact it was not.
34. See Glenn R. Simpson, E-Commerce Firms Start to Rethink Opposition to Privacy Regulation as Abuses, Anger Rise, Wall St. J., Jan. 6, 2000, at A24.
35. FTC, Privacy Online: A Report to Congress (1998) </reports/privacy3/index.html>.
36. See Notice of Public Workshop on Online Profiling, Docket No. 990811219-9219-01 (Sept. 1999) </os/1999/9909/FRN990915.html>.
37. Establishment of the Federal Trade Commission Advisory Committee on Online Access & Security and Request for Nominations, 64 Fed. Reg. 71,457 (Dec. 21, 1999).
38. See FTC's July, 1999 Privacy Report, at 8.
39. Prof. Mary Culnan, McDonough School of Business, Georgetown University, The Georgetown Internet Privacy Policy Survey Report (1999) (available at http://www.msb.edu/faculty/culnanm/gippshome.html). See also Online Privacy Alliance, Privacy and the Top 100 Sites: A Report to the Federal Trade Commission (1999) (available at http://www.msb.edu/faculty/culnanm/gippshome.html).
40. See FTC's July, 1999 Privacy Report, at 7-8.
41. See note 21 supra.
42. The survey will cover two data sets: 1) a random sample of 500 domains drawn from a list of the busiest U.S. ".com" sites having an audience of 39,000 unique visitors or more as compiled by Nielsen's Netratings from the month of January 2000; and 2) a review of the busiest 100 domains on the same Nielsen Netratings list.
43. The World Wide Web Consortium was created to develop common protocols that promote Web evolution and interoperability. It is an international group, jointly run by the MIT Laboratory for Computer Science (LCS) in the U.S., the National Institute for Research in Computer Science and Control in France, and Keio University in Japan. Currently, there are more than 260 members in the Consortium. W3C Publishes First Public Working Draft of P3P 1.0: Collaborative Efforts by Key Industry Players and Privacy Experts Promote Web Privacy and Commerce, <http://www.w3.org/Press/1998/P3P>.
44. Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data (available at http://aspe.os.dhhs.gov/datacncl/eudirect.html). See also Guidelines on the Protection of Privacy and Transborder Flows of Personal Data (available at http://www.oecd.org//dsti/sti/it/secur/prod/PRIV-EN.html).
45. U.S. Dep't of Commerce, Draft, International Safe Harbor Privacy Principles (Nov. 15, 1999) (available at http://www.ita.doc.gov/td/ecom/Principles1199.html).