|April 15, 1997
Re: Consumer Privacy 1997 -- Comment P954807
The Center for Democracy and Technology submits these preliminary comments and requests the opportunity to participate in Session Two: Consumer Online Privacy and Session Three: Children's Online Privacy of the Federal Trade Commission's upcoming Public Workshop on Consumer Information Privacy.
CDT is a non-profit, public interest organization working to protect and advance civil liberties and democratic values on the Internet. One of our core goals is to develop a privacy framework for the Internet. Towards this end, CDT is working with a broad public interest and industry coalition to develop and implement fair information principles and technical tools that foster individual control over personal information on the Internet.
The emerging global information infrastructure poses both difficult challenges and unique opportunities for protecting individual privacy. CDT believes that new technologies can be designed to enable citizens to exercise greater control over the collection and use of personal information. Through the development and implementation of strong privacy policies, and the design and implementation of technological mechanisms that facilitate individual choice, we believe that interactive digital media can empower citizens to make meaningful decisions about the flow of personal information.
Toward this goal CDT helped to establish the Internet Privacy Working Group shortly after the Federal Trade Commission's Bureau of Consumer Protection's June 1996 public workshop on Consumer Privacy on the Global Information Infrastructure (1996 Workshop). During the June 1996 Workshop a number of participants expressed interest in examining the potential for user-controlled applications to support fair information practices in the online environment.
The recently formed Internet Privacy Working Group is comprised of a broad cross-section of public interest organizations and private industry engaged in commerce and communication on the Internet. Coordinated by the Center for Democracy and Technology, IPWG's mission is to provide a policy framework addressing privacy concerns in the online environment. Towards this end IPWG is developing a language for users to communicate privacy preferences and Web sites to communicate information practices on the Internet. Staff members of the MIT-based World Wide Web Consortium (W3C) have been actively participating in the IPWG effort.1 The work of IPWG will contribute towards W3C's proposed project -- the Platform for Privacy Preferences (P3) -- that will enable computer users to make choices about the flow of their personal information on the Internet. In their capacity as IPWG coordinators CDT staff filed comments and a request to participate in response to questions 2.14 and 3.14.
IPWG's submission in response to question 2.14 and 3.14 reflects CDT's input; therefore, we will not duplicate those comments here.
TECHNOLOGICAL DEVELOPMENTS AND SELF-REGULATION
As the coordinator of IPWG, CDT is aware of both the strengths and limits of its efforts. Technological developments must be viewed within the larger context of other efforts to produce cohesive privacy protections in the online environment. CDT's comments will focus on: the role technological tools such as the Platform for Privacy Preferences can play in advancing fair information practices; the fair information practice principles that are unaddressed by current and anticipated technological developments and self-regulatory efforts; and, the benefits and risks of adopting technological tools to advance privacy. The goal is to compare current efforts to address privacy concerns with traditional self-regulatory models, legislative solutions, and other approaches to securing individual privacy.
2.13 & 3.13 What privacy concerns, if any, are not adequately addressed by existing guidelines and technological mechanisms?
2.14 & 3.14 Has interactive technology evolved since June 1996 in ways that could address online privacy issues? To what extent is it currently available and being used by consumers and commercial Web sites?
It is difficult to predict the ultimate success or failure of self-regulatory policies or technological mechanisms, either independently or in tandem, to address privacy concerns in the online environment. However, it is useful to ask whether current self-regulatory or technological mechanisms for protecting online privacy address what have historically been identified as the two primary shortcomings of industry self-regulation in the privacy area: 1) the lack of oversight and enforcement; and, 2) the absence of legal redress to harmed individuals.2 It is equally worthwhile to examine whether they respond to individuals' information privacy concerns as expressed in the HEW Fair Information Practice Principles3. Finally, it is useful to compare each approach with existing legislative and regulatory approaches to protecting privacy.
Debate over the capacity of self-regulation and market forces to adequately address privacy concerns continues to rage in the privacy and consumer protection arenas. Advocates often take the position that self-regulation is inadequate due to both a lack of enforcement and the absence of legal redress for harmed individuals. Industry tends to strongly favor self-regulation, stating that it results in workable, market-based solutions that respond directly to consumers' needs while placing minimal burdens on affected companies.4 These positions, while in tension, are not mutually exclusive, and, in the past, both have accurately described the self-regulatory process.
While industry associations have put forth guidelines to address a variety of online privacy concerns, implementation at this time seems to be lagging.5 Current guidelines include: the Joint Statement on Online Notice and Opt-Out by the Direct Marketing Association & the Interactive Services Association; the Interactive Services Association's Guidelines for Online Services: The Renting of Subscriber Mailing Lists; and the Coalition for Advertising Supported Information and Entertainment's Goals for Privacy in Marketing on Interactive Media -- all presented at the June 1996 Workshop.
The self-regulatory efforts of industry, while laudable, do not address the full range of individual privacy concerns.6 The lack of oversight and enforcement, and the lack of meaningful redress for aggrieved individuals, will continue to be sources of criticism of these industry efforts. It is important to note that oversight and enforcement on the Internet -- be it of self-regulation or of government regulation or legislation -- is a challenge. At this juncture, the decentralized, open, and global characteristics of the Internet may make it resistant to traditional top-down regulatory regimes.7
In addition to self-regulatory efforts by industry, it is important to consider a number of other non-regulatory attempts to address privacy concerns on the Internet. For example, a number of tools for disabling or deleting "cookies" have appeared on the market. While they represent a technological response to privacy intrusions enabled by another technological mechanism, they have provided individuals a means of addressing a decision made about the flow of personal information without their consent or knowledge. Similarly, tools such as the Anonymizer continue to be used to protect privacy during Web surfing by limiting the collection of transactional data. In addition, a number of companies have begun to independently address privacy concerns. The PreferredMail service offered by America Online allows subscribers to block junk email from sites that have been the subject of vast numbers of member complaints about unsolicited email. A number of online people-finder services are offering "opt-outs" in response to privacy concerns. Lexis-Nexis has begun to offer a suppression service to individuals who do not want their information available through the P-Trak service.8
The development of technical tools that allow individuals to respond to concerns about the flow of personal information is interesting and worthy of study. While these tools do not provide traditional legal remedies, they generally provide a "real" remedy that is uniquely responsive to the problem at issue. Many of them can be independently deployed by the individual and require no reliance on, or agreement with, another party. Nor do they require the individual to engage in a formal adjudicatory process.
Acknowledging the benefits of these technological fixes, it is fair to say that overall they represent a piecemeal response to privacy concerns. For example, the Anonymizer is useful if one wants to engage in secret activities but is not useful if one wishes to reveal information for a single purpose without losing all control over it. Similarly, suppression options do not respond to the initial privacy concern -- the reuse and disclosure of information for unrelated purposes without individual consent.
Technological mechanisms coupled with policy guidance
In addition to these independent "fixes" deployed in response to privacy incursions, two collaborative efforts have emerged to address privacy concerns on the Internet: the Internet Privacy Working Group (IPWG) and eTRUST. Both of these efforts attempt to leverage the unique characteristics of the Internet -- interactivity, real-time communication, and the capacity to facilitate and support end-user decisions about privacy. While distinct in many notable ways, both of these efforts attempt to support privacy policies in the online environment. IPWG and eTRUST both depart from traditional aspects of self-regulatory models and non-regulatory responses.
Internet Privacy Working Group
The Internet Privacy Working Group (IPWG) is attempting to outline a framework for privacy on the Internet through the implementation of fair information practice policies and fostering the development of a technical specification -- the Platform for Privacy Preferences (P3). IPWG's goal is to craft policies and technical tools that give users the ability to make choices about the flow of personal information while supporting seamlessness, the free flow of information, and the development of global commerce.
While in the early stages, IPWG's efforts, if implemented, will promote fair information practices on the Internet by fostering the development of individual empowerment technologies that facilitate the communication of service and content providers' information practices to users, and of users' privacy preferences to service and content providers. In addition, IPWG has indicated a commitment to the implementation of policies, and the development of practices and tools, to ensure adherence to fair information practices.
When fully implemented, the P3 project will embed the core Fair Information Practice Principles of notice and consent into the backbone of the Internet. The P3 project is an effort to provide a simple mechanism for users and providers to communicate about the handling of personal information. Following the PICS model, P3 would be an open standard available to everyone operating on the Internet. By design it would put individuals in a position to make real-time decisions about the collection, use, and disclosure of personal information. Individuals would be able to decide whether, and under what conditions, to disclose personal information -- if at all.
While addressing these core concepts of information privacy, the P3 specification will not address other equally important information privacy principles such as access to and correction of personal information.9
Solutions are meaningless unless they are effective. At this point there is a commitment to ensure adherence to fair information practices. In addition, the P3 notice and choice model has the potential to create consumer demand for information policies. Enforcement is a crucial part of any privacy solution. We believe the FTC would, in appropriate circumstances, have the ability to enforce the privacy policies that entities put forth and to enforce entities' agreements to adhere to individuals' preferences. Where a deviation from or breach of the stated terms occurred, it could be actionable as a deceptive and unfair practice. We believe the FTC has full jurisdiction to ensure that entities operate fairly on the Internet and conform to their stated information practice policies. This would place the FTC in a position to oversee and enforce privacy on the Internet. While the legal redress available through the FTC falls far short of a full private right of action, it is considerably more than is currently available.10 It is important to note that, in sharp contrast to existing legislative protections for privacy, this model would facilitate individual decisions about the flow of personal information.11
eTRUST
eTRUST is a labeling and certification program sponsored by the Electronic Frontier Foundation and CommerceNet. It has set out three tiers of information handling guidelines to which entities operating on the Internet can subscribe. To participate in the eTRUST project a company must first execute a contract with eTRUST. In the contract the company will choose from among the three eTRUST marks:
No exchange: the site will not capture any personally identifiable information for anything other than billing and transactions.
1-to-1 exchange: the service will not disclose individual or transaction data to third parties. Individual usage and transaction data may be used for direct customer response only.
Third-party exchange: the service may disclose individual or transaction data to third parties, provided it explains what personally identifiable information is being gathered, what the information is used for, and with whom the information is being shared.
The company must agree to undergo an audit by an eTRUST-approved auditing firm, and agree to certain other conditions. In exchange the company will be given an icon -- the Trustmark -- to display at its Web site. By clicking on the icon, users will be able to access the policy to which the site subscribes. At this time it is unclear how a proprietary system of labeling and auditing will map onto the Internet, or how such a system can work in this decentralized, global, multi-player environment. For example, will small businesses, non-profits, and government Web sites be able to use and afford the eTRUST system? Will ISPs that rent server space to many organizations be capable of complying with auditing requirements? Does a US-based system work in this global medium?
The eTRUST marks address a narrow subset of the fair information practice principles. While the marks provide individuals with notice of a Web site's practices, they are a one-way application. The system does not allow individuals to independently express privacy concerns. The narrow range of information practice policies from which a Web site can select may not be flexible enough to reflect the varied policies of Web sites, nor granular enough to address the specific concerns of individual users.12 While individuals are given notice of a company's practices, their only options are to play on the Web site's terms or walk away from sites that do not meet their privacy concerns.
The eTRUST model will build upon the traditional self-regulatory model by providing auditing and oversight functions. The mandatory audit directly responds to concerns about a lack of oversight and adherence to industry and other guidelines.
2.15 & 3.15 What are the risks and benefits, to both consumers and commercial Web sites, of employing such technology? What are consumers' perceptions about the risks and benefits of using such technology to address online privacy issues?
CDT will be conducting a short online survey to gain insight into consumers' perceptions about the risks and benefits of technology to address online privacy issues. We will be gathering information more generally on Internet users' privacy concerns. We hope to be able to share this survey information, and our opinion as to the risks and benefits of technology solutions, with the Commission in June. In weighing risks and benefits, it is important to look at affordability and availability. Technology solutions will only assist people in responding to privacy concerns if they are widely available, easy to use, and inexpensive or free. Solutions -- whether they be technological, policy, or a combination of both -- are always subject to the law of unintended consequences. It is likely that technological tools to address privacy concerns will be constantly revised to address new concerns and fix unintended problems.
3.9 Do children's information practices in the online context differ from those implemented in other contexts? If so, describe the differences. Do the risks, costs, and benefits of these practices differ depending on the context?
Children are an increasingly large segment of the Internet user population. The nature of the Internet provides children with unprecedented opportunities for both receiving and sending information. The Internet offers children, like adults, a tremendous opportunity to exchange ideas and participate in a world outside their window. However, the ease with which children can access ideas, reveal information about themselves, and participate in a range of activities without being identified as children, and without parental supervision, has been and will continue to be a subject of concern.
The transactional information generated during a child's visits to Web sites and participation in other Internet activities offers an enormous opportunity to monitor and analyze the child's activities and behavior. Through games, contests, and other lures, online content providers are requesting -- or requiring -- that children provide personal information such as name, address, email, information on likes and dislikes, and information on their families and friends, as the cost of participating. Through both passive and active information collection, online content providers are able to create detailed individual profiles on children which can be used and disclosed for a variety of purposes.
Parents often allow their children to engage in unsupervised activities both within the home -- such as reading a book, or surfing the Internet -- and in the outside world. The interactivity of the Internet raises a unique challenge to parents who are interested in mediating their children's experiences with the outside world.
There is a need to provide parents with tools and policies that respond to their concerns about their children's privacy. Doing so presents some unique challenges. Parents have different concerns. Children come in many ages. Policies and technical tools must address the diversity of values, views, and parenting styles.
In addition, on the Internet it is generally difficult to ascertain the age -- or anything else -- of the individual with whom you are interacting. Information providers on the Internet have no way of distinguishing children from adults -- unless they are actively requesting age. While Web sites that are actively targeting children should tailor policies and practices to the heightened privacy concerns of parents, many entities operating on the Internet have no reason and frequently no desire to know the age of the user -- yet parents' concerns with their children's activities at these sites remain constant. There is a concern that efforts to address children's privacy issues could lead to the imposition of an unacceptably intrusive national ID system. Imposed identification procedures applied to the World Wide Web may limit all Internet users' ability to read, speak, receive information, and interact online under constitutionally-protected conditions of anonymity.
We believe that empowering parents with technological tools to protect their children's privacy, and fostering the adoption of fair information practices by entities operating on the Internet, is a way to meet the privacy concerns of children and maintain the open nature of the Internet. For example, software already on the market, such as Cyberpatrol, as well as specifications under exploration, such as P3, address privacy concerns by restricting access to sites that practice objectionable marketing and information collection techniques, and by allowing parents to prevent their children from revealing personal information such as name, address, and e-mail address to others.
There are a number of tools -- some free -- available to help parents address concerns with children's online privacy. There is a clear need for additional tools and concerted public education on this issue. Without market saturation and consumer use, technology solutions will be inadequate to address the privacy of children in the online environment.
3.10 Do schools, libraries, and other settings in which children may have access to the Web, have a role to play in protecting children's privacy? What role do they currently play, and what role could they play in the future?
Schools and other institutions often stand in the shoes of parents when entrusted with a child's care. Schools make decisions about the curriculum they teach, the field trips they take, and the speakers they invite based on the age of the school population, the community's values, and other factors. It is reasonable to believe that schools will continue to make decisions on behalf of children -- it is a task they are charged with. It is also reasonable to expect that at times schools will ask parents to make decisions about their children's activities -- be they field trips to the great outdoors or perhaps excursions on the Internet.
Libraries present a more complicated question. Libraries are available to both children and adults and are charged with the broad obligation of serving the full community. However, it must be noted that librarians and library boards make decisions about the books to stock on their shelves based on a variety of considerations. Many libraries provide children's areas and children's reading lists and offer other services directed toward children. In this fashion libraries assist parents in supervising their children's exposure to thoughts and ideas. However, unlike schools, libraries are rarely asked or expected to directly make decisions for children -- and in fact many librarians are quite resistant to assuming a monitoring function with respect to children. Because of the historic mission of libraries, it is unclear how parental empowerment tools will be deployed in the library setting.
Many of the Commission's questions regarding "unsolicited commercial email" seek facts or quantitative data. CDT is unable to provide such information, but encourages the Commission to consider the results of surveys such as Voter Telecommunications Watch's survey of users' and ISPs' concerns with "junk email", and the Privacy & American Business survey on "Commerce, Communication, and Privacy in Cyberspace," as well as the comments of ISPs, technologists, associations, and others who are devising solutions to this problem (as they define it) in the absence of legislation.
While unable to provide facts and data, CDT has comments which we believe may provide some useful context and are responsive to a number of the Commission's questions.
2.16 & 3.16 Are privacy or other consumer interests implicated by this practice?
2.17 & 3.17 What are the risks and benefits, to both consumers and commercial entities, of unsolicited commercial email?
Internet users and ISPs (Internet Service Providers) are grappling with the issue of junk email. The label "junk email" has been loosely affixed to email that has one or more of the following characteristics: it is 1) unsolicited; 2) bulk; and/or 3) commercial. Users' and ISPs' complaints about junk email differ slightly. Users tend to focus on either the unsolicited or commercial nature of the mail. They object to receiving unsolicited commercial email about products and services that they have not requested. ISPs, while sharing their users' concerns, lodge unique complaints focused on the inconvenience, cost, and damage to their services' reputations caused specifically by junk email. They are in the unique position of being blamed at both ends of the problem: they are often the mechanism through which such mail is sent by unscrupulous customers, and this same mechanism delivers it to other unhappy customers.
The junk email issue and the solutions that are found to address it, touch on both First Amendment and privacy concerns. An improper policy or technical resolution to the issues of junk email could have a negative impact on individual privacy, online free speech, and the free flow of information on the Internet.
In addition, a number of proposals to deal with junk email, such as system operator filtering, may raise liability issues similar to those faced in both the content and intellectual property areas. Crafting an appropriate solution to the junk mail issue requires full consideration of the privacy, free speech and liability ramifications of alternative proposals.
Free speech concerns
Several of the policy proposals floated thus far have suggested labeling email based upon content. Mandatory author self-labeling systems raise important First Amendment concerns. While commercial speech is given less protection under our constitutional jurisprudence, mandatory self-labeling systems may reach beyond the commercial realm and pose a broader threat to free speech.
Currently, most junk email is not sent from well-established businesses because it is per se reputation-damaging. Anecdotal reports and our observation indicate that fly-by-night and unknown cottage businesses form the bulk of today's junk email senders. Unless junk email loses this stigma it will remain an inherently reputation-damaging practice performed by small-time operators. It is with some irony that CDT notes that providing Internet users an easy technological method of dealing with such email is at this point a market imperative if reputable companies are to make this communication tool useful for their own marketing strategies. Unless users are able to handle "junk email" -- however it is they define it -- in a satisfactory manner, this avenue of advertising will remain unavailable to businesses.
The ability to limit unwanted intrusions into one's private sphere is a core expression of individual privacy. The right to limit intrusions into one's home has been recognized both in case law and in statutes that craft affirmative rights to privacy by placing limitations on phone calls and faxes coming into the home or business (privacy, of course, is not the only basis for these laws). The principle of individual control over both the content one receives and the personal information one reveals is central to CDT's mission. The issue of junk email presents an opportunity to explore the development of user-based tools to facilitate individualized control over information coming into the home. Unlike content on the World Wide Web, email is not, at this time, under the control of the user. Individuals have limited ability to control the content that arrives in their inbox, although they may delete messages based on sender or subject line without ever examining the entire document contents. Privacy advocates, consumers, Internet service and content providers, and marketers have all identified this as an issue that must be addressed. We look forward to a constructive discussion on this issue and are hopeful that useful solutions will be presented, discussed, and possibly acted upon.
Thank you for the opportunity to share our views on the important topic of individual privacy online. We hope to have the opportunity to participate in the upcoming Workshop. Please contact us if we can provide additional information or assistance.
1 Currently a W3C proposal for a formal project is under membership review. (See WIC submission)
2 It is worth noting that advocates have voiced similar concerns about the lack of effective oversight and enforcement provisions in existing legislative privacy solutions, which often lack private rights of action or significant penalties, and/or require the individual to show actual harm or damages to seek redress.
3 In 1972, then-Secretary of HEW Elliot L. Richardson appointed an Advisory Committee on Automated Personal Data Systems to explore the impact of computerized record keeping on individuals. In a report published in 1973, Records, Computers and the Rights of Citizens: Report of the Secretary's Advisory Committee on Automated Personal Data Systems, the Advisory Committee proposed a Code of Fair Information Practices. The 1973 Code of Fair Information Practices supplied the intellectual and statutory framework for the Privacy Act of 1974 and served as a model for privacy legislation in this country and worldwide. The basic principles of the 1973 Code, as published in the Advisory Committee's Report, are:
There must be no personal data record-keeping systems whose very existence is secret.
There must be a way for an individual to find out what information about him is in a record and how it is used.
There must be a way for an individual to prevent information about him that was obtained for one purpose from being used or made available for other purposes without his consent.
There must be a way for an individual to correct or amend a record of identifiable information about him.
Any organization creating, maintaining, using, or disseminating records of identifiable personal data must assure the reliability of the data for their intended use and must take precautions to prevent misuse of the data.
Report of the Secretary's Advisory Committee on Automated Personal Data Systems, Records, Computers and the Rights of Citizens, US Dept. of Health, Education & Welfare, July 1973 at xxiii-xxvi.
4 For example, from the Direct Marketing Association's Personal Information Protection Guidelines: "These Guidelines are also part of the DMA's general philosophy that self-regulatory measures are more desirable than government mandates whenever possible."
5 At the Federal Trade Commission's (FTC) workshop on Privacy in Cyberspace (June 4-5, 1996), the DMA, in conjunction with the Interactive Services Association and the Children's Advertising Review Unit of the Council of Better Business Bureaus, released similar policy statements on the collection of information from children online. Both advocated: 1) providing notice of information collection and the marketing purpose behind it; 2) limiting the collection of data from children; and 3) supporting parents' ability to limit data collection on children. Unfortunately few content and service providers operating on the Internet have heeded these guidelines. As the Center for Media Education (CME) and the Consumer Federation of America (CFA) aptly point out in a joint letter to FTC Chairman Pitofsky (November 25, 1996), "five months later. . . companies are continuing to collect personally identifiable information from children at their Web sites without disclosing how the information will be used or who will have access to it. . ." In their letter, CME and CFA provide a long list of Web sites aimed at children that fail to meet basic notice standards -- a long-standing DMA principle, and a core component of the draft guidelines DMA and ISA released at the FTC workshop.
6 Most importantly, they fail to address a number of the HEW Fair Information Practice Principles, such as:
7 As countries are discovering in the First Amendment area, enacting a law limiting or criminalizing specific content domestically has little effect on citizens' ability to access the objectionable material.
8 P-Trak is a people finding service. It is not available via the Internet.
9 Providing access and correction rights to transactional data raises some complex policy, technical, and security issues. Developing policies that address the privacy concerns inherent in the collection of personally identifiable data -- even transactional data -- while acknowledging varying access rights that do not unintentionally disrupt normal system operations is a complex task.
10 Traditionally civil liberties and civil rights advocates have pushed for private rights of action for violations of individual's interests.
11 Large quantities of personal information are unprotected by privacy rules and vulnerable to misuse and abuse. Existing US law rarely gives individuals any direct say in how information about them is collected, used, and disclosed. For example, the Right to Financial Privacy Act (RFPA) only limits government access to personal bank records, leaving the private sector's use of personal financial information unfettered. The Fair Credit Reporting Act, while establishing some limits on the private sector's use of credit information, does so by legislating so-called "permissible purposes" for which industry can use and disclose credit information rather than crafting a consent mechanism that would allow the individual to be the arbiter of access to her own information. Even where there has been an attempt to codify fair information practices through federal statutes, the results have generally fallen far short of the desired goal of privacy advocates, which is to have individuals control the collection, use, and disclosure of personal information.
12 A simple example: users may be interested in setting different rules, or Web sites may have different practices, for handling different data elements -- for example, email address v. billing information.