FTC Advisory Committee on Online Access and Security
Final Report - First Draft
03 May 2000
The purpose of the Advisory Committee on Online Access and Security ("ACOAS" or the "Advisory Committee") is to give advice and recommendations to the Federal Trade Commission ("FTC") concerning providing online consumers reasonable access to personal information collected from and about them by domestic commercial Web sites, and maintaining adequate security for that information.
In particular, the Charter of ACOAS directs that the Advisory Committee "will consider the parameters of reasonable access to personal information and adequate security and will present options for implementation of these information practices in a report to the Commission." (Charter of the Federal Trade Commission Advisory Committee on Online Access and Security ("Charter"), attached hereto as Addendum A.)
This is the final report of ACOAS. The Advisory Committee considered access and security as it relates to online information. Its work relates to the online world and should not be seen as a specific road map for off-line records.
Discussion was wide-ranging, taking place in four formal meetings of ACOAS and in numerous subcommittee working groups not held in the presence of any FTC official. All substantive proposals were made available to members of ACOAS and to the public by being promptly posted on the FTC's Web site for ACOAS, http://www.ftc.gov/acoas.
The advice of this Advisory Committee and the options presented are in the context of implementation of Fair Information Practices by commercial Web sites. The Charter neither requested nor precluded suggestions for legislation or mandatory regulation. Access to private sector records, in the view of some on the Advisory Committee, is not yet appropriate for legislative recommendation. Others on the Advisory Committee believe that there should be immediate legislative implementation of some of the options. It is, therefore, not possible for this Committee to reach a consensus on legislative recommendations.
The context of this Committee's consideration was not to provide consensus options for legislation, mandatory regulation, or self-regulation. Rather, the Advisory Committee presents a range of options that have been identified as ways to implement the Fair Information Practice principles of access and security. The report is silent on whether these recommendations should be implemented voluntarily, by industry self-regulation, or by legislation. Each access option has support from at least one Committee member but does not represent a majority position; no consensus was therefore reached on any access option. There was consensus on one security recommendation. Each option is accompanied by a brief discussion of its pros and cons.
To some on the Advisory Committee, the options identified here should be further examined, tested, and applied before they are enacted with the force of law. To others, these options, or at least some of them, provide a road map to legislative action.
The value of this report is that it reflects a review of the issues of access and security by a wide range of experts, practitioners, and advocates from all sides of the issue. It provides an analysis of the issues and an identification of options that should be helpful to the FTC in its continued efforts on privacy and the application of Fair Information Practice principles.
"Access" to personal data is frequently invoked as a fundamental part of any privacy program. But as the Advisory Committee's deliberations revealed, this apparently simple concept hides layers of complexity - and in many cases disagreement. This Section seeks to unpack the concept of access in a way that helps Web sites and policymakers understand the difficult questions that must be answered in fashioning an access policy.
We first identify the questions that must be answered in defining access - does it mean only an ability to review the data or does it include authority to challenge, modify, or even delete information?
We next ask another apparently simple question, "Access to what?" Businesses gather a wide variety of data from many sources. Sometimes information is provided by the individual consumer, sometimes by a third party. Sometimes it is derived by the business itself, using its own judgment or processes. Sometimes the data is imperfectly personalized - it relates to a computer that may or may not be used only by one person. How much of that data is covered by the access principle?
With those concepts in mind, the Committee lays out four illustrative options that show the many different ways in which the access principle could be implemented - ranging from "total access" to a narrow access option aimed at ensuring correction of important information.
Finally, this section addresses three additional questions that must be answered before any access policy can be implemented.
Access is the individual's ability to view, edit(1), and/or delete(2) his or her personally identifying information. The scope of access will vary across the access options put forth in the following sections, and with other considerations such as whether the Web site in question is a covered entity and the type of authentication deemed appropriate.
Both consumers and businesses have a shared interest in the provision of reasonable access to consumer personal information. Reasonable access benefits individuals, society and business due to the openness and accountability it helps to promote. If done properly, the provision of access can also help reduce the costs to businesses and consumers of improper decision-making due to poor data quality. Moreover, increased access may help promote consumer trust and deeper customer relationships, which benefit both consumers and businesses. However, the manner in which to provide access and to what degree access should be provided are complex questions given the numerous types of non-personally identifiable and personally identifiable information, the "sensitivity" of that information, the sources of that information, and the various costs and benefits associated with providing access.
The method by which access is provided should be consistent with the information's storage and use by the business. For example, if the business stores the information online such that it is instantly available for use (e.g., as part of an online transaction processing system or a Web-based e-commerce system), then instantaneous online access should be provided to consumers via an appropriate online terminal (e.g., Web browser, ATM, telephone voice response unit).
Should the ability to view, edit or correct data vary with the use of the data?
Some members of the Committee thought that the use of the data should not be a factor in determining whether to grant a consumer the ability to view, edit, or correct data maintained about them. Although the way data is used is an important consideration, making use the test is a slippery slope: what is collected today and not used might be used in the future, and a use or decision that some consider unimportant others might consider very important.
Should the provision of access be determined in terms of the type of data?
For purposes of this discussion, we felt it was best to group data into three broad classes, namely:
There is a case for not having to provide a customer access to inferred data, as this information may be the result of a proprietary model that provides the company a competitive advantage - e.g., an indicator of a customer's future purchase behavior. The principal counterargument arises when the derived data is used to make a decision about the customer that would result in an important denial of services - e.g., the granting of a loan. It should be noted, however, that consumers might be more interested in information that is derived about them than in the detailed information used to derive it in the first place.
With specific regard to correction, some Committee members believed that ascertaining whether inferences are right or wrong would be difficult and costly. Moreover, many inferences are not presumed by the business drawing them to be correct; they are useful for drawing general conclusions rather than conclusions of fact, and therefore this category of information is not practically correctable by the consumer. Other Committee members believe this is information formulated about consumers and used in ways that affect their interactions with businesses. These members believe consumers have a strong interest in being able, at the very least, to view all the information about them in the hands of businesses.
The costs of providing access to other types of information, such as click stream or log data, could be considerable. In addition, some of the above options would create substantial authentication hurdles.
There are costs and benefits to both businesses and consumers that must be considered here. Consumers face a higher cost in not having correct data for certain types of information (credit information vs. marketing information, for instance). Some Committee members believe that there is a benefit in providing access in general to all types of information held by all businesses, and these benefits must be weighed against the costs.
Means of Access
Access should be provided via a means appropriate for the type of information and consistent with its storage and use by the business. If the business stores the information in online storage such that it is instantly available for use by the business (e.g., as part of an online transaction processing system or a Web-based e-commerce system), then instantaneous online access should be provided to consumers via an appropriate online terminal (e.g., Web browser, ATM, telephone voice response unit).
If the business stores the information in storage for processing by batch processing systems (e.g., a batch billing system), then the information should be available to consumers via a frequently (e.g., once per week) scheduled batch process (e.g., a report run at regularly scheduled intervals and mailed to the consumer).
If the business stores the information in offline storage (e.g., magnetic tapes stored offsite), then the information should be available to consumers via an ad-hoc batch process (e.g., scheduled on demand).
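The three storage tiers above amount to a simple mapping from how data is stored to how access is delivered. The sketch below illustrates that mapping; the class and channel names are invented for illustration, not drawn from the report.

```python
from enum import Enum, auto

class StorageClass(Enum):
    """How the business stores the information (the report's three tiers)."""
    ONLINE = auto()   # instantly available, e.g., an OLTP or e-commerce system
    BATCH = auto()    # handled by scheduled batch jobs, e.g., a billing system
    OFFLINE = auto()  # offsite media, e.g., magnetic tapes

# Hypothetical mapping from storage class to the access channel the report
# deems consistent with that storage and use.
ACCESS_CHANNEL = {
    StorageClass.ONLINE: "instantaneous online access (Web browser, ATM, voice response)",
    StorageClass.BATCH: "regularly scheduled report (e.g., weekly) mailed to the consumer",
    StorageClass.OFFLINE: "ad-hoc batch process scheduled on demand",
}

def access_channel(storage: StorageClass) -> str:
    """Return the access mechanism matching how the data is stored and used."""
    return ACCESS_CHANNEL[storage]
```

The point of the mapping is that the consumer's access path mirrors the business's own retrieval path, so providing access need not require new retrieval infrastructure.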
An important first step in considering whether, how, and under what circumstances to provide individuals with access to information is defining the information at issue. Defining the term personal information is central to the task of considering various options for providing access. The Committee considered several approaches to defining the information to be considered personal information for the purpose of providing individuals access. The options below are listed in descending order from broad to narrow. As discussed in the access options, access to data covered by each of these definitions could still be limited under the "default" or "case by case" options due to mitigating circumstances. The options below illustrate several approaches to defining the scope of information under discussion. The charts provide for easy comparison between the various options. Green indicates that the information is included in the definition. Red indicates that the information is excluded from the definition.
Access should be provided to:
Information maintained by a business and attached to the individual or a proxy for the individual.
This definition includes all information regardless of the medium (online v. offline), method (passive v. active), or source (data subject v. third party) from which it is obtained. It covers information tied to traditional identifiers such as names and addresses, and it also anticipates online identifiers - such as mobile device or other unique identifiers, whether global or local - that provide the same ability to collect information about a particular individual and to use it to make decisions that affect that individual in the online environment. This definition reflects the concepts that 1) information need not be unique to be considered capable of identifying an individual; and 2) the concept of "identifying" is changing rapidly in the online environment.
Information maintained by a business about an individual that identifies him or her using a traditional identifier.
This definition includes all such information regardless of the medium (online v. offline), method (passive v. active), or source (data subject v. third party) from which it is obtained. It would provide access to all information tied to a traditional identifier such as an email address or a physical address, but not to information tied to a unique numeric identifier in the absence of additional identifying information. For example, click stream data tied to a unique number would not meet this definition unless it were associated with a name, email address, or other traditional identifier.
Information collected online about an individual that identifies him or her using a traditional identifier.
This definition further narrows option two by limiting the medium of collection to "online." Information collected by the organization through offline methods is not covered. However, data actively or passively collected online would be covered.
Information collected online from an individual that identifies him or her using a traditional identifier.
This definition further narrows option three by limiting: 1) the medium of collection to "online"; and, 2) the source of collection to the individual. Information collected by the organization through offline methods is not covered. Information collected online or offline from a source other than the individual would not be covered. However, data actively or passively collected from the individual online would be covered.
There are situations where providing individuals with access to information (depending on the definition selected above) conflicts with other interests. The Committee identified and considered several interests that in some members' opinions merit consideration in access determinations.
Inferred or derived data is information that the business has not "collected," either passively or actively, about the individual, but rather has inferred from observed behavior. It comprises the assumptions or conclusions a business makes about an individual, not the factual record of the individual's actions or behavior.
Briefly, inferred data can be defined as information gathered from sample data, not the data subject, that is calculated to result in a value applied to the data subject. Derived data can be defined as information gathered from the subject that is calculated to result in a value applied to the data subject.
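The distinction between the two definitions can be made concrete with a small sketch. Both functions and their field names are hypothetical examples, not anything specified in the report.

```python
# Derived data: a value calculated from information gathered from the
# data subject - here, the subject's own order history.
def average_order_value(own_orders: list) -> float:
    if not own_orders:
        return 0.0
    return sum(own_orders) / len(own_orders)

# Inferred data: a value applied to the subject but estimated from sample
# data about *other* customers - here, a hypothetical segment-level model
# built from aggregate churn rates.
def churn_propensity(subject_segment: str, segment_rates: dict) -> float:
    return segment_rates.get(subject_segment, 0.0)
```

In the first case the input comes entirely from the data subject; in the second, the value attached to the subject is computed from sample data the subject never provided, which is why the access and correction questions differ for the two categories.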
The disclosure of inferred or derived data raises important considerations. Advocates of providing access to such data argue that it is used to make decisions about individuals and should therefore be available to individuals. Critics of providing access to such data argue that disclosing the assumptions or conclusions a business makes undermines competition by inviting competitors to attempt reverse engineering to unearth proprietary operations and allowing competitors to free-ride off the analytic work of rivals.
If we accept the concept that information can be personal to an individual without being unique to that individual, we must grapple with the privacy considerations of providing access to such data. In some instances we tolerate "over-disclosure": with telephone numbers, for example, all calls made from a number are disclosed to the individual named on the account, even though in multi-person homes this discloses the calls of other family members. While providing access to data associated with a computer or browser could be considered loosely analogous to the phone number situation, the information collected about an individual's use of the Web can on its face reveal more about the individual than the numbers dialed. In the online environment it is therefore important to consider the potential risks that access to such data poses to the privacy of others, and to design access methods that promote privacy in all respects.
Should access obligations be extended to all parties with whom information has been shared?
The Committee felt that there were only three reasonable alternatives regarding which entities could be required to provide customers access to data maintained about them:
The Committee generally agreed that corporations should provide access to the data held by their agents (as defined above). However, several members of the Committee thought that managing other third parties would be unduly burdensome, and that consumers would be better protected by requiring companies to provide notice of the parties with whom they will share information. Other members of the Committee held that relying on notice places an undue burden on consumers, who are unlikely to be aware of the existence of such third parties, let alone know how to contact those companies and exercise access.
Still other members of the Committee believed the issue depended on whether the parent and/or subsidiaries are using the information. If they are, they should make it accessible and protect it; if not, no access obligation applies. With respect to "information intermediaries," it depends on how they treat and handle the data. If they use the information, view it, and permanently store it, then they should make it accessible and protect it. If not, access is not required.
Is there an obligation to propagate corrections to incorrect data to other entities?
Whether or not third parties provide access to personal information, it is still possible for the original data collector - assuming it does provide the ability to access and correct data - to propagate any corrections to those third parties. The Committee presented the following three options: (a) no obligation; (b) propagation when reasonable; (c) companies should always propagate corrections when passing information on to third parties that are receiving that information for the first time.
It would be desirable for a company when correcting errors to propagate these corrections to other entities, but it is recognized that the company may not be in a position to know all the entities that are currently maintaining related information about that individual, nor the state of that data (whether it has already been corrected or continues to be in error).
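The "when reasonable" reading can be sketched as a triage step: before sending anything, the collector first works out which downstream recipients are even relevant and reachable. The record structure below is hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Recipient:
    """A third party that previously received data from the collector."""
    name: str
    reachable: bool            # does the collector still have a channel to it?
    received_fields: set = field(default_factory=set)

def propagate_correction(recipients, field_name):
    """Identify which downstream recipients a correction can be sent to.

    Only parties that actually received the corrected field are relevant;
    of those, some may no longer be reachable - the collector may not know
    who currently holds the data or whether their copies are still in error.
    Returns (updatable, unreachable) lists of recipient names.
    """
    updatable, unreachable = [], []
    for r in recipients:
        if field_name not in r.received_fields:
            continue  # this party never received the field; nothing to correct
        (updatable if r.reachable else unreachable).append(r.name)
    return updatable, unreachable
```

The unreachable list is the practical limit the paragraph above describes: the collector can flag, but not fix, copies held by parties it can no longer contact.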
Should the access obligations of third parties vary with the use of the data?
With regards to different possible uses of data, the Committee puts forth the following four options:
However, some members of the Committee thought that, as with the original data collector, the use of the data should not be a factor in determining whether to grant a consumer the ability to view, edit, or correct data maintained about them. Although the way data is used is an important consideration, making use the test is a slippery slope: what is collected today and not used might be used in the future, and a use or decision that some consider unimportant others might consider very important.
Should the access obligations of third parties vary with the type of data?
With regard to different possible types of data, the Committee puts forth the following three categories:
There is a case for not having to provide customer access to inferred data, as this information may be the result of a proprietary model that provides the company a competitive advantage - e.g., an indicator of a customer's future purchase behavior. The costs of providing access to other types of information, such as click stream or log data, could also be considerable. The principal counterargument arises when the derived data is used to make a decision about the customer that would result in an important denial of services - e.g., the granting of a loan. It should be noted, however, that consumers might be more interested in information that is derived about them than in the detailed information used to derive it in the first place.
Consumers face a higher cost in not having correct data for certain types of information (credit information vs. marketing information, for instance). Some Committee members believe that there is a benefit in providing access in general to all types of information held by all businesses, and these benefits must be weighed against the costs.
Ease of access includes issues surrounding both whether access fees should be allowed, and the degree of effort required by the data access provider to ensure that the information can be easily accessed, understood and corrected by the consumer. It also includes non-economic costs of access, such as potential risks to privacy.
The Committee discussed whether businesses should charge consumers a fee for access. The range of options identified includes:
Proponents would argue: The arguments for charging consumers a fee for access include:
Opponents would argue: The arguments against charging consumers a fee for access include:
The effort required of the data access provider to ensure that information is easily accessible by consumers might include:
Proponents would argue: The arguments for providing ease of access include:
Opponents would argue: The arguments against providing ease of access include:
As many companies that hold personal information are part of a larger corporate entity that may possess other data through different subsidiaries, there is a concern that access to information held by the parent company may force centralization of previously separated information. In some cases, combining and centralizing information poses an increased threat to personal privacy.
Centralizing and linking personal information is not a purpose or goal of access. The most expansive interpretation of access should not have the indirect effect of creating a new file or record on an individual. While concerns about avoiding centralization must be heeded, they should not prevent parent companies from implementing procedures that ease the ability of consumers to access information.

This can be accommodated by providing central points to serve consumers' access requests without actually centralizing the maintenance or storage of data. For example, parent companies could create a central page (phone number, etc.) through which consumers could file requests to be processed by the various subsidiaries, or through which they could easily identify and make requests directly to the subsidiaries. Subsidiaries may have different pieces of personal information in their records.

Even this simple integration of information might increase the vulnerability of an individual's information to compromise - e.g., a bad actor who guesses the password can get access to all the customer's private information from one convenient location - and therefore must be accompanied by a risk assessment and installation of appropriate security. Also, such a central point may be difficult to manage for companies that regularly acquire and divest subsidiaries.(3)
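The "central point without centralized storage" idea amounts to routing a request rather than merging databases. A minimal sketch, with hypothetical subsidiary names and record layouts:

```python
class Subsidiary:
    """Keeps its own records; nothing is copied to the parent."""
    def __init__(self, name, records):
        self.name = name
        self._records = records  # customer_id -> data, stored locally

    def handle_access_request(self, customer_id):
        return self._records.get(customer_id)

class ParentAccessPoint:
    """Central page or phone number: routes requests, retains no data."""
    def __init__(self, subsidiaries):
        self._subsidiaries = subsidiaries

    def route_request(self, customer_id):
        # Each subsidiary answers from its own store; the combined answer
        # goes back to the consumer and is not kept centrally.
        results = {}
        for s in self._subsidiaries:
            data = s.handle_access_request(customer_id)
            if data is not None:
                results[s.name] = data
        return results
```

Because the parent only forwards requests and assembles responses, no new combined file on the individual is ever created; the authentication and security concerns noted above attach to this single entry point rather than to the data stores themselves.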
The Committee wishes to emphasize the difference between authentication and identification. As we seek to provide individuals with access to personal information we must not move toward increased identification of individuals.
Maintaining the ability of individuals to be anonymous on the Internet is a critical component of privacy protection. Access systems should not require identification in all instances. Biometrics raise additional privacy concerns that must be explored and addressed. Finally, third-party authentication systems raise important privacy concerns, because they create additional records of individuals' access requests. Inserting a third party into the relationship creates an additional opportunity (and at times a responsibility) to collect and maintain information about the individual's interactions. What policies govern these entities' use of personal information? On the other hand, third parties - intermediaries - can also play a role in the protection of identity. Several companies have established themselves as intermediaries that act as protectors of identity and privacy between the individual and other entities.
The "default to consumer access" approach works from the presumption that consumers should have access to their personal information. The presumption of access is limited where considerations such as the privacy of another individual, the proprietary nature of certain information, or the cost of providing access outweigh the individual's interest in access.
The "default to consumer access" approach recognizes that consumer access to personal information serves multiple purposes, including but not limited to ensuring accuracy. By promoting openness, access promotes awareness of business information practices, aids the compliance process, and promotes greater trust between businesses and their customers. The openness about data collection promoted by the "default to consumer access" approach may increase consumer awareness of the trustworthiness and responsibility of the businesses that collect information about them, or it may lead consumers to call for more limited collection of information. In this regard, the "default to consumer access" approach may act to promote privacy-sensitive business practices by encouraging businesses to limit the information they collect about customers.
The over-arching principle of the "default to consumer access" approach is that:
There are exceptions to this rule. Information is accessible to a consumer only if providing access involves steps the business already takes on a regular basis with respect to the information, or steps the organization is capable of taking using its regular procedures. This limitation is designed to ensure that businesses need not build new systems that compile personal information solely to provide access. Creating new systems that link personal information with transactional and other data - data that is not on its own tied to the individual or used to make decisions about the individual - would create additional privacy risks. However, this means that some information that may be defined as personal information may be out of reach under this option.
In addition, personal information is not retrievable in the ordinary course of business if retrieval would impose an "unreasonable burden" on the business. This is a narrow, but important, exception to access. It allows for a purpose or cost-benefit analysis in rare situations where retrieving the information would be very costly or disruptive to the business and the information at issue is of marginal significance. It is here that the sensitivity of the data, the uses of the data, the purpose of the request, and similar factors could be considered. If an organization uses this exception to limit access, it must refer the individual to the provisions in its privacy notice that discuss its data collection, use, and consent/choice policies, or provide the individual with equivalent information.
Proponents of this approach would argue:
Opponents of this approach would argue:
The "total access" approach works from the presumption that consumers would benefit from having access to all of their personally identifiable information in the possession of commercial Web sites, and that no information should remain off-limits or confidential. Such access not only allows consumers to verify the accuracy of the data; it also places them in a position to know how their personal information is collected and used. In keeping with the purpose of providing consumers as much access as possible, businesses would provide initial access for free, while charging for repetitive access requests or terminating access upon unduly repetitive requests.
The "total access" approach should be interpreted to only apply to existing records. For example, off-line information in the possession of a data collector but not yet linked or joined with online information would not be subject to access. In addition, creating more comprehensive records of individuals should not be done in order to establish "total access".
The over-arching principle of the "total access" approach is that:
Proponents would argue:
Opponents would argue:
A third approach would be to treat different information differently depending on a calculation that takes into consideration, among other things, the content of the information, the holder of the information, the source of the information, and the likely use of the information. This approach is necessarily more complex, recognizing that whether access is appropriate depends on a variety of factors. Different sectors, record-keeping systems, and types of data raise different issues. The challenge, therefore, would be to develop an administrable set of rules.
Unlike the Default option that is premised on a presumption of access, under the case-by-case or sectoral approach there is no presumption for or against access. Rather, the access inquiry requires an analysis of the relevant factors. This approach would differ from others under consideration in that it most likely would not afford access to information that, if inaccurate, is unlikely to have an adverse effect. This determination will depend both upon the nature of the data in question (e.g., information regarding children) and the record-keeping system in question. On the other hand, it is clear that under this third approach, there would be categories of data to which access is more limited than in the other approaches. For example, inferred data, "non-factual data" or internal identifiers may be less accessible under this approach than under the other approaches.
Proponents would argue:
Opponents would argue:
Although an approach involving a default access rule would be easier to apply, it may not reflect the real purposes behind providing access. As a substantial number of participants in the larger committee meetings and this subgroup indicated, the purpose for providing access may be more limited than promoting consumer awareness. Access itself may not enhance "consumer privacy" per se, but rather ensure the accuracy of data and protect against adverse decisions based upon incorrect data. Indeed, this notion of access also is consistent with the Online Privacy Alliance Guidelines, which address access to help assure the accuracy of information.
In fact, the purpose may be as limited as giving consumers an opportunity to correct erroneous data, rather than an opportunity simply to know what is out there. A sectoral or case-by-case approach may also allow a more precise weighing of whether access to a particular type of data is warranted, in light of the nature of the data and the sector involved, the consumer's reasonable expectations about the data, and the costs of providing access. As with all options, the cost of providing access must be factored into the analysis. As the purpose for the data use becomes more significant, however, cost may become less of a factor.
This approach is consistent with U.S. privacy laws, which adopt a sector-by-sector approach to data privacy based upon the sensitivity of the information and whether it is likely to be used in a way that could adversely affect the data subject. For example, access and correction are the norm with regard to financial information used to make credit granting and employment decisions (under the Fair Credit Reporting Act), and medical record information.
In addition, according to the U.S. Privacy Protection Study Commission, which addressed access issues in the mid-1970s, decisions about access to information are unique to the particular information system.
This approach would assign different access rights to different sectors or types of data. Depending on the number of factors in the calculation, the permutations could be extensive. This approach would afford access to all sensitive data such as financial information, health information, or information relating to children, and other data in sectors of the economy that may affect individuals in a materially adverse way if it is inaccurate. In these instances, it would yield the same result as the Default Rule and Total Access approaches.
This option takes a relatively narrow view of what constitutes "reasonable access." It is at the opposite end of the spectrum from the "total access" approach of Option 2.
The approach begins by asking why access to personal data is important to consumers. One reason for allowing access - correcting errors - is of interest to both the individual and the Web site. If the Web site uses personal data to grant or deny some significant benefit to consumers, then errors in the Web site's files could cause real harm to the consumer. Giving the consumer access to the data allows the consumer to challenge or correct errors. Both the consumer and the Web site have an interest in the accuracy of such data, so allowing access and correction helps both parties. Thus, even if allowing access increases the Web site's costs, as it often will, and even if the costs cannot be passed on to consumers, the Web site itself will get some benefit from access designed to improve the accuracy of important data.
Error-correction is not the only justification for allowing access. A second justification goes under a variety of headings - "accountability," or "education," or even "consciousness-raising." These labels reflect a view that, if consumers can see the files maintained about them, Web sites will be more cautious about gathering sensitive or unnecessary information. An even more sweeping justification for access assumes that individuals retain some rights to data about themselves, even in the hands of others - "That's my data, and I have a right to see it."
In these justifications, the interests of Web site and consumer are no longer aligned. Maintaining an access system has costs, and the Web site gets relatively little benefit from these costs. While the Web site may benefit in a general way from consumer education, it can educate consumers less expensively by providing a detailed notice about what data it collects, rather than maintaining an individual access system.
The "access for correction" option treats correction as the touchstone for defining reasonable access. While the other justifications have force, this option treats them as insufficient to outweigh the costs of access - in money for the Web site and in privacy risks for consumers. Under this option, a Web site would grant access to personal data in its files only after answering two questions in the affirmative: (1) Does the Web site use personal data to grant or deny significant benefits to an individual? (2) Will granting access improve the accuracy of the data in a way that justifies the costs?
The first question resolves many of the issues that are more difficult to resolve under the other options presented here. Examples of information that is used to grant or deny significant benefits include credit reports, financial qualifications, and medical records. By focusing on whether the Web site uses personal data to grant or deny a significant benefit, the approach would largely exclude personal data that is not tied closely to an individual, because it is unlikely (though not impossible) that significant benefits will be granted anonymously. Similarly, the approach calls for access only to information that is collected and retrieved in the ordinary course of business. With some qualifications set out below, however, the approach would allow access to information that has been provided by a third party, as long as the information is used to grant or deny significant benefits.
The second question - whether allowing access to correct errors justifies the cost - raises a variety of possible exceptions to access. Inferred data, such as judgments made about the consumer by third parties or Web site employees or even expert information systems, are not usually susceptible to direct correction, although the underlying data can be corrected. Access is not justified if it would reveal trade secrets or would compromise the confidential communications of Web site employees or third parties. In general, as the likelihood of improving the accuracy of personal data declines and the cost of providing secure access increases, the cost-effectiveness of access also declines and the case for access grows weaker.
Proponents of this option would argue:
Opponents of this option would argue:
Complex as the access issue may seem at this point in our report, it is about to become even more so. For unlike the other Fair Information Practice principles, the access principle sometimes pits privacy against privacy.
Simply stated, the problem is this - On the one hand, privacy is enhanced if consumers can freely access the information that commercial Web sites have gathered about them. On the other hand, privacy can be destroyed if access is granted to the wrong person - an investigator making a pretext call, a con man engaged in identity theft, or, in some instances, one family member in conflict with another.
How can consumers get the benefits of access to their personal data without running the risk that others will also gain access to that data? The answer is to employ techniques that adequately authenticate consumers - that provide some proof of the consumer's identity or authority before the consumer is given access to personal data.
But how much proof is enough? If the consumer must produce three picture IDs, privacy will be protected, but access will be difficult. If the standard of proof is set too low, access will be encouraged, but the risk of compromise will grow. This section of our report attempts to illustrate the ways in which these competing interests can be addressed. In the end, it will be clear that there is no single answer to the dilemma described above. The proper level of authentication depends on the circumstances.
To take one example, the level of authentication may depend upon whether the consumer will simply view the information or will correct or amend it as well. Allowing the wrong individual to view someone else's data is a violation of privacy - and may lead to additional harm ranging from embarrassment to loss of employment - but allowing the wrong person to "correct" that personal information can result in devastating consequences. For example, some criminals have gained access to an individual's credit card accounts by filling out a change of address card with the post office and diverting the individual's credit card statements to another location. With access to the individual's bank statements and credit card bills the crook has ample information to impersonate the victim. (The Postal Service has recently initiated changes to make this more difficult.(4)) For this reason, where correction or amendment is provided, an audit trail should be maintained to aid in identifying potential problems.
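The audit-trail safeguard described above can be illustrated with a brief sketch. The field names, identifiers, and record structure below are hypothetical; the point is only that every correction is appended to a log, never overwritten, so that a fraudulent "correction" can be traced after the fact:

```python
import datetime

# Illustrative sketch of an audit trail for consumer corrections.
# All field names and values are hypothetical.
class AuditedRecord:
    def __init__(self, data):
        self.data = dict(data)
        self.audit_trail = []  # append-only log of changes

    def correct(self, field, new_value, requester_id):
        # Record who changed what, and when, before applying the change.
        self.audit_trail.append({
            "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
            "requester": requester_id,
            "field": field,
            "old_value": self.data.get(field),
            "new_value": new_value,
        })
        self.data[field] = new_value

record = AuditedRecord({"address": "12 Oak St"})
record.correct("address", "99 Elm St", requester_id="user-4821")
```

Because the log retains the prior value alongside the requester's identity, a Web site can reconstruct the sequence of amendments when investigating a suspected identity-theft scenario of the kind described above.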
In judging the proper level of authentication, it is necessary to bear in mind that the risk of liability will heavily influence the Web site's choices. Sooner or later, whatever level is set, the Web site's authentication requirements will turn out to be either too strict or not strict enough. A business runs a risk of liability if it allows the wrong person to access personal information. Although it is not clear what specific remedy an individual might have under existing law, the lack of certainty regarding liability presents a problem for both individuals and businesses. If liability is strict and put upon businesses, they may raise the barrier to access very high, burdening individuals' access in an effort to avoid liability. On the other hand, if existing legal remedies do not provide sufficient penalties for inappropriate access, individuals' privacy may suffer. How to strike an appropriate balance that spurs good practices, encourages the deployment of robust authentication devices, and does not overly burden access is the question.
Another question that must be asked is how the effort to achieve strong authentication will affect anonymity on the Internet. Anonymity is a feature of modern life and one we have all relied upon to provide a level of privacy. Pseudonymity (the use of a pseudonym such as a "screen name") is a step away from anonymity but provides important privacy protections for identification information in the online environment. Authentication is in some cases the opposite of anonymity. It is important to preserve both anonymity and pseudonymity in our quest for authentication tools and procedures. Technologies such as biometrics might provide very strong authentication, but at a cost in terms of anonymity. Similarly, relying on data held by a third party to authenticate a consumer may provide that third party with even more details about the consumer than before. This means that access and authentication raise questions about the privacy policies of third parties that may be involved in the authentication process. At the same time, third parties - intermediaries - can also play a role in the protection of identity. Currently several companies have established themselves as intermediaries protecting identity and privacy between the individual and other entities.
So, how can Web sites choose an authentication policy? There is no one right answer. In this section we look at two case studies to identify ways in which commercial Web sites might strike a balance in addressing the authentication problem. Often, the solutions chosen will depend on the Web site's relationship with the consumer, as well as the kind of data to which access is provided.
Perhaps the fewest difficulties arise where a subscriber establishes an account with a Web site. In many cases, the individual may be given access to information about his or her account upon providing only the information used to establish and secure the account. But relying on information such as name, address and phone number to authenticate the identity and authorization of an account holder is risky because the information is so widely available. In fact, many of the most common "shared secrets" (such as social security numbers or mother's maiden name) have been compromised by widespread use.
For this reason, it is common practice both offline and online to require some additional piece of information that is thought to be more difficult to compromise. Many businesses require individuals to use a shared secret (password) to access an account.
Even a password requirement, with all its inconveniences and costs, suffers from serious security flaws. Many consumers use the same password at multiple places, or leave themselves reminders on yellow stickies, or use obvious passwords (e.g., "password") - all of which compromise the integrity of the authentication system. Authenticating identity has become a far more complex endeavor than it once was.
Where an account requires a password, it would be appropriate to provide access to the data when the requester has the password and presents some verifiable information about recent account activity. Such an approach would provide a two-factor method of authentication while preserving the privacy offered by the initial account.
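The two-factor check described above might be sketched as follows. The account data, the choice of a recent order total as the second factor, and the hashing parameters are all hypothetical; the sketch shows only the shape of the check - a stored shared secret (kept as a salted hash, never in the clear) plus a piece of verifiable recent activity:

```python
import hashlib
import hmac
import secrets

# Hypothetical two-factor authentication check: password plus
# knowledge of recent account activity.
def hash_password(password, salt):
    # Store only a salted, iterated hash of the shared secret.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

salt = secrets.token_bytes(16)
account = {
    "password_hash": hash_password("correct horse", salt),
    "salt": salt,
    "last_order_total": "24.99",  # second factor: recent activity
}

def grant_access(password, claimed_order_total):
    pw_ok = hmac.compare_digest(
        hash_password(password, account["salt"]), account["password_hash"]
    )
    activity_ok = claimed_order_total == account["last_order_total"]
    return pw_ok and activity_ok
```

A requester who holds a stolen password alone, or who merely observed a transaction, fails the check; both factors must be presented together.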
The Committee discussed the feasibility of using authentication devices as a method for obtaining consumer access to personal data. Some Committee members expressed concern that "perfect" authentication tools may be prohibitively expensive or too cumbersome for widespread use. However, the Committee did hear from authentication vendors who said that a wide range of authentication solutions are available today that solve the password 'problem' described above. These solutions take the form of hardware tokens that are as easy to use as an ATM card or software tokens that can be downloaded easily to a PC, PDA or cell phone. Even so, the risks associated with misuse and misappropriation of such devices remain.
In such circumstances, how can a service authenticate that the individual is the person to whom the data relates? Can Web sites provide access in a fashion that reflects the potential adverse consequences of disclosing information to someone other than the subject of that information? Should the level of access authorized be lowered due to the complexities of connecting the user to the data? Are there other policies that would address the privacy interest and have a lower risk of unintentionally disclosing data to the wrong individual? Does this concern vary from Web site to Web site?
Again, there is no single answer to these questions, as our case study shows.
The Advisory Committee examined how to ensure the security of personal data held by commercial Websites. This section first describes competing considerations in computer security. After then looking at some possibilities for regulating computer security in online systems, it discusses the importance of notice and education as supplements to standards for protecting personal data. It presents competing options for setting Website security standards and recommends a specific solution to protect the security of personal data.
Most consumers - and most companies - would expect companies that collect and hold personal data to provide security for that data. But security, particularly computer security, is difficult to define, especially in a regulatory or quasi-regulatory context. Identifying the most effective and efficient solution for data security is a difficult task. Security is application-specific and process-specific. Different types of data warrant different levels of protection.
Security - and the resulting protection for personal data - can be set at almost any level depending on the costs one is willing to incur, not only in dollars but in inconvenience for users and administrators of the system. Security is contextual: to achieve appropriate security, security professionals typically vary the level of protection based on the value of the information on the systems, the cost of particular security measures, and the costs of a security failure in terms of both liability and public confidence.
To complicate matters, both computer systems and methods of violating computer security are evolving at a rapid clip, with the result that computer security is more a process than a state. Security that was adequate yesterday is inadequate today. Anyone who sets detailed computer security standards - whether for a company, an industry, or a government body - must be prepared to revisit and revise those standards on a constant basis.
When companies address this problem, they should develop a program that is a continuous life cycle designed to meet the needs of the particular organization or industry. The cycle should include an assessment of risk; the establishment and implementation of a security architecture, with management of policies and procedures based on the identified risk; training programs; regular audit and continuous monitoring; and periodic reassessment of risk. These essential elements can be designed to meet the unique requirements of organizations regardless of size.
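The continuous life cycle just described can be rendered as a simple sketch. The phase names follow the report's description; the representation as an ordered, wrapping list is purely illustrative:

```python
# The security life cycle as an ordered sequence of phases that
# repeats indefinitely. Phase names follow the report's description;
# the data structure itself is hypothetical.
LIFECYCLE = [
    "assess risk",
    "establish security architecture and policies",
    "implement controls based on identified risk",
    "train staff",
    "audit and monitor",
    "reassess risk",
]

def next_phase(current):
    # Wrap from the last phase back to the first, so the cycle
    # never terminates - security is a process, not a state.
    i = LIFECYCLE.index(current)
    return LIFECYCLE[(i + 1) % len(LIFECYCLE)]
```

The wrap-around step is the essential point: periodic reassessment of risk feeds directly back into a fresh assessment, rather than ending the program.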
In our advice to the FTC, we attempt to reflect this understanding of security. Our work, and this report, reflects the various types of on-line commercial sites, and the fact that they have different security needs, different resources, and different relationships with consumers. The report reflects this understanding and seeks to identify the range of different possibilities for balancing the sometimes-competing considerations of security, cost, and privacy.
Before turning to the options it is worthwhile to comment on several issues that the Committee considered but did not incorporate directly into its list of options.
First, we considered whether guidelines or regulations on security should contain some specific provision easing their application for smaller, start-up companies or newcomers to the online environment, but we ultimately determined that new entrants should not receive special treatment when it comes to security standards. In part, this is because organizations that collect personal data have an obligation to protect that data regardless of their size. In part, this is because we concluded that any risk assessment conducted to evaluate security needs should take into account the size of the company (or, more appropriately, the size of a company's potential exposure to security breaches). In many cases (but not all), a smaller Website or less well-established company will have fewer customers, less data to secure, and less need for heavy security. A smaller site may also have an easier time monitoring its exposure manually and informally. And of course, even a small site may obtain security services by careful outsourcing.
Second, we noted that several of the proposed options depend on or would be greatly advanced by inter-industry cooperation and consultation on appropriate and feasible security standards. Often, there are significant barriers to sharing information about adverse events, including fears of antitrust actions and liability exposure. In the past, the government's willingness to provide clarity on antitrust rules to allow useful cooperation among firms has been helpful, and similar guidance that will encourage industry members to cooperate in the development or enforcement of security standards and procedures without fear of antitrust liability will be helpful here.
Third, it is vital to keep in mind that companies need to protect against internal as well as external threats when considering solutions designed to secure customers' personal data. Many companies have already implemented information security policies that protect sensitive corporate data (e.g., compensation information) by limiting access to only those employees with a "need to know." Companies need to implement similar measures that protect customer data from unauthorized access, modification or theft. At the same time, mandated internal security measures can pose difficult issues. For example, it is not easy to define "unauthorized" employee access; not every company has or needs rules about which employees have authority over computer or other data systems. And many companies that have such rules amend them simply by changing their practices rather than rewriting the "rule book." Even more troubling is the possibility that internal security requirements that are driven by a fear of liability could easily become draconian - including background checks, drug testing, even polygraphs. We should not without serious consideration encourage measures that improve the privacy of consumers by reducing the privacy of employees.
Fourth, we are concerned about the risks of regulation based on a broad definition of "integrity." Some concepts of security - and some legal definitions - call for network owners to preserve the "integrity" of data. Data is typically defined as having integrity if it has not been "corrupted either maliciously or accidentally" [Computer Security Basics (O'Reilly & Associates, Inc., 1991)] or has not been "subject to unauthorized or unexpected changes" [Issue Update on Information Security and Privacy in Network Environments (Office of Technology Assessment, 1995, US GPO)]. These definitions, issued in the context of computer security rather than legal enforcement, pose problems when translated into a legal mandate. If integrity were read narrowly, as a legal matter, it would focus on whether a Website has some form of protection against malicious corruption of its data by external or internal sources. If the definition is read broadly, it could lead to liability for data entry errors or other accidental distortions to the private personal information it maintains.
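One narrow, technical reading of "integrity" can be sketched briefly: record a checksum of the stored data and later verify that the data still matches it. The record contents below are hypothetical; note that the check flags any change, malicious or accidental, and cannot distinguish the two - which is precisely the definitional difficulty noted above:

```python
import hashlib

# Hypothetical integrity check: detect whether a stored record has
# changed since its checksum was recorded. Flags all changes alike,
# whether malicious corruption or an innocent data-entry fix.
def checksum(record: bytes) -> str:
    return hashlib.sha256(record).hexdigest()

original = b"name=Jane Doe;balance=1200"
stored_digest = checksum(original)

def integrity_intact(current: bytes) -> bool:
    return checksum(current) == stored_digest
```

A legal mandate built on such a mechanism would therefore need to say whether an authorized correction that changes the checksum counts as a loss of "integrity."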
Authentication and authorization controls for access to information are integral parts of system security. To establish appropriate authentication and authorization, businesses must consider the value of the information on their systems to both themselves and the individuals to whom it relates, the cost of particular security measures, the risk of inside abuse and outside intrusion, and the cost of a security failure in terms of both liability and public confidence. This discussion of security pertains both to information in transit and information in storage.
After considerable discussion, the Advisory Committee has developed a wide range of possible options for setting standards for protecting personal data held by commercial Websites. Before presenting these options, we address two policy approaches that the group considered but found unsatisfactory on their own: development of programs to educate consumers on security issues, and a requirement that companies post notices describing their security measures. While insufficient standing alone, the Advisory Committee concluded that both approaches should be examined as possible supplements to some of the security options presented below.
Notice is viewed as an appropriate tool for informing individuals about the information practices of businesses. It is critical to the consumer's ability to make informed choices in the marketplace about a company's data practices. In the area of security, as in the area of privacy, there is not necessarily a meaningful correlation between the presence or absence of a security notice statement and the true quality of a Website's actual security. A security notice could be more useful if it allows consumers to compare security among sites in an understandable way. Since it is difficult to convey any useful information in a short statement dealing with a subject as complex as the nuts and bolts of security, most such notices would be confusing and convey little to the average consumer. Further, providing too many technical details about security in a security notice could serve as an invitation to hackers. As was discussed at some length by the Advisory Committee, these considerations also mean that it is not possible to judge the adequacy of security at Websites by performing a "sweep" that focuses on the presence or absence of notices.
Notice is important in triggering one of the few enforcement mechanisms available under existing law. If a posted notice states a policy at variance with the organization's practices, the FTC may exercise its enforcement powers by finding the organization liable for deceptive trade practices. But security notices are ineffective standing alone. At the same time, we believe that they could be useful in conjunction with one of the other options discussed in Section D. The form such notice should take will vary depending upon the option selected.
In addition to notice, consumer education campaigns are also useful to alert consumers about security issues, including how to assess the security of a commercial site and the role of the consumer in assuring good security. Regardless of what security solutions the FTC decides to recommend, it would be extremely valuable for the FTC, industry associations, state attorneys general, and others to sponsor consumer education campaigns aimed at informing Internet users about what to look for in evaluating a company's security. In addition, no system is secure against the negligence of users, so consumers must be educated to take steps on their own to protect the security of their personal data.
The Advisory Committee has identified two sets of options for those seeking to set security standards. In essence, these options address two questions: How should security standards be defined? And how should they be enforced?
The question of how security standards should be defined requires consideration of the parties responsible for the definition as well as issues of the scope and degree of flexibility and changeability of the standards. The entities that could be responsible for setting security standards include government agencies, courts, and standards bodies. Furthermore, it could be left up to Websites themselves to develop security programs (perhaps with a requirement that each site develop some security program), or it could be left to market forces and existing remedies to pressure Websites into addressing security at an appropriate level.
In this section, we set forth five options for setting security standards that fall along a continuum from most regulatory to most laissez faire. Each of the proposals reconciles the three goals of adequate security, appropriate cost, and heightened protections for privacy in a different manner. For each option, we have presented the arguments deemed most persuasive by proponents and opponents of the option.
Before requiring any particular security steps, wait to see whether existing negligence law, state attorneys general, and the pressure of the market induce Websites that collect personal information to generate their own security standards. It is worth noting that the insurance industry has started to insure risks associated with Internet security. The emergence of network security insurance may force companies to address security issues seriously, as the presence or absence of adequate security will be taken into account in the underwriting process used to determine premium rates.
Proponents would argue:
Opponents would argue:
Require all commercial Websites that collect personal information to develop and maintain (but not necessarily post) a security program for protecting customers' personal data. This option could take one of two forms:
The contents and methodology of the security program could be specified, and businesses could be required to post a brief notice indicating their compliance.
The requirement could be limited to a simple mandate that the Website adopt a security strategy without specifying the details or requiring that it be posted.
Proponents would argue:
In support of option 4 a., security professionals believe that any effective program, even if managed by only one person part time, should involve the elements of risk assessment, implementation of controls based on the risks, testing and monitoring of controls, and periodic re-assessment of risks.
Also in support of option 4 a., a statement that the company maintains a security program that assesses risks and implements appropriate controls to address the risks need not be incomprehensible to consumers or too burdensome for businesses to comply with, and it assures consumers and businesses that security has been considered in the system design.
Opponents would argue:
All businesses operating online that collect personal information could be required to adhere to security standards adopted by a particular industry or class of systems. There are three quite different options for how the standards are developed:
A government-authorized third party could develop standards through a process that encourages public participation (notice and comment) and may include governmental review.
The standards could be established by any third party, but the FTC could require that the standards address specific topics (e.g., access, data integrity, notice, authentication, authorization, etc.).
The standards could be developed by any third party as long as the identity of the standard-setting organization is revealed to consumers (this is in effect a security "seal" program).
Proponents would argue:
The three options presented under this heading are quite different, and c) is significantly better than the others. It associates a security standard with a "brand name" so that consumers can decide whether security at the site is sufficient. Option b) simply adds a requirement that the standards address certain issues. In most cases this will be unnecessary and in other cases insufficient. Option a) requires that the government license standard-setting organizations; it also requires notice and comment and perhaps government review for such standards. This option is nearly indistinguishable from requiring government-written standards and will require that the FTC or some other body make hundreds if not thousands of individualized decisions about what security practices should be required in which industries, decisions that will have to be remade every three months as security standards and challenges evolve.
Opponents would argue:
Opponents will find that options a-c do not address their general concerns with industry-generated standards. However, opponents may find that proposal "a" partially responds to criticisms 1 and 2 because it constructs a process for soliciting public and policy maker input and review and to a limited extent addresses concerns about industry capture and stakeholder participation. However, because it does not permit other stakeholders to participate in the formulation of the standards, it is unlikely to fully ameliorate these concerns. In addition, since the item to be protected, personal information, is likely to be considered less valuable by the business than by the individuals to whom it relates, the concern about lack of representation is heightened. Opponents may find that proposal "b" (while weaker than "a") provides some restraint on the standard-setting process by allowing outside interests to decide what issues must be addressed. Option "c" will garner the greatest opposition from opponents as it fails to address any of the concerns outlined above.
Require all commercial Websites holding personal information to adopt security procedures (including managerial procedures) that are "appropriate under the circumstances." "Appropriateness" would be defined through reliance on a case-by-case adjudication to provide context-specific determinations. As the state of the art evolves and changes, so will the appropriate standard of care. An administrative law judge of the FTC or another agency or a court of competent jurisdiction could adjudicate the initial challenge.
Proponents would argue:
This approach is designed to encourage increasingly strong security practices. If a bright line rule is adopted, there is little doubt that the pace of technical change will leave the adequacy of regulation in the dust, and what was intended to be a regulatory floor will become a ceiling in practice. Rising tides do raise all boats, except those that are anchored to the bottom.
Opponents would argue:
Require commercial Websites that collect personal information to adhere to a sliding scale of security standards and managerial procedures in protecting individuals' personal data. This scale could specify the categories of personal data that must be protected at particular levels of security and could specify security based upon the known risks of various information systems. In the alternative or as part of the standard, there could be minimum-security standards for particular types of data. The sliding scale could be developed by a government agency or a private sector entity and could incorporate a process for receiving input from the affected businesses, the public, and other interested parties.
Proponents would argue:
Opponents would argue:
Even if it could be prepared properly the first time, a sliding scale would have to be updated almost constantly, a task for which bureaucracies are ill suited.
The great majority of the Committee believes that the best protection for the security of personal data would be achieved by combining elements from Options 2 and 4. (Of course, existing remedies would not be supplanted by this solution.) We therefore recommend a solution that includes the following principles:
Significant new technologies are appearing that will have privacy implications. For example, a unique identifier may or may not link to a specific individual; with new wireless technology, however, it may be linked to an individual through a phone number, location information, or another identifier.
In general, the Committee agrees there are significant barriers to inter-industry sharing of information about adverse events, including fears of antitrust actions and liability exposure. During the discussion of the security section of the report, the issue of information sharing between the government and the private sector was raised. There was disagreement about the inclusion of this issue in the report, centering mostly on whether the Freedom of Information Act (FOIA) should be clarified to encourage businesses to share information with governmental entities. The Committee took no position on this issue.
The Advisory Committee was asked to provide its views on access and security in the context of the Fair Information Practice principles and industry self-regulation. We did not examine legislative or enforcement options in any detail, but it was difficult to address some of the access and security issues without giving some thought to the question of enforcement. As part of the security discussion, in particular, we assembled a range of representative options for enforcement of security principles. Some of these options are consistent with self-regulation, and others would require government intervention. We record them here, not for the purpose of recommending any particular course of action, but to show the range of possibilities open to industry and government.
Many of the options include the publication of the Web site's security procedures or its adherence to particular standards. Such postings are subject to traditional FTC and state enforcement if the statements are false. It is also, of course, possible for consumers to bring their own actions for fraud, false statements, or underlying negligence in the handling of the data.
Rely on independent auditors to ensure compliance with standards. This structure could require security standards to be verified by an external body and could require public disclosure of the findings. This option would provide more flexibility and could adjust more quickly to the changing threat environment. It would, however, introduce additional cost and overhead that may not be justified by all industries and for all levels of risk exposure. It would, on the other hand, introduce a neutral, objective assessment of a company's security infrastructure relative to its industry.
Congress could establish a private right of action enabling consumers to recoup damages (actual, statutory, or liquidated) when a company fails to abide by the security standard established through one of the options set out above.
The FTC or another agency could enforce compliance with standards using its current enforcement power or using newly expanded authority. The enforcement could establish civil or criminal fines, or both, as well as other equitable remedies. (This option is, in some respects, modeled after the regulations governing the financial services industry as enforced by the Federal Financial Institutions Examination Council (FFIEC). The FTC could establish a similar enforcement regime for other industries.)
The Committee found it useful to generate a table of terms in order to further our discussions. These terms have proven useful, but should not be taken as definitions generally accepted by the Committee or the online industry.
1. Businesses should provide a process by which consumers can challenge the correctness of certified information and request changes to the information. Businesses should not be obligated to change information that is correct per the business's own certification, but should provide a process by which disagreements concerning the correctness of the information can be arbitrated.
2. Businesses should provide a process by which consumers can challenge the correctness or appropriateness of information from other sources and request deletion of the information. Businesses should not be obligated to delete third-party-sourced or self-sourced information that the business believes is correct and appropriate to retain, but should provide a process by which disagreements concerning the accuracy and appropriateness of the information can be arbitrated.
3. As general background on the issues raised in this document, the subcommittee recommends study of the Department of Commerce's European Union Directive on Data Protection FAQ #8. The current version of this FAQ can be found at http://www.ita.doc.gov/td/ecom/RedlinedFAQ8Access300.htm
4. See The Privacy Rights Clearinghouse http://www.privacyrights.org/AR/id_theft.htm for more information.