
One Step Forward, Five Steps Back: An Analysis of the Draft Privacy Legislation

May 5, 2010 | Reports

Draft legislation on consumer privacy offers many opportunities for improvement.


On May 4, 2010, U.S. Representatives Rick Boucher (D-VA) and Cliff Stearns (R-FL) released a discussion draft of legislation governing data privacy.[1] The legislation would create specific data usage and handling requirements for nongovernmental organizations that collect, use or disclose consumer data. Organizations not following these requirements would be subject to penalties from enforcement actions brought forth by the Federal Trade Commission (FTC) or by state attorneys general and state consumer protection agencies. The legislation does not create a private right of action.

As consumer data increasingly is collected and stored electronically, it is important for Congress to consider the effect this has on privacy. The discussion draft provides a welcome opportunity to explore the best ways of protecting individual privacy while avoiding constraints on business innovation and unintended negative impacts on consumers. However, much of the concern over data privacy is speculative, and consumers have experienced few, if any, harms under the current privacy laws. Before Congress enacts new laws, it should first demonstrate that better enforcement of existing privacy regulations is insufficient to protect consumers. Enactment of this legislation as drafted would add yet another layer of complexity to the existing patchwork of federal laws regulating consumer privacy, including the Gramm-Leach-Bliley Act, the Fair Credit Reporting Act (FCRA), the Health Insurance Portability and Accountability Act (HIPAA) and the Fair Debt Collection Practices Act (FDCPA). Moreover, it represents yet another push for more government control over the private sector in the name of protecting consumers. Too often such legislation ends up imposing new costs on consumers and limiting innovation and the development of new online services.

This is not to say that a federal framework for consumer privacy would not be useful. However, policymakers should recognize that consumer privacy should not come at the expense of beneficial uses of individual data. For example, some organizations, such as LegiStorm, which provides salary information on Congressional staffers, and OpenSecrets.org, which tracks money in politics, use personal data to provide online tools to foster transparency and public accountability. Other organizations use consumer data for other beneficial purposes, such as providing a service or delivering targeted advertising.

In its current form, the draft legislation presents many problems, including 1) raising costs for consumers while creating few benefits; 2) establishing affirmative consent (“opt in”) requirements for the collection, use and disclosure of certain types of information; 3) creating certain restrictions on behaviorally targeted advertising; 4) granting the FTC the authority to establish a security standard to protect consumer information; and 5) failing to update privacy laws regarding government use of digital data. However, one positive element of this draft legislation is that it includes a preemption clause so that the proposed federal law would supersede any state regulations. To be effective, a federal framework for consumer data privacy should establish a single, nationwide standard for consumer privacy, thereby reducing regulatory complexity for the private sector.

Bad: Raises costs for consumers while creating few benefits

The draft legislation includes certain provisions that create unnecessary costs for the private sector, costs which will ultimately be borne by consumers.

First, in an effort to apply the rules not just to the Internet, the draft legislation mandates that, in certain instances, organizations provide offline notification of their privacy policy to consumers. The legislation states, “If the covered entity collects covered information by any means that does not utilize the Internet, the privacy notice required by this section shall be made available to an individual in writing before the covered entity collects any covered information from that individual.” The potential impact of this one sentence could be substantial as it would likely require many organizations to provide paper-based copies of their privacy policy to individuals. For example, this requirement would appear to limit the ability of organizations to collect registration forms or surveys from in-person events such as conferences or sporting events without first providing copies of the organization’s privacy policy. Not only would this requirement cost a significant sum of money to implement nationwide, it would also be a waste of paper.

Second, the draft legislation mandates that all organizations covered under the legislation (i.e. virtually all online businesses) must have a privacy notice on their website conforming to specific requirements. The legislation defines 15 specific items that each privacy policy must contain including, for example, “a hyperlink to or a listing of the [FTC’s] online consumer complaint form.” While privacy policies have been an industry best practice for many years, this legislation would impose a cost on organizations large and small as they would need to undertake a review of their privacy policy to ensure it conforms to this legislation.

Bad: Establishes affirmative consent (“opt in”) requirements for the collection, use and disclosure of certain types of information

Currently, organizations operate under a notice and choice regime, whereby consumers can review the privacy policies, if any, offered by an organization, and then decide whether to use the services offered by that organization. For example, if a new mobile application or online service does not provide a privacy notice on its website, consumers can decide that this does not meet their standards and not use the application or service. While many privacy advocates would like to see a more granular system in which consumers could opt out of specific types of data collection and use, the current privacy regime is effectively an opt-out system since consumers can decide whether or not to use a service based on the data usage and handling practices of an organization.

The draft legislation ends the current regime by establishing affirmative consent (“opt in”) requirements for certain situations including: collecting sensitive information and location-based information; sharing information with third parties; and modifying an organization’s privacy policy. Others have shown how “opt-in is a rhetorical straw-man that cannot really be implemented by regulatory policies without creating a number of unintended side effects, many of which are suboptimal for individual privacy.”[2] In addition, opt-in requirements create an administrative burden on organizations, as each must ensure that every user takes a proactive step before it can offer that customer a specific service. Policymakers should endeavor to understand the costs of opt-in before enacting this requirement.

For example, the draft legislation unnecessarily restricts the collection of certain types of information related to an individual’s location or deemed “sensitive.” In addition, the restriction on sharing information with third parties would limit the ability of organizations to integrate their services with other providers. For example, organizations would find it more difficult to partner with outside entities to create a combined service. Mash-ups—remixing data across multiple external service providers—are one of the hallmarks of Web 2.0. Without affirmative consent, an organization might not be permitted to use services provided by another entity that require consumer information, such as an online mapping service. Similarly, the requirement that covered entities obtain affirmative consent from users before making any material changes in their privacy policies would restrict the ability of service providers to rapidly develop and deploy new services, such as the changes recently introduced by Facebook.[3] These types of restrictions would effectively create speed bumps to innovation.

Finally, by requiring organizations to obtain affirmative consent for every material change in their privacy policy, this legislation would create an incentive for organizations to establish unrestrictive privacy policies so that future development would not be needlessly constrained by their own policies.

Bad: Creates certain restrictions on behaviorally targeted advertising

While there is support in Congress for behaviorally targeted advertising, the draft legislation includes provisions that would restrict this beneficial type of online advertising, which provides consumers with more relevant ads.

First, the restriction on the collection and disclosure of certain types of information categorized as “sensitive” means there is an entire class of targeted advertising that cannot be used. The draft legislation defines sensitive information as data that relates to an individual’s medical information, race or ethnicity, religious beliefs, sexual orientation, finances, and precise physical location. Collection of this information would require organizations to first obtain affirmative consent. In particular, the restrictions on using data related to medical information, sexual orientation, race or ethnicity, and religious beliefs without affirmative consent would rule out many potentially beneficial forms of advertising. For example, these restrictions could prevent marketers from effectively creating targeted ad campaigns for services like online Christian bookstores, Brazilian music stores, or gay dating websites.

Second, the draft legislation requires organizations to follow specific guidelines regarding the use of data in profiles used to provide services such as targeted advertising. For example, the legislation requires organizations to provide “a readily accessible opt-out mechanism whereby, the opt-out choice of the individual is preserved and protected from incidental or accidental deletion.” This requirement goes against current industry practice. The Network Advertising Initiative (NAI), the online advertising industry organization that has been developing most of the standards for third-party ad networks, currently uses cookies (small data files stored on a user’s computer) to allow consumers to opt out of participating online advertising networks’ behavioral advertising programs.[4] However, cookies do not meet the technology requirement as stated in the legislation since they can be accidentally deleted by a user, and so these third-party advertising networks would not be in compliance with the new legislation.
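The fragility of cookie-based opt-outs can be illustrated with a minimal sketch (all names are hypothetical; this is not the NAI’s actual implementation). Because the ad network records the opt-out preference only in a cookie, clearing the browser’s cookies silently erases the choice:

```python
# Sketch of a cookie-based opt-out, as used by third-party ad networks
# circa 2010. Class and cookie names are illustrative assumptions.

class Browser:
    """Simulates a user's browser cookie jar."""
    def __init__(self):
        self.cookies = {}

    def clear_cookies(self):
        # A routine privacy cleanup -- or an accident.
        self.cookies.clear()

def opt_out(browser):
    # The ad network records the preference *only* in a cookie.
    browser.cookies["ad_network_optout"] = "OPT_OUT"

def is_opted_out(browser):
    return browser.cookies.get("ad_network_optout") == "OPT_OUT"

browser = Browser()
opt_out(browser)
assert is_opted_out(browser)       # the preference is honored...

browser.clear_cookies()
assert not is_opted_out(browser)   # ...until the cookie is deleted
```

This is precisely the failure mode the draft legislation targets: with a cookie-only mechanism, the opt-out choice is not “preserved and protected from incidental or accidental deletion.”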

The legislation also requires websites to place a “symbol or seal” near every targeted ad that links to information about their advertising partner and information about any data associated with that user profile. Requiring targeted ads to have a special mark identifying them as such would unfairly disadvantage targeted ads against non-targeted ads. Given that targeted ads generate more than two times the revenue of non-targeted ads, this would have a negative impact on revenues for online publishers and service providers and would harm the Internet ecosystem, particularly the so-called “long tail” of small websites supported by ad revenues.[5] In addition, policymakers concerned with the decline of print media should note that greater revenue from targeted online advertising will likely be necessary for journalism to survive in the Internet age.

The legislation also states that users should be able to “review and modify” any preference profile created by an online ad network or other service provider. This requirement would force websites to build front-end systems to allow consumers to interact with data saved in their profile. Currently, if consumers want to “opt out” of targeted advertising, they avoid websites that use this form of advertising or use various technical controls, such as web browser plug-ins, that block ads. This requirement would impose an unnecessary cost on service providers (and ultimately consumers) and would generate little to no real benefit to consumers. Users who choose to opt out of targeted advertising but still access a website’s content or services are free riders, getting all of the benefits of a free service without bearing any of the costs. It does not make sense to require service providers to build a system that makes it easier for users to free ride by opting out of targeted advertising. Unfortunately, this type of requirement reflects the prevailing message of privacy fundamentalists that privacy trumps all other values. However, policymakers should recognize that privacy, as with any other value, must be balanced against other competing interests and can, as it would here, come at a real financial cost.

Bad: Grants the FTC the authority to establish a security standard to protect consumer information

The draft legislation grants the FTC authority to “establish, implement, and maintain appropriate administrative, technical, and physical safeguards” that it deems necessary. Such broad authority over every nongovernmental organization maintaining consumer information effectively gives the FTC far-reaching control over the information security practices of the private sector. Using this authority, the FTC could effectively set the standard for the security practices of private sector systems and networks. While the federal government does have a role in fostering good information security practices, the private sector is in a better position than the federal government to manage risk for its own systems and networks.

Bad: Fails to update privacy laws regarding government use of digital data

While the draft legislation attempts to enhance privacy for consumers, no mention is made of government use of consumer data. The legislation exempts government agencies from maintaining the same privacy standards that it would require of the private sector. Improper use of consumer data by government is arguably the greater threat, and one that discourages more widespread use of technologies like cloud computing. As ITIF and others have argued previously, Congress should act to reform laws such as the Electronic Communications Privacy Act (ECPA) to ensure that citizens have a right to privacy for their electronic data whether it is stored at home on a PC or remotely in the cloud.[6]

Good: Includes a preemption clause that supersedes state requirements

While legislators should look carefully at the concerns outlined here, the discussion draft does have one positive element: it includes a preemption clause. As the legislation states, “This Act supersedes any provision of a statute, regulation, or rule of a State or political subdivision of a State that includes requirements for the collection, use, or disclosure of covered information.” If Congress does move forward with privacy legislation, it should ensure that any new regulations preempt state laws; otherwise, online service providers will find themselves facing competing, and possibly contradictory, data use and handling requirements for consumers.

[1] “Staff discussion draft,” 111th Congress, 1st Session, May 3, 2010, http://www.boucher.house.gov/images/stories/Privacy_Draft_5-10.pdf.

[2] Nicklas Lundblad and Betsy Masiello, “Opt-in Dystopias,” SCRIPTed 7, no. 155 (2010), http://www.law.ed.ac.uk/ahrc/script-ed/vol7-1/lundblad.asp.

[3] Daniel Castro, The Right to Privacy is Not a Right to Facebook (Washington, DC: The Information Technology & Innovation Foundation, April 30, 2010), http://www.itif.org/publications/facebook-not-right.

[4] “Opt Out of Behavioral Advertising,” Network Advertising Initiative (2010), http://www.networkadvertising.org/managing/opt_out.asp.

[5] “Study finds behaviorally-targeted ads more than twice as valuable, twice as effective as non-targeted online ads,” Network Advertising Initiative, press release, March 24, 2010, http://www.networkadvertising.org/pdfs/NAI_Beales_Release.pdf.

[6] “ITIF Calls for Updates to Privacy Laws,” Information Technology and Innovation Foundation, press release, March 30, 2010, http://itif.org/pressrelease/itif-calls-updates-privacy-laws.