What is Privacy
The discussion of privacy can become muddled because different participants have different conceptions of what Privacy is. [Solove 2005] The conception of Privacy in the United States is very different from the conception of Privacy in the EU. Different "privacy" laws seek to protect against very different harms. Privacy can be viewed as
- Limitation on Government Power / Search and Seizure
- Torts [Restatement of the Law, Second, Torts, Sec. 652B, The American Law Institute (1977)]
- Intrusion on Seclusion
- Right to be let alone: A trespass notion where an individual "does not want their privacy interrupted." An example of this might be telemarketer calls that interrupt dinner time. See Do Not Call
- Right to be let alone / Intrusion into solitude (Warren & Brandeis 1890)
- False Light
- Private Lives:
- This includes the notion that there are aspects of our lives that are private and should not warrant public exposure.
- Examples of this might be the posting of pictures of children at an elementary school to a public website, without the permission of the children's parents. A norm suggests that this public exposure is inappropriate, and many photo hosting sites will remove such photos when an objection is lodged. Another example might be the President's children; there is a norm in journalism that the lives of the President's children are private, and the press should not cover what sports teams the President's children play on or how they are doing at school.
- Disclosure of intimate facts
- Audience (Context) management (what information gets shared with whom, when and where)
- Social networks = an invisible audience
- Information Management / Control
- Right to control information about oneself
- Collection of Information: The collection of personally identifiable information (PII) by a third party
- In the United States there is a cultural norm that the collection of information should not be objectionable unless one has something to hide; objecting to such collections is tantamount to self-incrimination. [see Solove 2007] In Europe, which has experienced fascist governments, the collection of unnecessary information raises the question "why do you need to know?" Europeans view personal information as something which has been used against them as a tool of oppression; those who unnecessarily collect it are met with suspicion.
- The ability to determine when, how, and to whom information about an individual is disclosed to others. [Westin]
- The collection of PII involves several situations:
- Collection of PII from an individual by a firm, and how that firm uses (or abuses) that information
- Permissive Collection
- The collection of PII from children under age 13
- Compelled disclosure of information
- Non Permissive Collection
- Theft of PII (see identity theft)
- From an individual or
- from a firm that has previously collected the PII
- Data breach response / notification of the firm when a theft has occurred
- Use of Information
- How will the information be used
- Will it be shared with third parties
- Can the individual review and revise the information collected
- Data Security
- Data Storage
- Data Breach
- Federal Information Security Management Act
What is Personally Identifiable Information (PII)
Privacy policies generally address the collection of PII. But what is PII? What information identifies an individual, and what information provides no personal information? According to NIST and GAO [NIST PII 2010 p 7 & Sec. 2.2 (this definition is the GAO expression of an amalgam of the definitions of PII from OMB Memorandums 07-16 and 06-19; see GAO Report 08-536, Privacy: Alternatives Exist for Enhancing Protection of Personally Identifiable Information, May 2008; OMB Memorandum 07-16 defines PII as "information which can be used to distinguish or trace an individual's identity, such as their name, social security number, biometric records, etc. alone, or when combined with other personal or identifying information which is linked or linkable to a specific individual, such as date and place of birth, mother's maiden name, etc.")],
PII is "any information about an individual maintained by an agency, including (1) any information that can be used to distinguish or trace an individual's identity, such as name, social security number, date and place of birth, mother's maiden name, or biometric records; and (2) any other information that is linked or linkable to an individual, such as medical, educational, financial, and employment information." Examples of PII include, but are not limited to:
- Name, such as full name, maiden name, mother's maiden name, or alias
- Personal identification number, such as social security number (SSN), passport number, driver's license number, taxpayer identification number, patient identification number, and financial account or credit card number
- Address information, such as street address or email address
- Asset information, such as Internet Protocol (IP) or Media Access Control (MAC) address or other host-specific persistent static identifier that consistently links to a particular person or small, well-defined group of people
- IP Addresses
- COPPA 16 C.F.R. § 312.2 Personal information means individually identifiable information about an individual collected online, including:... (7) A persistent identifier that can be used to recognize a user over time and across different Web sites or online services. Such persistent identifier includes, but is not limited to, a customer number held in a cookie, an Internet Protocol (IP) address, a processor or device serial number, or unique device identifier;
- In re Nickelodeon Consumer Privacy Litigation, -- F.3d -- (3rd Cir. 2016) ("The Video Privacy Protection Act, passed by Congress in 1988, prohibits the disclosure of personally identifying information relating to viewers' consumption of video-related services. Interpreting the Act for the first time, we hold that the law permits plaintiffs to sue only a person who discloses such information, not a person who receives such information. We also hold that the Act's prohibition on the disclosure of personally identifiable information applies only to the kind of information that would readily permit an ordinary person to identify a specific individual's video-watching behavior. In our view, the kinds of disclosures at issue here, involving digital identifiers like IP addresses, fall outside the Act's protections.")
- Jane E. Brown, In re Nickelodeon Consumer Privacy Litigation: An IP Address Is Not Always Personally Identifiable Information, Beyond IP Law, July 29th, 2016
- Yershov v. Gannett Satellite Information Network, Inc., No. 15-1719, Slip at 6 (1st Cir. April 29, 2016) ("While there is certainly a point at which the linkage of information to identity becomes too uncertain, or too dependent on too much yet-to-be-done, or unforeseeable detective work, here the linkage, as plausibly alleged, is both firm and readily foreseeable to Gannett. The complaint therefore adequately alleges that Gannett disclosed information reasonably and foreseeably likely to reveal which USA Today videos Yershov has obtained. ")
- "The plaintiff there downloaded USA Today's free application onto his smartphone. He alleged that Gannett, which publishes USA Today, shared information about videos he watched on his phone with a third-party analytics company, Adobe Systems, Inc. The information did not include the plaintiff's name or address, but rather his cell phone identification number and his GPS coordinates at the time he viewed a particular video. 134 Rejecting the approach taken in Hulu, Yershov concluded that any unique identifier—including a person's smartphone ID— is personally identifiable information."
- Klimas v. Comcast Cable Comm’cns, Inc., 465 F.3d 271, 276 n.2 (6th Cir. 2006) (“The district court also noted as part of its standing analysis that dynamic IP addresses are not in and of themselves personally identifiable information because, "[u]nlike a subscriber's name, address, social security number, etc., a dynamic IP address is constantly changing." However, in basing the dismissal on the dynamic nature of the IP addresses, the court overlooked the fact that not all IP addresses are dynamic and that the complaint did not allege that such was the case. We pretermit discussion of this factor as irrelevant. We further note that IP addresses do not in and of themselves reveal "a subscriber's name, address, [or] social security number." That information can only be gleaned if a list of individual subscribers is matched up with a list of their individual IP addresses.”)
- Klimas v. Comcast Cable Communications, Inc., Case No. 02-CV-72054-DT, 2003 WL 23472182, *5 (E.D. Mich. July 1, 2003) (“[U]nless an IP address is correlated to some other information, such as Comcast’s log of IP addresses assigned to its subscribers (or a hotel registry in the analogy of hotel room numbers), it does not identify any single subscriber by itself. In other words, an IP address, by itself, is not ‘specific information about the subscriber.’ Therefore, Comcast’s collection of IP-URL linkages cannot constitute PII unless it is linked to the IP address/subscriber log.”);
- In re Vizio, Inc., Consumer Privacy Litigation, 238 F. Supp. 3d 1204 - Dist. Court, CD California 2017 (holding whether an IP number is PII is a finding of fact, denying motion to dismiss, stating "Plaintiffs have thus plausibly alleged that Vizio's provision of - to quote its own prospectus - "highly specific viewing behavior data on a massive scale with great accuracy" amounts to the disclosure of personally identifiable information")
- In re Hulu Privacy Litigation No. 11-cv-3764 (LB), 2014 WL 1724344 (N.D. Cal. Apr. 28, 2014) (static digital identifiers that could, in theory, be combined with other information to identify a person do not count as “personally identifiable information” under the Video Privacy Protection Act, at least by themselves.). Other cases in accord: Robinson v. Disney Online, --- F. Supp. 3d ---, 2015 WL 6161284, at *6 (S.D.N.Y. 2015); Eichenberger v. ESPN, Inc., No. 14-cv-463 (TSZ), 2015 WL 7252985, at *4–5 (W.D. Wash. May 7, 2015); Ellis v. Cartoon Network, Inc., No. 14- cv-484 (TWT), 2014 WL 5023535, at *3 (N.D. Ga. Oct. 8, 2014), aff'd on other grounds, 803 F.3d 1251 (11th Cir. 2015).
- Johnson v. Microsoft Corp. U.S. District Court for the Western District of Washington 2009 ("In order for “personally identifiable information” to be personally identifiable, it must identify a person. But an IP address identifies a computer, and can do that only after matching the IP address to a list of a particular Internet service provider’s subscribers. Thus, because an IP address is not personally identifiable, Microsoft did not breach the EULA when it collected IP addresses")
- Alma Whitten, “Are IP Addresses Personal?” Google Public Policy Blog, 22 February 2008
- Office of the Privacy Commissioner of Canada, “Legal Information Related to PIPEDA,” last modified 2 October 2013
- Trend 4 - The Growing Ambiguity of Personal Information, Transparent Lives Surveillance in Canada ("Although an IP address is rarely going to be directly related to one identifiable individual, it is how the IP address is combined with other information (or could reasonably be combined with other information) about tastes, behaviours, and interests that has privacy advocates concerned.")
- FTC’s 2012 Report on Protecting Consumer Privacy in an Era of Rapid Change
- Erika McCallister, Tim Grance, Karen Scarfone, Guide to Protecting Confidentiality of Personally Identifiable Information (PII), Sec. 3.2.2, NIST Publication 800-122 (April 2010) (citing as examples of linked PII, first, an intranet log that records users' IP addresses, where the "organization has a closely-related system with a log that contains domain login information records, which include user IDs and corresponding IP addresses. Administrators who have access to both systems and their logs could correlate information between the logs and identify individuals. Potentially, information could be stored about the actions of most of the organization's users involving web access to intranet resources." Second, a fraud, waste and abuse website that logs IP addresses, where "the log information is not linked or readily linkable with the database or other sources to identify specific individuals.")
- FTC 2009 Self-Regulatory Principles for Online Behavioral Advertising ("In many cases, the information collected is not personally identifiable in the traditional sense – that is, the information does not include the consumer’s name, physical address, or similar identifier that could be used to identify the consumer in the offline world. Instead, businesses generally use “cookies” to track consumers’ activities and associate those activities with a particular computer or device. . . . [H]owever, it may be possible to link or merge the collected information with personally identifiable information – for example, name, address, and other information provided by a consumer when the consumer registers at a website.")
- Washington Court Rules that IP Addresses Are Not Personally Identifiable Information, Privacy and Information Security Law Blog, Hunton & Williams July 10, 2009
- IP Addresses May Be Subject to EU Data Protection Laws, White & Case (May 19, 2016) ("the AG found that dynamic IP addresses are personal data in the hands of a website operator if an internet service provider has further information, which, in combination with the dynamic IP address, could identify a user, since it was likely reasonable to use the information available at the internet service provider. ")
- Lindsey Tonsager, FTC's Jessica Rich Argues IP Addresses and Other Persistent Identifiers are "Personally Identifiable," Inside Privacy Covington April 29, 2016 ("a blanket characterization of browser and device identifiers as “personally identifiable” information for purposes of Section 5 of the FTC Act is in tension with certain privacy statutes as interpreted by the federal courts.")
- Jessica Rich, Keeping Up with Online Advertising Industry, FTC Blog (April 21, 2016) ("we regard data as “personally identifiable,” and thus warranting privacy protections, when it can be reasonably linked to a particular person, computer, or device. In many cases, persistent identifiers such as device identifiers, MAC addresses, static IP addresses, or cookies meet this test. For this reason, in the Commission’s 2013 amendments to the Children’s Online Privacy Protection Rule, it modified the definition of “personal information” to include “a persistent identifier that can be used to recognize a user over time and across different Web sites or online services [including but not limited to] a customer number held in a cookie, an Internet Protocol (IP) address, a processor or device serial number, or unique device identifier.”")
- Pruitt v. Comcast Cable Holdings, LLC, 100 Fed. Appx. 713, 716 (10th Cir. 2004) (“Without [additional information] one cannot connect the [information contained in the converter box] with a specific consumer”).
- Court Confirms that IP Addresses are Personal Data in some cases, White & Case (Oct. 31, 2016)
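The linkage step these cases turn on (Klimas, Johnson v. Microsoft) is easy to picture in code: a log of bare IP addresses identifies no one until it is joined against an ISP's subscriber-assignment records. A minimal sketch, using entirely hypothetical data:

```python
# Sketch of the correlation the courts describe: an IP-only log is not
# itself identifying; joining it to an ISP's subscriber log is what can
# turn it into personally identifiable information.
# All addresses and names below are hypothetical.

access_log = [
    {"ip": "203.0.113.7", "url": "/videos/42"},
    {"ip": "203.0.113.9", "url": "/videos/17"},
]

# The ISP's separate record of which subscriber held which address.
subscriber_log = {
    "203.0.113.7": "Jane Doe, 123 Main St.",
}

def link_to_subscribers(entries, assignments):
    """Return log entries annotated with a subscriber, where one is known."""
    linked = []
    for entry in entries:
        subscriber = assignments.get(entry["ip"])  # the correlation step
        linked.append({**entry, "subscriber": subscriber})
    return linked

for row in link_to_subscribers(access_log, subscriber_log):
    print(row)
```

Without the second log, the first entry is just an address and a URL; with it, the entry names a person, which is the distinction the Klimas and Johnson courts drew.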
- MAC Addresses
- Ann Cavoukian, PhD, Kim Cameron, WiFi Positioning Systems: Beware of Unintended Consequences, Information and Privacy Commissioner (June 2011) ("Since the MAC address was designed to be persistent and unique over the lifetime of a Wi-Fi device, in a WPS, it identifies Wi-Fi devices that are closely associated with individuals – not only stationary routers, but personal laptops and mobile phones. When a unique identifier may be linked to an individual, it often falls under the definition of “personal information” through that data linkage and carries with it a host of regulatory responsibilities. The associated privacy issues range from lack of knowledge or consent of the mobile device owner for the use of the unique identifier, the possibility of unauthorized disclosure to third parties, or potential uses for secondary purposes")
- Telephone numbers, including mobile, business, and personal numbers
- Personal characteristics, including photographic image (especially of face or other distinguishing characteristic), x-rays, fingerprints, or other biometric image or template data (e.g., retina scan, voice signature, facial geometry)
- Information identifying personally owned property, such as vehicle registration number or title number and related information
- Information about an individual that is linked or linkable to one of the above (e.g., date of birth, place of birth, race, religion, weight, activities, geographical indicators, employment information, medical information, education information, financial information).
Congress in COPPA specified what it considered PII to be, but also noted that the list was not exhaustive. The CPNI rules likewise identify what counts as PII. As technologies advance, new questions are raised about what should be added to PII. Some argue that IP numbers should be considered PII. [McIntyre 2011]
Protection / Confidentiality of PII
The escalation of security breaches involving personally identifiable information (PII) has contributed to the loss of millions of records over the past few years. Breaches involving PII are hazardous to both individuals and organizations. Individual harms may include identity theft, embarrassment, or blackmail. Organizational harms may include a loss of public trust, legal liability, or remediation costs. To appropriately protect the confidentiality of PII, organizations should use a risk-based approach; as McGeorge Bundy once stated, "If we guard our toothbrushes and diamonds with equal zeal, we will lose fewer toothbrushes and more diamonds." [NIST PII 2010]
The Federal Government has a number of documents concerning the handling of PII. [NIST PII 2010]
- Protection of PII, NARA (Aug. 6, 2009)
- GSA Rules of Behavior for Handling Personally Identifiable Information (PII) (Aug 7, 2009)
- Jordan, Scott, Aligning Legal Definitions of Personal Information with the Computer Science of Identifiability (July 26, 2021). http://dx.doi.org/10.2139/ssrn.3893833
- Corey Ciocchetti, Just Click Submit: The Collection, Dissemination, and Tagging of Personally Identifying Information, 10 VAND. J. ENT. & TECH. L. 553 (2008).
- IP Addresses and Personally Identifiable Information, CircleID 2/25/2008
- Are IP addresses personal?, Google 2/25/2008
- What Information is Protected?
- Privacy From Whom?
- Individual Goofing Off
- What is the risk where data is lost?
Technology that protects you from one threat vector may not protect you from another. An email service that offers encrypted email may protect you against surveillance by an authoritarian regime, but not as against the corporation, or as against the divorce lawyer.
A lack of trust in a network / service inhibits use of the network / service.
- Rafi Goldberg, NTIA, Lack of Trust in Internet Privacy and Security May Deter Economic and Other Online Activities (May 13, 2016)
- Paul Starr, The Creation of the Media (2005) (discussing how Ben Franklin recognized the necessity of individuals trusting the postal service for the growth of the country and the economy).
- FTC, Staff Report, Internet of Things: Privacy & Security in a Connected World, at 55 (Jan. 2015)
- Written statement of Kevin Werbach, Associate Professor of Legal Studies & Business Ethics, The Wharton School, Hearing on ECPA Reform and the Revolution in Cloud Computing House Judiciary Committee, Subcommittee on the Constitution, Civil Rights and Civil Liberties September 23, 2010 at 8 (“A smooth transition to cloud computing requires users to continue feeling a sense of trust online.”)
- CISCO Comments to NTIA Internet of Things June 2,2016 at 22 (“end users must trust that their data is being securely transmitted”)
- Microsoft Comments to NTIA at 6 (March 13, 2017) ("One of the most significant barriers to adoption of new technologies such as IoT is a lack of consumer trust.")
- Tom Standage, The Victorian Internet 118 (2007) ("Worries about the security of telegraphic money transfers were holding back the development of on-line commerce ('The opportunities for fraud has been the chief obstacle,' declared the Journal of the Telegraph in 1872)")
- as against whom
- Individuals Goofing Off
- Discontinuity of protections
- Corporations versus Governments
Information Collection Concerns
Looks a lot like the FIPPs (Fair Information Practice Principles)
From N. Doty, D. Mulligan, E. Wilde, Privacy Issues of the W3C Geolocation API, UC Berkeley School of Information Reports 2010-038, February 2010
- "Appropriateness: Is the collection of location information appropriate given the context of the service or application?
- "Minimization: Is the minimum necessary granularity of location information distributed or collected?
- "User Control: How much ongoing control does the user have over location information? Is the user a passive receiver of notices or an active transmitter of policies? Are there defaults? Do they privilege privacy or information ow?
- "Notice: Can requesters transmit information about their identity and practices? What information is required to be provided to the user by the requesting entity? What rules can individuals establish, attach to their location information and transmit? Is there a standard language for such rules?
- "Consent: Is the user in control of decisions to disclose location information? Is control provided on a per use, per recipient or some other basis? Is it operationalized as an opt-in, opt-out or opt model?
- "Secondary Use: Is user consent required for secondary use (a use beyond the one for which the information was supplied by the user)? Do mechanisms facilitate setting of limits or asking permission for secondary uses?
- "Distribution: Is distribution of location information limited to the entity with whom the individual believes they are interacting or is information re-transmitted to others?
- "Retention: Are timestamps for limiting retention attached to location information? How can policy statements about retention be made?
- "Transparency and Feedback: Are flows of information transparent to the individual? Does the specification facilitate individual access and related rights? Are there mechanisms to log location information requests and is it easy for individuals to access such logs
- "Aggregation: Does the standard facilitate aggregation of location information on specific users or users generally? Does the specification create persistent unique identifiers?
Right to Privacy / Media
- Invasion of privacy is a claim as against media and news reporters, arguing that the reported-on material was not newsworthy and constituted an invasion of privacy.
- Whether something is newsworthy has frequently been defined by the courts with the circular logic of whether it appeared in the news. A foundation of the courts' hesitancy to find truthful news reporting an invasion of privacy is the First Amendment.
- Restatement of Torts
- "When the subject matter is of legitimate public concern, there is no invasion of privacy.”
- "It seems clear that the common law restrictions on recovery for publicity given to a matter of proper public interest will now become part of the constitutional law of freedom of the press and freedom of speech. To the extent that the constitutional definition of a matter that is of legitimate concern to the public is broader than the definition given in any State, the constitutional definition will of course control."
- Restatement (Second) of Torts § 652D, comment d (1977).
- The Restatement sets forth four privacy interests:
- Misappropriation of one's likeness
- Intrusion of one's privacy
- Public disclosure of highly private information and
- False light
- Bartnicki v. Vopper, 532 U.S. 514 (2001).
- Landmark Comm., Inc. v. Virginia, 435 U.S. 829 (1978) “The article published by Landmark provided accurate factual information about a legislatively authorized inquiry pending before the Judicial Inquiry and Review Commission, and in so doing clearly served those interests in public scrutiny and discussion of governmental affairs which the First Amendment was adopted to protect." p 839.
- Cox Broadcasting v. Cohn, 420 U.S. 469 (1975) (no cause of action where newspaper published name of rape victim)
- New York Times v. United States, 403 U.S. 713 (1971) (the Pentagon Papers case; government could not enjoin publication of classified material).
- New York Times v. Sullivan, 376 U.S. 254 (1964) (defamation causes of action by public figures against news reporters must demonstrate that reporting was malicious or with reckless disregard to the truth).
- Griswold v. Connecticut, 381 U.S. 479 (1965)
- Winters v. New York, 333 U.S. 507 (1948) (striking down NY state law which prohibited publications “principally made up of criminal news, police reports, or accounts of criminal deeds of bloodshed, lust or crime.”)
- Pavesich v. New Eng. Life Ins. Co., 50 S.E. 68 (Ga. 1905) (public figures waive right to privacy)
- Erwin Chemerinsky, Rediscovering Brandeis’s Right to Privacy, 45 BRANDEIS L. J. 643 (2007)
- Jessica E. Jackson, Note: Sensationalism in the Newsroom: Its Yellow Beginnings, the Nineteenth Century Legal Transformation, and the Current Seizure of the American Press, 19 N.D. J. L. ETHICS & PUB. POL’Y 789 (2005)
- Patrick J. McNulty, The Public Disclosure of Private Facts: There is Life After Florida Star, 50 DRAKE L. REV. 93, 98 (2001)
- Rodney A. Smolla, Privacy and the First Amendment Right to Gather News, 67 GEO. WASH. L. REV. 1097 (1999)
- Randall P. Bezanson, The Right to Privacy Revisited: Privacy, News, and Social Change, 1890-1990, 80 CAL. L. REV. 1133 (1992)
- Jonathan B. Mintz, The Remains of Privacy’s Disclosure Tort: An Exploration of the Private Domain, 55 MD. L. REV. 425 (1996)
- John A. Jurata, Jr., Comment, The Tort That Refuses To Go Away: The Subtle Reemergence of the Public Disclosure of Private Facts Tort, 36 SAN DIEGO L. REV. 489 (1999)
- Robert C. Post, The Social Foundations of Privacy: Community and Self in the Common Law Tort, 77 CAL. L. REV. 957 (1989).
- Thomas I. Emerson, The Right of Privacy and Freedom of the Press, 14 HARV. CIV. RIGHTS CIV. LIB. L. REV. 329 (1979)
- Linda N. Woito & Patrick McNulty, The Privacy Disclosure Tort and the First Amendment: Should the Community Decide Newsworthiness, 64 IOWA L. REV. 185 (1979)
- William Prosser, Privacy, 48 CAL. L. REV. 383 (1960).
The Technology of Privacy: When Geeks Meet Wonks
- Taxonomy [Eckersley EFF When Geeks Meet Wonks]
- What Data is protected (what you read, where you go, when you go, who you are, what is your religion, sexual orientation, who you talk to, what you buy)
- Privacy as against whom? (corporations, advertisers, governments, family, spouses, employers, law enforcement, lawyers, identity thieves, mafia, stalkers, data brokers)
- Purpose of privacy? (protection from authoritarian governments, social intolerance, crime, individuals; what is the risk where privacy is compromised)
- Data where (in storage, in the cloud, on your computer, in transmission)
- Blocking Resistant Tools
- on Computer
- of Communications
- HTTPS (Web)
- Encryption of web access; protects against interception of web browsing (including web-based email), username and password interception, theft of financial information, ID theft, and account hijacking. Useful with WiFi access points.
- Email: Avoids interception in transmission (by an authoritarian regime); however, stored email remains subject to subpoena
- Problem: Not widely deployed, or not deployed correctly [Eckersley EFF When Geeks Meet Wonks]
- HTTPS Everywhere: Attempts to force websites into HTTPS mode
- SSL Observatory
- VPN (transmissions)
- Web Browser based
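The "not deployed correctly" problem noted above is largely a certificate-validation problem: TLS encrypts the channel, but only verifying the server's certificate and hostname ensures you are talking to the site you think you are. A minimal Python sketch of a correctly validating HTTPS fetch (the host name is illustrative):

```python
# Opening a TLS connection with certificate verification enforced.
# Skipping this validation is the classic "HTTPS deployed incorrectly":
# the traffic is encrypted, but the peer could be anyone.
import socket
import ssl

def fetch_over_https(host, path="/"):
    # create_default_context() verifies the certificate chain AND the hostname.
    context = ssl.create_default_context()
    with socket.create_connection((host, 443), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            request = f"GET {path} HTTP/1.1\r\nHost: {host}\r\nConnection: close\r\n\r\n"
            tls.sendall(request.encode())
            chunks = []
            while True:
                data = tls.recv(4096)
                if not data:
                    break
                chunks.append(data)
    return b"".join(chunks)

# fetch_over_https("example.com")  # raises ssl.SSLCertVerificationError on a bad cert
```

A client that instead disables verification would still "use HTTPS" yet offer no protection against an interposed attacker, which is the failure mode Eckersley's taxonomy flags.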
Do Not Track
- "Do not track" is a response to behavioral advertising. Do Not Track would be "a persistent cookie on a consumer's browser, and conveying that setting to sites that the browser visits, to signal whether or not the consumer wants to be tracked or receive targeted advertisements"
- "Do not track" flag in your client browser signally to advtisers or website that you do not want to be tracked.
- Protects against tracking individuals web viewing activity
- Only works where websites participate
- Can also log into advertisers site (Google Ads) and set tracking preferences
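Mechanically, the signal described above is just an HTTP request header: the browser sends `DNT: 1` with each request, and a participating site checks it before tracking the user. A minimal, hypothetical server-side check:

```python
# Sketch of honoring the Do Not Track signal on the server side.
# The DNT request header carries "1" (user opts out of tracking) or
# "0" (user permits tracking); absence means no preference expressed.

def tracking_permitted(request_headers):
    """Return False only when the client sent DNT: 1."""
    return request_headers.get("DNT") != "1"

# Hypothetical request headers:
print(tracking_permitted({"DNT": "1"}))   # False - user opted out
print(tracking_permitted({"DNT": "0"}))   # True
print(tracking_permitted({}))             # True - no preference is not an opt-out
```

This also makes the "only works where websites participate" limitation concrete: nothing stops a non-participating server from ignoring the header entirely.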
- For Release: 12/01/2010, FTC Staff Issues Privacy Report, Offers Framework for Consumers, Businesses, and Policymakers; Endorses "Do Not Track" to Facilitate Consumer Choice About Online Tracking
- Federal Trade Commission (Bureau of Consumer Protection) A Preliminary FTC Staff Report on Protecting Consumer Privacy in an Era of Rapid Change: A Proposed Framework for Businesses and Policymakers (December 1, 2010)
- Text of the FTC Staff Report, and Concurring Statements of Commissioner Kovacic and Commissioner Rosch
- FTC Privacy Report : Remarks of Chairman Jon Leibowitz as Prepared for Delivery
- Dept of Commerce Internet Policy Task Force :: Commercial Data Privacy and Innovation in the Internet Economy: A Dynamic Policy Framework
- Prepared Statement of the Federal Trade Commission on Do Not Track, Presented by David Vladeck, Director, Bureau of Consumer Protection, Before the Subcommittee on Commerce, Trade, and Consumer Protection of the Committee on the Energy and Commerce, United States House of Representatives (December 2, 2010) Text of the Commission Testimony :: For Release: 12/02/2010 FTC Testifies on Do Not Track Legislation
- EFF on Do Not Track
- Do Not Track at W3C
- Do Not Track Workshop April 28/29 2011
- IETF Draft, Do Not Track: A Universal Third-Party Web Tracking Opt Out (March 7, 2011)
- CDT Do Not Track
- The Do Not Track website is maintained by Stanford researchers Jonathan Mayer and Arvind Narayanan, affiliated with the Computer Science department and the Center for Internet and Society.
- Mahmood Sharif Carnegie Mellon University (Do Not) Track Me Sometimes: Users' Contextual Preferences for Web Tracking
- J. Mayer, A. Narayanan, S. Stamm, Do Not Track: A Universal Third-Party Web Tracking Opt Out, IETF Internet-Draft (Mar. 7, 2011)
- Hannes Tschofenig, Rob van Eijk, DO NOT TRACK, An Attempt to Frame the Debate at W3C
- HR 654 Do Not Track Me Online (Rep. Speier) 112th Cong. "Requires the Federal Trade Commission (FTC) to promulgate regulations to establish standards for the required use of an online opt-out mechanism to allow a consumer to prohibit the collection or use of any covered information and to require a covered entity to respect the choice of such consumer to opt-out of such collection or use..."
- December 2010 Hearing
- Dingell Examines the Feasibility of Do Not Track Legislation, December 2, 2010
- "Do-Not-Track" Legislation: Is Now the Right Time? Testimony of Daniel J. Weitzner Associate Administrator for Policy Analysis and Development National Telecommunications and Information Administration United States Department of Commerce Before the Subcommittee on Commerce, Trade and Consumer Protection Committee on Energy and Commerce United States House of Representatives December 2, 2010
- FTC Testifies on Do Not Track Legislation 12/02/2010
- The testimony states that while some in the industry have taken steps to improve consumer control of behavioral advertising, industry efforts have largely fallen short. Given the limitations of existing mechanisms, "the Commission supports a more uniform and comprehensive consumer choice mechanism for online behavioral advertising," sometimes referred to as "Do Not Track."
The most practical way to do that "would likely involve placing a setting similar to a persistent cookie on a consumer's browser, and conveying that setting to sites that the browser visits, to signal whether or not the consumer wants to be tracked or receive targeted advertisements," according to the testimony.
The testimony states that such a mechanism could be accomplished through legislation or potentially through robust, enforceable self-regulation. "If Congress chooses to enact legislation, the Commission urges Congress to consider several issues," including:
- It should not undermine the benefits online behavioral advertising provides consumers, including funding content and services;
- Unlike the FTC's Do Not Call Registry for telemarketers, it should not require a registry of unique identifiers; rather, the Commission recommends a browser-based mechanism;
- It should consider an option that lets consumers choose to opt out completely or to choose certain types of advertising they wish to receive or data they are willing to have collected about them;
- The mechanism should be simple, and easy to find and use;
- The FTC should be given Administrative Procedures Act rulemaking and the ability to fine violators to "provide a strong incentive for companies to comply with any legal requirements, helping to deter future violations."
- Dec 2, 2010: Rep. Markey Opening Statement at hearing on Do Not Track legislation
- Dec 2, 2010: Markey to Introduce Legislation to Protect Children's Online Privacy
© Cybertelecom