Public Comments: January 2012 – Regarding Face Facts: A Forum on Facial Recognition

 


Comments of the World Privacy Forum
To the Federal Trade Commission
Regarding Face Facts: A Forum on Facial Recognition, Project No. P115406

Via ftcpublic.commentworks.com/ftc/facialrecognition

Federal Trade Commission
Office of the Secretary
Room H–113 (Annex P)
600 Pennsylvania Avenue, NW.
Washington, DC 20580

January 31, 2012

The World Privacy Forum appreciates the opportunity to comment on the issue of facial recognition pursuant to the FTC Face Facts Workshop held on December 8, 2011. [1] The World Privacy Forum spoke on Panel 4 of the workshop, and those comments are already on the record. In these written comments, we would like to submit several key documents for the record and reaffirm several ideas from the workshop. The documents we include with these comments are the World Privacy Forum’s groundbreaking report on digital signage, The One-Way Mirror Society, and the consensus privacy principles for digital signage installations signed by the leading US consumer and privacy groups.

The World Privacy Forum is a non-profit, non-partisan public interest research and consumer education group. We are based in San Diego, California, and we focus our attention on a range of privacy topics, including technology, health care, finance, and workplace issues as well as emerging issues regarding big data and consumers. For more information see: www.worldprivacyforum.org.

 

I. Consent and Facial Recognition

In the workshop, one of the important issues highlighted in various respects was consumer consent. How is it best acquired? When should consent be required? We reiterate here what we stated at the time: the concept of a “walk-out opt-out” is not a viable way of managing consumer consent for facial recognition or detection technologies.

We note that the digital signage industry’s self-regulatory principles include the idea of a walk-out opt-out; we believe this approach is not workable for a number of reasons.

First, facial recognition technologies are not always readily visible to the human eye. Cameras are getting smaller, and deployments can be quite stealthy. Consumers cannot opt out of what they do not know exists.

Second, facial recognition (and detection) technologies are already being deployed widely in retail and other spaces. (See our report on digital signage, The One-Way Mirror Society, which is included in this document.) In five to ten years we expect these technologies to be ubiquitous in certain public areas, rendering the “walk-out opt-out” essentially moot. Instead of walking out or away from a single facial recognition or detection installation, consumers will in some circumstances face multiple instances of these technologies within short distances, and walking out or physically leaving a space will not be possible. Retail and public spaces that fit this scenario already exist.

Third, the walk-out opt-out model burdens consumers with having to control data collection. The onus for privacy protection should not fall entirely on the consumer. It is difficult to envision a consumer education campaign surrounding this issue that does not verge on parody: “Consumers: if you don’t want to have your face print taken, please leave the store.” Consumers should not have to wonder if and when their face print is being taken; it should not be a guessing game, and they should not have to edit their patronage of services based on fear of a camera or of the collection of facial biometrics.

In order to address this problem, some form of collection limitation or rules of the road will need to be imposed on the technology.

 

II. Face Prints and Consumer Rights

A striking development in the FTC hearings occurred on Panel 4, when Dr. Joseph Atick, a world-class biometrics expert, stated that consumers need to have the rights to their own face prints.

He said:

“The face print is ultimately the element to that allows a system to perform the identification of a person or even to temporarily know that this person is the same person that was in aisle three versus aisle seven. So if we begin to elevate the face print to the status of a PII and acknowledge its ownership to say that while my image may not be owned by me and can be taken by anybody in public, because my reasonable expectation of privacy doesn’t exist, my face print is supposedly an element, a unique code, that belongs to me. And therefore, if you are to exploit it in any way, by storing it in a database, you need my consent. By temporarily generating it and matching it against another instance in the last several hours, you need my consent. Therefore, in all of the analyses that we’ve heard today and the parting point for the International Biometric Industry Association has always been recognition of this code as the most critical element that needs to be protected.”

And:

“Again, we strongly believe that face recognition is a viable technology, is an important technology in society, and should have a role to play, but it should be part of responsible use. All of the problems that we have heard about today result from the treatment or mistreatment of a face print. I’ll drill this back home again. Face print is a biometric. Just like all biometrics, it should be considered as a PII, owned by the identity from which it was generated from, and it should enjoy the protection, one, vested upon it by the status of PII, second, the ownership rights from which it was derived. Everything else could legitimately be derived subject to these principles.” [2]

Dr. Atick’s approach provides an important avenue of thinking that we urge the FTC to explore further. We believe it holds significant promise and the most potential for a positive and fair outcome. The policy dialogue around facial recognition and detection technologies has been overlaid by approaches rooted in the technologies of past eras. Much of the policy discussion to date has not been informed by Dr. Atick’s level of knowledge, and as such has not taken sufficient account of the uniqueness of the face print, the nature of the technology, and the manner in which it is being deployed.

That a face print may be collected more readily and remotely than a fingerprint does not change the fact that it is a fundamental human biometric. Certainly, the ease of collection has lent this particular biometric to rapid commercial exploitation; but just because it can be used and is being used does not mean these uses should continue.

 

III. Digital Signage Privacy Principles for Consumers

The World Privacy Forum crafted a set of consensus principles around the use of facial detection and recognition technologies in digital signage in 2010. Seven leading consumer and privacy groups worked on these principles and signed on to them. We believe these principles provide a reasonable and balanced first step. The principles explicitly include language about children, which we believe is a key component; the issue of children and digital signage was not discussed at length in the FTC workshop. We also want to highlight that sensitive contexts remain an important consideration, for example, the non-treatment use of these technologies in health care settings.

Below are the Digital Signage Privacy Principles.

Digital Signage Privacy Principles (Originally published 2/25/2010)

New forms of sophisticated digital signage networks are being deployed widely by retailers and others in both public and private spaces. Capabilities range from simple people-counting sensors mounted on doorways to sophisticated, largely invisible facial recognition cameras mounted in flat video screens and end-cap displays. These digital signage technologies can gather large amounts of detailed information about consumers, their behaviors, and their characteristics.

Even though these technologies are quickly becoming ubiquitous in the offline world, few consumers, legislators, regulators, or policy makers are aware of the capabilities of digital signs or of the extent of their use. Currently there is little if any disclosure to consumers that information about behavioral and personal characteristics is being collected and analyzed to create highly targeted advertisements, among other things. The technology presents new problems and highlights old conflicts about privacy, public spaces, and the need for a meaningful debate. The privacy problems inherent with digital signage are profound, and to date these issues have not been adequately addressed by anyone.

Digital signage networks, if left unaddressed, have the potential to create a new form of secret and highly sophisticated marketing surveillance, with the prospect of unfairness, discrimination, and abuses of personal information. Industry has taken a small step with its draft code of conduct, but the concerns are too important to be left to industry control alone.

The consumer privacy principles below represent a starting point for discussion of what consumer protections need to be included in digital signage networks.

Scope: These principles apply to digital signage. Digital signage is a digital display, camera (including an endcap and a pinhole camera), sensor, network, or similar facility that collects data or images of an individual or of identifiable property owned by an individual and that is used by a commercial entity for targeting, information, entertainment, merchandising, or advertising purposes. A security camera used exclusively for security purposes is not digital signage.

Notice: All digital signage must have a readable label that clearly discloses its purpose to individuals in its vicinity.

Deletion: Any identifiable data about an individual collected from digital signage or linked to identifiable digital signage data by a digital signage operator or affiliate must be erased within 14 days of collection.

Privacy: The data must be subject to a privacy policy that addresses all eight fair information practice principles, and the privacy policy must be available at the time the images are collected.

Children: Any digital signage operator collecting images of or data about a child who appears to be under 13 must immediately erase all images of the child as well as any identifiable data about the child.

Prohibitions: No digital signage may be used in sensitive areas, including but not limited to bathrooms; areas where children congregate; changing rooms; locker rooms; or health care facilities, including gyms, health food stores, and areas where over-the-counter drugs are sold.

Display: No image or data of an individual from digital signage may be publicly displayed in a manner that would make the image or data visible to any person other than the subject of the image or data.

Accountability: A digital signage operator must be accountable for complying with these principles.

 

Pam Dixon,
World Privacy Forum

Jeff Chester,
Center for Digital Democracy

Michelle De Mooy,
Consumer Action

Susan Grant,
Consumer Federation of America

Deborah Pierce,
Privacy Activism

Ashley Katz,
Patient Privacy Rights

Beth Givens,
Privacy Rights Clearinghouse

 

 

Please continue reading the attached report, The One-Way Mirror Society: Privacy Implications of the New Digital Signage Networks.

 

 

____________________________

Endnotes

[1] <http://www.ftc.gov/bcp/workshops/facefacts/>.

[2] FTC Face Facts Transcript, <http://htc-01.media.globix.net/COMP008760MOD1/ftc_web/transcripts/120811_FTC_sess4.pdf>.