All Things Privacy and Biometrics (Panel discussion highlights)

I spoke on a panel recently on the topic of all things privacy and biometrics. The Biometrics Institute hosted the discussion at the Australian Embassy in Washington, DC. The panel discussion was held under the Chatham House Rule, and, in order to foster open dialogue, the audience did not include any members of the press. The panel roster was impressive: Dr. Joseph Atick moderated, and the panelists included people from government, the private sector, and the NGO community.

The discussion ranged far and wide, covering passive and active collection of biometrics, consent, relative and direct identification, retail use of biometrics, risks of biometrics, and much more. Because I can only relay my portion of the conversation, I would like to broadly recap just a few of the key points I made and link to some excellent resources that came up in the discussion.

It’s About the Database

One of the points I discussed was the role and importance of biometric databases. A part of the discussion centered around databases that hold biometric templates. I stated then, and will likely state many times in the coming years, that a pivotal part of biometric privacy protections depends on how the biometric templates are created, stored, accessed, handled, secured, shared, and otherwise processed.

Many years ago, in discussions around contactless Federal ID cards for government employees in the US, the issue of biometric templates came up and was negotiated with great care. (See our comments and testimony for some background on this program.) It was then that I learned about the central importance of such databases, lessons that expanded when I did research in India on the Aadhaar card. (See my op-ed on the Aadhaar card for Foreign Policy Magazine, here.)

The Aadhaar card brings me to another database point: some biometric databases are well managed with regard to the biometric templates themselves. But attaching transactional history to biometric data, or storing an identifier with the data, presents new privacy risks and challenges. These transactional databases, too, need extensive privacy attention and safeguards.

Relative ID

I brought up the issue of relative identification on the panel. Relative ID in biometrics — and I refer especially to face recognition in my comments — means that the system will only recognize people it has seen before. My concern is that relative identification can move to absolute identification through the enrollment of additional information in a database.

The system determines that individual A is individual A based on a variety of identifiers, ranging from weak to strong. A system making a relative ID match typically won’t match the name, home address, or other detailed information about individual A with the face image. For this reason, relative ID is sometimes seen as somewhat more protective of privacy. CCTV surveillance systems often use relative ID to determine if people or vehicles return to the same location. This data can be gathered and stored for years, and can reveal copious patterns of behavior without necessarily ever revealing a name.

The issue is that relative ID presents its own concerns and, in addition, becomes more troublesome if it turns into absolute ID. A system containing information about individual A can identify individual A uniquely, and for other purposes, if it adds to the biometric information such as a credit card number or name associated with individual A through a corresponding record in a database. Enrolling an individual’s biometric and personal details in a database can be done without that person’s consent or knowledge.

So, relative ID can move to absolute ID quickly and silently. Meanwhile, individual A who has been tracked by relative ID for years can suddenly become a fully known and identified individual with years of history in the database. This issue needs attention and thoughtful policy work respectful of individual autonomy.

A good resource that came up during this part of the discussion was a study done by CSC — it discusses ID and biometrics in the context of retailers, and provides useful statistics. From the study: “27% of retailers are using facial recognition technology in-store. 25% of those are using facial recognition technology in-store to get existing customers back. This rose to 59% for fashion and apparel retailers. The larger retailers (101-250 stores) use facial recognition most frequently (43%).” From CSC, Next Generation In-Store Technology, Sept. 2015. A PDF version of the study is available here.

National Biometric ID Cards

Dr. Joseph Atick posed many difficult questions for the panel. In response to one question, I noted that any country with a national biometric ID card should have data protection legislation that corresponds with that card. Too many countries have national biometric ID cards without adequate legislation.

The most significant case in point is India. India has already issued more than 900 million unique biometric national ID cards, an extraordinary number. Regrettably, India does not have legislation around its ID card, called the Aadhaar card or UID. For more background on this issue, see our blog post on the recent Supreme Court of India decision regarding India’s Aadhaar card. The decision itself is here.

Fair Information Practices and “FIPs Plus”

One last point. Fair Information Practices came up in the discussion. Robert Gellman’s paper on the history of FIPs remains the best history and overview of this topic, and I recommend it highly. Even people who know a lot about FIPs will learn something new. Regarding FIPs, I have testified about something I call “FIPs Plus” and written about it in The Scoring of America report. I’ve also written about this idea in public comments to the FTC, and elsewhere.

In my comments to the FTC last year, I called FIPs Plus “statistical parity.” It is the idea of adding fairness around algorithmic processes to the standard FIPs. This is a complex topic with much nuance, but the basic idea is to facilitate fairness in the calculation and use of algorithmic factors. Biometrics protections could also comfortably fit into a “FIPs Plus” type of framework. Exploring FIPs plus additional protections around biometric templates, algorithms, the use of non-biometric data, and more would be productive, provided the work is informed by fact and solid technical knowledge and rests on a baseline of standard FIPs.

The Biometrics Institute discussion was a good start in bringing together many disparate views on a challenging topic. It is in everyone’s interest for a knowledgeable discussion to continue. The panel made it clear to me that some common ground does exist. There is still a lot of room for consensus, and that is an encouraging thought.

Pam Dixon

Executive Director, World Privacy Forum