FTC proposes precedent-setting face recognition settlement: photo app company must delete consumers’ photos and the algorithmic models it developed using the photos

The FTC has proposed a crucially important settlement with a photo app developer, Everalbum Inc., which the FTC says “deceived consumers about its use of facial recognition technology and its retention of the photos and videos of users who deactivated their accounts.” 

The proposed FTC settlement will require the company to obtain consumers’ express consent before using face recognition technologies on their photos and videos. This is a significant decision in and of itself. Just as significant is a further step the FTC took in this case: the agency says the app developer must also delete the algorithmic models it developed using the photos and videos obtained without consent.

This additional requirement to delete the models is important. Allowing a company to keep using an algorithmic model built on data collected without proper consent effectively lets the company act improperly and still retain the gains from that behavior. Here, those gains take the form of a model trained on photos that the FTC says were used without proper consent. The FTC is also requiring that “face embeddings” derived from consumers’ photos be deleted.

FTC Commissioner Rohit Chopra wrote a statement accompanying the proposed settlement. In his statement, Commissioner Chopra writes:

… the FTC’s proposed order requires Everalbum to forfeit the fruits of its deception. Specifically, the company must delete the facial recognition technologies enhanced by any improperly obtained photos. Commissioners have previously voted to allow data protection law violators to retain algorithms and technologies that derive much of their value from ill-gotten data. This is an important course correction.

WPF has written about situations in which companies use face recognition in ways that run contrary to consumers’ privacy expectations. In our lengthy study of privacy at more than 5,000 US schools, from kindergarten through university, released last year, we examined face recognition in the education context and found, for example, that some companies use face recognition in ways that may surprise students and parents, depending on the data sets used and the policies of the companies and schools involved. When made final, this proposed FTC settlement will provide much-needed clarity on consumer biometric policy and will strengthen consumer privacy in the area of face recognition and, by extension, other biometrics.

It is not too much to say that this FTC decision is likely to have far-reaching impacts on privacy expectations regarding biometrics and consent.

Read: FTC Everalbum Proposed Settlement

See: WPF’s landmark school privacy study: Without Consent