FTC settlement with Ever orders data and AIs deleted after facial recognition pivot



The maker of a defunct cloud photo storage app that pivoted to selling facial recognition services has been ordered to delete user data and any algorithms trained on it, under the terms of an FTC settlement.

The regulator investigated complaints the Ever app — which gained earlier notoriety for using dark patterns to spam users’ contacts — had applied facial recognition to users’ photographs without properly informing them what it was doing with their selfies.

Under the proposed settlement, Ever must delete photos and videos of users who deactivated their accounts and also delete all face embeddings (i.e. data related to facial features which can be used for facial recognition purposes) that it derived from photos of users who did not give express consent to such a use.

Moreover, it must delete any facial recognition models or algorithms developed with users’ photos or videos.

This full suite of deletion requirements, covering not just the data but anything derived from or trained on it, is causing great excitement in legal and tech policy circles, with experts suggesting it could have implications for other facial recognition software trained on data that wasn't lawfully processed.

Or, to put it another way, tech giants that surreptitiously harvest data to train AIs could find their algorithms in hot water with the US regulator.

The quick background here is that the Ever app shut down last August, claiming it had been squeezed out of the market by increased competition from tech giants like Apple and Google.

However, the move followed an investigation by NBC News, which in 2019 reported that app maker Everalbum had pivoted to selling facial recognition services to private companies, law enforcement and the military (under the brand name Paravision), apparently repurposing people's family snaps to train face-reading AIs.

NBC reported Ever had only added a “brief reference” to the new use in its privacy policy after journalists contacted it to ask questions about the pivot in April of that year.

In a press release yesterday, reported earlier by The Verge, the FTC announced the proposed settlement with Ever received unanimous backing from commissioners.

One commissioner, Rohit Chopra, issued a standalone statement in which he warns that current-generation facial recognition technology is "fundamentally flawed and reinforces harmful biases", saying he supports "efforts to enact moratoria or otherwise severely restrict its use".

“Until such time, it is critical that the FTC meaningfully enforce existing law to deprive wrongdoers of technologies they build through unlawful collection of Americans’ facial images and likenesses,” he adds.

Chopra’s statement highlights the fact that commissioners have previously voted to allow data protection law violators to retain algorithms and technologies that “derive much of their value from ill-gotten data”, as he puts it — flagging an earlier settlement with Google and YouTube under which the tech giant was allowed to retain algorithms and other technologies “enhanced by illegally obtained data on children”.

And he dubs the Ever decision “an important course correction”.

Ever has not been fined under the settlement — something Chopra describes as “unfortunate” (saying it’s related to commissioners “not having restated this precedent into a rule under Section 18 of the FTC Act”).

He also highlights the fact that Ever avoided processing the facial data of a subset of users in states which have laws against facial recognition and the processing of biometric data, citing that as an example of "why it's important to maintain States' authority to protect personal data". (NB: Ever also avoided processing the biometric data of users in the EU, another region with data protection laws.)

“With the tsunami of data being collected on individuals, we need all hands on deck to keep these companies in check,” he goes on. “State and local governments have rightfully taken steps to enact bans, moratoria, and other restrictions on the use of these technologies. While special interests are actively lobbying for federal legislation to delete state data protection laws, it will be important for Congress to resist these efforts. Broad federal preemption would severely undercut this multifront approach and leave more consumers less protected.

“It will be critical for the Commission, the states, and regulators around the globe to pursue additional enforcement actions to hold accountable providers of facial recognition technology who make false accuracy claims and engage in unfair, discriminatory conduct.”

Paravision has been contacted for comment on the FTC settlement.

