this post was submitted on 02 Sep 2025
48 points (100.0% liked)

Privacy


Clearview AI built a massive facial recognition database by scraping 30 billion photos from Facebook and other social media platforms without users' permission, which law enforcement has accessed nearly a million times since 2017[^1].

The company markets its technology to law enforcement as a tool "to bring justice to victims," with clients including the FBI and Department of Homeland Security. However, privacy advocates argue it creates a "perpetual police line-up" that includes innocent people who could face wrongful arrests from misidentification[^1].

Major social media companies like Facebook sent cease-and-desist letters to Clearview AI in 2020 for violating user privacy. Meta claims it has since invested in technology to combat unauthorized scraping[^1].

While Clearview AI recently won an appeal against a £7.5m fine from the UK's privacy watchdog, this was solely because the company only provides services to law enforcement outside the UK/EU. The ruling did not grant broad permission for data scraping activities[^5].

The risks extend beyond law enforcement use. Once photos are scraped, individuals lose control over their biometric data permanently. Critics warn this could enable:

  • Retroactive prosecution if laws change
  • Creation of unauthorized AI training datasets
  • Identity theft and digital abuse
  • Commercial facial recognition systems without consent[^1]

Sources:

[^1]: Business Insider - Clearview AI scraped 30 billion images from Facebook and other social media sites

[^5]: BBC - Face search company Clearview AI overturns UK privacy fine

top 7 comments
[–] GreenShimada@lemmy.world 1 points 4 days ago

Fun fact: if you go to Clearview's website to try and opt out, you can only opt out of individual images. So either you collect every possible image of yourself from the internet, or you fail.

[–] DmMacniel@feddit.org 7 points 1 week ago (3 children)

we should just blow up the internet at this point and start anew.

[–] Mika@piefed.ca 7 points 1 week ago

Honestly, we should just blow up the companies that do this.

Removing the web won't fix shit. Haven't you learned from cyberpunk? The next web is gonna be corpo-run and privacy would be worse than ever.

[–] TommySoda@lemmy.world 4 points 1 week ago* (last edited 1 week ago)

Honestly, I'm kinda down for that. Not for the sake of a new Internet, but for the sake of all the tech companies like Meta and Google that could just turn into absolutely nothing in an instant. Like the Thanos snap but for big tech.

[–] otacon239@lemmy.world 2 points 1 week ago (1 children)

I’m finna start a new internet! With Blackjack! And hookers!

[–] cyborganism@piefed.ca 1 points 1 week ago

You mean the dark web?

[–] AntiBullyRanger@ani.social 1 points 1 week ago

This article fails to mention shadow profiles Meta made from every candid photo Meta users took. So even more innocent folks are affected by this than the NPDs on it.