this post was submitted on 30 Jul 2023
36 points (92.9% liked)
Out of the loop
Worldcoin is a company whose goal is to use AI to identify a person from a picture of their eyes.
They are a for-profit company, which is extremely bad, and they have also started doing things that can be seen as very bad, like installing orbs as works of art that can take pictures and scan the eyes of people who gaze into the lens of the ball. https://news.artnet.com/art-world/worldcoin-orb-ai-2341500
Their stated objective is to build a way to identify a person. They give each person who does it their eye hash and one crypto coin, I think named after the company. However, them being a for-profit company is very scary, as we cannot know what they are doing with that data or whether they will sell it.
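To make the "eye hash" idea concrete, here is a rough sketch of the general concept. This is not Worldcoin's actual pipeline (their real system is not described in this thread); it just shows, under the assumption that a scan is reduced to some byte template, how hashing that template yields a stable identifier. The function name and sample data are made up for illustration.

```python
# Hypothetical sketch only: NOT Worldcoin's real scheme.
# Assumes an iris scan has already been reduced to a byte "template".
import hashlib

def eye_hash(iris_template: bytes) -> str:
    """Derive a fixed hex identifier from a (hypothetical) iris template."""
    return hashlib.sha256(iris_template).hexdigest()

# The same eyes would always produce the same template, so the hash is a
# permanent identifier. Unlike a password, it can never be rotated.
template = b"example-iris-template"  # stand-in for real scan data
print(eye_hash(template))
```

The point of the sketch is the privacy argument made later in the thread: because the input (your eyes) cannot change, the resulting identifier is unrevocable once it leaks.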
Thanks for the answer!
I just don't get what their goal is, though. It seems like such a silly thing, like something made just to find a buyer and then retire for life.
Well, if their goal were to train an AI that could be integrated into mobile phones to scan people's eyes and unlock them, that would have been a great goal.
Though I'm not sure what they really want.
Maybe it's a way to get data and sell it later. They have the hash of the eyes of the people who got scanned, so in the future they could sell that (or get hacked), and advertising companies could use it with their AI to track people.
Maybe what they are trying to do is generate a lot of noise to increase the value of their crypto (no idea what crypto and identity have in common) and do a quick cash grab off people buying into it.
Maybe their real goal is making an AI capable of identifying everyone and selling that AI, as said above, for security devices but not for advertising.
Who knows. But right now the way they are collecting the data is pretty bad.
So, there are a lot of reasons we don't want our eye prints stolen.
One is that it makes it impossible for you to be anonymous in public.
And not for everyone else, just you: if you once looked at this art piece and happen to partially match a demographic that someone decided needs to be tracked carefully.
Another is establishing a credit score for you based on your eyeprint which can never be cleared, since you can't change your eyeprint.
And of course, there's reliably being sure they assassinated the right person, by verifying the eyeprint of the corpse against their database of eye prints.
There's lots of profitable uses for stolen eyeprints, and none of them are nice.
I can see the usefulness of being able to identify someone, to tell whether a picture is real or not, or to unlock your phone in a more secure way...
However, if that company starts to sell the identities of the people who volunteered for the AI training, then it's a huge privacy concern for those people.