submitted 5 months ago* (last edited 5 months ago) by Five@slrpnk.net to c/lgbtq_plus@lemmy.blahaj.zone

Via LeopardsAteMyFace on Reddit

App covered by Them and Pink News

[-] ClockworkOtter@lemmy.world 20 points 5 months ago* (last edited 5 months ago)

I wonder if the AI is detecting that the photo is taken from further away and below eye level, which is more likely for a photo of a man, rather than looking at her facial characteristics?

[-] drcobaltjedi@programming.dev 17 points 5 months ago

Yeah, this is a valid point. Whether that's the exact cause here I don't know, but a lot of people don't realize how many weird biases can appear in training data.

Like that AI trained to detect if a mole was cancerous or not. A lot of the training images of cancerous moles had rulers in them, so the AI learned that rulers are cancerous.

I could easily see something stupid like the angle the picture was taken from being something the AI erroneously assumed was useful for determining biological sex in this case.
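
If anyone wants to see that failure mode in miniature, here's a toy sketch (made-up numbers, scikit-learn, nothing to do with the actual app) where a "ruler in photo" flag soaks up the model's weight just because it happens to track the label in the training set:

```python
# Toy illustration of a spurious correlation -- not real medical data.
# The "ruler present" flag matches the label so closely that the model
# leans on it instead of the genuine (but noisy) signal.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000
is_cancer = rng.integers(0, 2, n)                   # ground-truth label
real_signal = is_cancer + rng.normal(0, 2.0, n)     # genuine but noisy feature
ruler_present = np.where(rng.random(n) < 0.95,      # 95% of the time the ruler
                         is_cancer, 1 - is_cancer)  # shows up exactly when it's cancer

X = np.column_stack([real_signal, ruler_present])
clf = LogisticRegression().fit(X, is_cancer)
print(clf.coef_)  # most of the weight lands on the ruler column
```

The model isn't "wrong" given the data it saw; the ruler really was the best predictor in that training set. It just generalizes terribly.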

[-] Tyoda@lemm.ee 17 points 5 months ago

It's possible to manipulate an image in a way that the original and the modified one are indistinguishable to the human eye, but the AI model gives completely different results.
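
For the curious, a rough sketch of how the classic version of that (the fast gradient sign method) works — assuming some PyTorch image classifier, and the function and parameter names here are just made up for illustration:

```python
# Minimal FGSM sketch: nudge every pixel by a tiny +/- epsilon in whichever
# direction increases the model's loss. The change is invisible to a human,
# but it can flip the classifier's prediction.
import torch
import torch.nn.functional as F

def fgsm_perturb(model, image, label, epsilon=0.003):
    """Return a copy of `image` perturbed to work against `model`."""
    image = image.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(image), label)
    loss.backward()
    perturbed = image + epsilon * image.grad.sign()
    return perturbed.clamp(0, 1).detach()
```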

Like this helpful graphic I found

Or... edit the HTML...

[-] Alexstarfire@lemmy.world 6 points 5 months ago

You think someone would do that? Just go on the internet and lie?
