504
submitted 1 year ago by lntl@lemmy.ml to c/science@lemmy.ml

Was this AI trained on an unbalanced data set? (Only black folks?) Or has it only been used to identify photos of black people? I have so many questions: some technical, some on media sensationalism.

[-] nieceandtows@programming.dev 12 points 1 year ago

I don’t think this is some systematic racism. Rather, it’s the technology itself that’s lacking. I remember even those motion-activated bathroom sinks had problems working well with black hands. I think they’re just not good enough at differentiating between darkness and black skin.

[-] fishidwardrobe@mastodon.me.uk 10 points 1 year ago
[-] nieceandtows@programming.dev 1 point 1 year ago

Just because it can be doesn’t mean that’s the first place we go to. Stupidity is more likely than maliciousness and all that.

[-] fishidwardrobe@mastodon.me.uk 8 points 1 year ago

@nieceandtows ::gestures at mankind's enormously long and complex history of systemic racism::

People are going to read your post and assume that you are apologising for the police *because* they are the police. You know, one of those "they must have done something wrong or the police wouldn't have arrested them" people.

I'm sure you're not one of those. Just saying how it looks.

[-] nieceandtows@programming.dev 0 points 1 year ago

Eh. I have a loved one with some undiagnosed mental health issues, and it’s a constant struggle because they always assume the worst of anyone and everyone. Watching/living with them, I’ve learned that it’s always better to assume good about people than assuming bad about them. Assuming bad things without proof only ever ruins your happiness and relationships. People can read my comments and understand what I’m saying like you do. If they don’t and assume I’m racist, it only proves my point.

[-] fishidwardrobe@mastodon.me.uk 6 points 1 year ago

@nieceandtows Plenty of proof of systemic racism in the police if you want to see it.

[-] nieceandtows@programming.dev -1 points 1 year ago

Not talking about systemic racism in general. I know there's a lot of that. I'm talking about systemic racism causing this particular issue. I say that because there have been cases of motion sensors not detecting black hands due to technical issues. I'm not apologizing for anyone, just pointing out that it has happened before because of technical deficiencies.

[-] fishidwardrobe@mastodon.me.uk 2 points 1 year ago

@nieceandtows The fact that there have been issues with sensors (which is true) does not disprove systemic racism (which exists). That's like saying that because I put vinegar in the dressing the lemon juice wasn't sour. It doesn't follow.

[-] nieceandtows@programming.dev -1 points 1 year ago

Putting the same thing the other way around: The fact that there have been issues with systemic racism (which is true) does not disprove technical malfunction (which exists). That’s like saying that because the lemon juice is sour, it must have vinegar in it. It doesn’t follow. Lemon juice can be sour just because it has lemons in it, without any vinegar at all.

[-] fishidwardrobe@mastodon.me.uk 3 points 1 year ago

@nieceandtows But we know that there is systemic racism in the police. There *is* vinegar in it.

[-] nieceandtows@programming.dev 1 point 1 year ago

Again, I'm not disagreeing on systemic racism in the police at all. That is a big issue that needs to be solved. Just saying that this doesn't have to be related to it, because the technology itself has some issues like this. The vinegar is in the food, yes, but lemon is naturally sour. Even if there is no vinegar, it's gonna be sour. Attributing everything to vinegar wouldn't make food better. It would just make it difficult to identify issues with individual ingredients.

[-] gruff@stroud.social 1 point 1 year ago

@fishidwardrobe As far as the UK is concerned (re facial recognition), I recall the latest study found false positive rates that were disproportionately higher for Black people, and that the difference was statistically significant.

The UK police thought this acceptable and have continued the rollout of this tech. A judgement call that bakes a little bit more systemic racism into UK policing, with little to no accountability.

https://science.police.uk/site/assets/files/3396/frt-equitability-study_mar2023.pdf

PS. I'm not academically qualified to comment on the paper, but take an interest in these things.

@nieceandtows
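
For anyone wondering what "false positives disproportionately higher and statistically significant" means concretely, here's a minimal sketch with entirely made-up numbers (nothing below comes from the linked paper) of how a study like that compares per-group false positive rates:

```python
# Hypothetical counts (NOT from the linked study) illustrating how an
# equitability study compares false positive rates across groups.
# FPR = false positives / (false positives + true negatives)

watchlist_trials = {
    # group: (false_positives, true_negatives) -- made-up numbers
    "Group A": (4, 9996),
    "Group B": (22, 9978),
}

for group, (fp, tn) in watchlist_trials.items():
    fpr = fp / (fp + tn)
    print(f"{group}: FPR = {fpr:.4%}")

# A gap like this (0.04% vs 0.22%) would then be checked with a
# significance test (e.g. a two-proportion z-test) before concluding
# the system treats the groups differently.
```

Even a fraction of a percent matters at scale: scanning millions of faces turns a small FPR gap into a large absolute difference in wrongful matches.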

[-] vrighter@discuss.tchncs.de 9 points 1 year ago

Haha, this is reminding me of an episode of Better Off Ted, where they replaced all the sensors with optical-based ones that didn't recognize black people. Their solution was to hire white guys to follow them around to open doors and turn on lights for them.

[-] nieceandtows@programming.dev 2 points 1 year ago

That’s hilarious

[-] SevenSwell@beehaw.org 1 point 1 year ago

That show was so underrated. Pity it ended so quickly.

[-] cobra89@beehaw.org 8 points 1 year ago* (last edited 1 year ago)

IMO, the fact that the models aren't accurate with people of color but they're putting the AI to use for them anyway is the systemic racism. If the AI were not good at identifying white people, do we really think it would be in active use for arresting people?

It's not the fact that the technology is much, much worse at identifying people of color that is the issue; it's the fact that it's being used anyway despite that.

And if you say "oh, they're just being stupid and didn't realize it's doing that," then it's egregious that they didn't even check for that.

[-] nieceandtows@programming.dev 2 points 1 year ago

That part I can agree with. These issues should have been fixed before it was rolled out. The fact that they don’t care is very telling.

[-] DessertStorms@kbin.social 7 points 1 year ago

> it’s the technology itself that’s lacking.

The technology is designed by people, people who didn't consider those with dark skin and so designed a technology that is lacking.
Let's not act as if technology just springs spontaneously into being.

[-] lntl@lemmy.ml 3 points 1 year ago

I think it is some systematic racism. AI didn't arrest this person; police officers did. They did no further investigation before making the arrest because they didn't have to: the person has black skin. Case closed.

this post was submitted on 07 Aug 2023
504 points (97.7% liked)
