this post was submitted on 03 Jul 2025
262 points (97.5% liked)

Technology


The Basque Country is implementing Quantus Skin in its health clinics after an investment of 1.6 million euros. Specialists criticise the artificial intelligence developed by the Asisa subsidiary due to its "poor" and "dangerous" results. The algorithm has been trained only with data from white patients.

top 41 comments
[–] D4MR0D@lemmy.world 26 points 3 weeks ago (1 children)

If someone with dark skin gets a real doctor to look at them, because it's known that this thing doesn't work at all in their case, then they are better off, really.

[–] ryannathans@aussie.zone 6 points 3 weeks ago

Doctors are best at diagnosing skin cancer in people of the same skin type as themselves; it's just a matter of familiarity. Black people should have black skin doctors for the highest success rates, and white people should have white doctors for the highest success rates. Perhaps the next generation of doctors will show broader success, but that remains to be seen in research.

[–] Tollana1234567@lemmy.today 13 points 3 weeks ago* (last edited 3 weeks ago)

You can't diagnose melanoma by skin features alone; you need a biopsy and genetic testing too. Furthermore, some types of melanoma don't show the typical ABCDE signs.

Histopathology gives the accurate indication of whether it's melanoma or something else, and how far it has spread in the sample.

[–] phoenixz@lemmy.ca 5 points 3 weeks ago (4 children)

Though I get the point, I would caution against calling "racism!" on AI not being able to detect moles or cancers well on people with darker skin; it's harder to see darker areas on darker skin. That is physics, not racism.

[–] zout@fedia.io 53 points 3 weeks ago (1 children)

The racism is in training on white patients only, not in the abilities of the AI in this case.

[–] TimewornTraveler@lemmy.dbzer0.com 6 points 3 weeks ago* (last edited 3 weeks ago)

if only you read more than three sentences you'd see the problem is with the training data. instead you chose to make sure no one said the R word. ben shapiro would be proud

[–] Dojan@pawb.social 1 points 3 weeks ago (1 children)

It is a direct result of structural racism, as it's a product of the treatment of white men as the default. You see it all the time in medicine. There are conditions that disproportionately affect black people that we don't know enough about because time and money haven't been spent studying them.

Women face the same problem. Lots of conditions present differently in women. One example is why women historically have been underrepresented in e.g. autism diagnoses: it presents differently, so for a while the assumption was made that women just can't be autistic.

I don't necessarily think that people who perpetuate this problem are doing so out of malice; they probably don't think of women/black people as lesser (hell, many probably are women and/or black), but it doesn't change the fact that structural problems require awareness and conscious effort to correct.

[–] phoenixz@lemmy.ca 1 points 4 days ago (1 children)

Again, no.

There are actual normal reasons that can explain this. Don't assume evil when stupidity (or in this case, physics) will do. Darker patches on darker skin are harder to detect, just as facial features on dark skin are harder to detect in the dark, because there is literally less light to work with.

Scream racism all you want but you're cheapening the meaning of the word and you're not doing anyone a favor.

[–] Dojan@pawb.social 1 points 4 days ago

Don’t assume evil when stupidity

I didn't, though? I think that perhaps you missed the "I don’t think necessarily that people who perpetuate this problem are doing so out of malice" part.

Scream racism all you want but you’re cheapening the meaning of the word and you’re not doing anyone a favor.

I didn't invent this term.

Darker patches on darker skin are harder to detect, just as facial features on dark skin are harder to detect in the dark, because there is literally less light to work with

Computers don't see things the way we do. That's why steganography can be imperceptible to the human eye, and why adversarial examples work when the differences cannot be seen by humans.

If a model is struggling at doing its job, it's because the data is bad, be it the input data or the training data. Historically, one significant contributor has been that the datasets aren't particularly diverse, and white men end up as the default. It's why all the "AI" companies popped "ethnically ambiguous" and other words into their prompts to coax their image generators into generating people who weren't white, and subsequently why these image generators gave us "ethnically ambiguous" memes and German Nazi soldiers who were black.

[–] Melvin_Ferd@lemmy.world -3 points 3 weeks ago (1 children)

Think more about the intended audience.

This isn't about melanoma. The media has been pushing yellow journalism like this regarding AI since it became big.

It's similar to how right wing media would push headlines about immigrant invasions. Hating on AI is the left's version of illegal immigrants.

[–] goldenbug@fedia.io 4 points 3 weeks ago (1 children)

Reading the article, it seems like badly regulated procurement processes with a company that did not meet the criteria to begin with.

Poor results on people with darker skin colour are a known issue. However, the article says its training data contained ONLY white patients. The issue is not hate against AI; it's about what the tools can do with obviously problematic data.

Unless the article is lying, these are valid concerns that have nothing to do with hating on AI; they have everything to do with the minimal requirements for health AI tools.

[–] Melvin_Ferd@lemmy.world -1 points 3 weeks ago* (last edited 3 weeks ago) (1 children)

Do you think any of these articles are lying or that these are not intended to generate certain sentiments towards immigrants?

Are they valid concerns to be aware of?

The reason I'm asking is: couldn't you say the same about any of these articles, even though we all know exactly what the NY Post is doing?

Compare it to posts on Lemmy with AI topics. They're the same.

[–] goldenbug@fedia.io 1 points 3 weeks ago (1 children)
[–] Melvin_Ferd@lemmy.world 0 points 3 weeks ago* (last edited 3 weeks ago)

Media forcing opinions using the same framework they always use.

Regardless of whether it's the right or the left. Media is owned by people like the Kochs and the Bannons and the Murdochs, even left-leaning media.

They don't want the left using AI or building on it. They've been pushing a ton of articles to left-leaning spaces using the same framework they use when it's election season and they're looking to spin up the right-wing base. It's all about taking jobs, threats to children, the status quo.

[–] Imgonnatrythis@sh.itjust.works 4 points 3 weeks ago (1 children)
[–] surewhynotlem@lemmy.world 2 points 3 weeks ago (1 children)

i only see a blank comment.