this post was submitted on 14 Aug 2025
90 points (98.9% liked)

AI is not benign.

top 2 comments
lime@feddit.nu 18 points 2 days ago (last edited 2 days ago)

just like how lumberjacks get worse at using an axe after leaning on chainsaws.

Edit:
just to be a little less facetious, i'll note that this is not related to the current ai hype at all. the medical field has been using machine learning for well over two decades at this point, and generally in the form of classifiers rather than generators. you feed it a bunch of x-rays along with labels for whether they show, say, lung cancer, and the system learns to automatically sort things that look normal from things that don't. this is a good thing because it means doctors can spend more time with patients. doctors also got worse at manually diagnosing broken bones when x-ray machines became common.
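a toy sketch of that kind of classifier in python, just to make it concrete (everything here — the two features, the data, the scikit-learn model — is invented for illustration; it's the shape of the workflow, not the actual medical system):

```python
# toy triage classifier: sort scans into "looks normal" vs "worth a closer look".
# the two numeric features and all of the data below are invented for illustration;
# a real system would learn from far more scans and far richer features.
from sklearn.linear_model import LogisticRegression

# each row is one x-ray boiled down to two made-up characteristics,
# with a label saying whether it showed lung cancer (1) or not (0)
X_train = [[0.10, 0.20], [0.20, 0.10], [0.15, 0.30],   # normal scans
           [0.80, 0.70], [0.90, 0.60], [0.75, 0.85]]   # scans showing cancer
y_train = [0, 0, 0, 1, 1, 1]

clf = LogisticRegression().fit(X_train, y_train)

# new, unlabelled scans (one near each cluster): the model only decides
# which pile they land in; a doctor still reviews anything that gets flagged
for scan in [[0.10, 0.15], [0.85, 0.75]]:
    label = clf.predict([scan])[0]
    print(scan, "-> worth a closer look" if label == 1 else "-> looks normal")
```

the only job it has is deciding which pile a scan lands in; flagged ones still go to a doctor.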

Edit 2:
a classifier basically just cooks an image down to a few basic characteristics, then places it as a point on a graph, and checks whether it's above or below a line it has refined using other images. it looks like this:

[image: scatter plot of blue and red points with a line separating the two groups]

say blue dots are images that don't show lung cancer, and red dots are images that do. where each image ends up on the graph is based on some number of factors chosen by a medical professional. it doesn't have to be 2D; it can be any number of dimensions. then, using one of several standard methods, the machine learning algorithm figures out where to draw the line between blue and red. then, when you feed in a new image, it can tell you whether it's definitely in the blue area, and therefore normal, or maybe in the red area, and therefore worth a closer look by a doctor.
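to make the "which side of the line" part concrete, here's the same idea stripped down to plain python — the weights are made up, since in a real classifier the training step is what chooses them:

```python
# the "above or below a line" check, written out by hand.
# a trained classifier reduces an image to a point (x1, x2) and tests which
# side of a learned line w1*x1 + w2*x2 + b = 0 that point falls on.
# these particular weights are made up; training is what picks them.
w1, w2, b = 1.0, 1.0, -1.0

def which_side(x1, x2):
    score = w1 * x1 + w2 * x2 + b
    # negative score: the "blue" side of the line (normal);
    # zero or positive: the "red" side, worth a closer look by a doctor
    return "looks normal" if score < 0 else "worth a closer look"

print(which_side(0.2, 0.3))  # well below the line
print(which_side(0.7, 0.8))  # above the line
```

with more dimensions it's the same check, just against a plane (or hyperplane) instead of a line.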

hedgehog@ttrpg.network 2 points 2 days ago

Summary of my comment: the study showed that the AI tool in question was effective at its task, nothing more.

I didn’t read this particular article, but I recently read a different one about the same study. The study itself was paywalled, but I clicked into it and read the abstract and everything else that was freely available. As far as I could tell:

  • While the AI tool was in use, performance showed an immediate and sustained increase of 24% relative to baseline
  • Immediately after the tool was taken away (following three months of use), performance was 20% lower than baseline
  • The study did not check what level performance returned to after three months without the tool, nor whether or when it returned to baseline
  • The study also did not compare this to the performance drop after returning from a three-month vacation
  • The study did not compare this to the performance drop from losing access to other tools

This outcome is expected when someone is given a tool that simplifies a process and then loses access to it. If I were writing code in Notepad and using _v2, _v3, etc. for versioning, was then given an IDE and git for three months, and then had to go back to my old ways with Notepad, I’d expect to be less effective than I had been. I’d have been relying on syntax highlighting, so I’d be paying less attention to the specifics of the monochrome text than I used to. I’d have fallen out of practice with the version-naming techniques I used to rely on. All of the stuff I did to make up for having worse tooling, I’d be out of practice with.

But that doesn’t mean that I should use worse tools.