
OpenAI’s Whisper tool may add fake text to medical transcripts, investigation finds.

top 10 comments
[-] ChihuahuaOfDoom@lemmy.world 15 points 1 week ago

Regular transcription software is finally respectable (the early days of Dragon NaturallySpeaking were dark indeed). Who thought tossing AI into the mix was a good idea?

[-] SpikesOtherDog@ani.social 13 points 1 week ago

I work in judicial tech and have heard questions about using AI transcription tools. I don't believe AI should be used in this kind of high-risk area. The ones asking whether AI is a good fit for court transcripts can be forgiven, because all they see is the hype, but if the ones responding greenlight a project like that, there will be some incredibly embarrassing moments.

My other concern is that the court would have to run the service locally. There are situations where a victim's name or other information is redacted. That information should not be on an OpenAI server, and it should not be regurgitated back out when the AI misbehaves.
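For what it's worth, the open-weights Whisper model can be run entirely on-prem, which at least keeps the audio off anyone else's servers. A minimal sketch, assuming the open-source whisper Python package (the model size and file name here are just placeholders):

```python
# Minimal sketch: transcribing audio with the open-source Whisper model
# entirely on a local machine, so redacted material never leaves the host.
# Assumes `pip install openai-whisper`; "hearing_audio.wav" is a placeholder path.
import whisper

model = whisper.load_model("base")              # weights download once, then stay cached locally
result = model.transcribe("hearing_audio.wav")  # inference runs on the local CPU/GPU
print(result["text"])                           # nothing is sent to OpenAI's servers
```

That doesn't fix the hallucination problem, of course; it only addresses where the data goes.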

[-] FatCrab@lemmy.one 2 points 1 week ago

Don't court stenographers basically use tailored voice models and voice-to-text transcription already?

[-] SpikesOtherDog@ani.social 2 points 1 week ago

I don't get too technical with the court reporter software. They have their own license and receive direct support from their vendor. What I have seen is that there is an interpreting layer between the stenographer's machine and the software, literally called "magic" by the vendor, that works a bit like predictive text. In this setup, the stenographer is actively recording and interpreting the results.
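To give a rough idea of what a predictive-text-style interpretation layer does, here is a made-up toy, not the vendor's actual "magic" layer; the chords and dictionary entries are invented:

```python
# Purely illustrative: steno chords are looked up in a dictionary, and unknown
# chords fall back to the raw chord in brackets so the reporter can correct
# them later. Real CAT software is far more sophisticated than this.
STENO_DICT = {
    "KORT": "court",
    "RORPT": "reporter",
    "-S": "is",
}

def interpret(chords: list[str]) -> str:
    """Translate a stream of steno chords into plain text."""
    return " ".join(STENO_DICT.get(chord, f"[{chord}]") for chord in chords)

print(interpret(["KORT", "RORPT", "-S"]))  # -> "court reporter is"
```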

[-] EleventhHour@lemmy.world 10 points 1 week ago* (last edited 1 week ago)

Private hospitals care about only one thing: profit. These error-ridden tools serve that purpose.

[-] SubArcticTundra@lemmy.ml 10 points 1 week ago

God I hope this isn't the AI plan that the NHS adopts

[-] ladicius@lemmy.world 8 points 1 week ago* (last edited 1 week ago)

This is the AI plan every healthcare entity worldwide will adopt.

No joke. They are desperate for shit like this.

[-] ShareMySims@sh.itjust.works 4 points 1 week ago

Errors and hallucinations are definitely serious concerns, but my biggest concern would be privacy. If my GP is using AI, I no longer see my medical information as private, and that is unacceptable.

[-] FigMcLargeHuge@sh.itjust.works 4 points 1 week ago

If anyone needs to know the state of AI transcription, just turn on closed captioning for your local TV channel. It's atrocious, and I am sorry that people who need closed captioning are subjected to that.

[-] sgibson5150@slrpnk.net 1 point 1 week ago

Years ago, I worked in a tech role at a medical transcription company. It hadn't occurred to me that AI would render their jobs irrelevant. This used to be an area where women in particular could make decent money after a bit of training, and there were opportunities for advancement into medical coding and even hospital administration.

I worked with some good people. Hope they landed on their feet.

this post was submitted on 04 Nov 2024
93 points (98.9% liked)

Fuck AI


"We did it, Patrick! We made a technological breakthrough!"

A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.
