OpenAI’s Whisper tool may add fake text to medical transcripts, investigation finds.

  • ChihuahuaOfDoom@lemmy.world · 2 months ago

    Regular transcription software is finally respectable (the early days of Dragon NaturallySpeaking were dark indeed). Who thought tossing AI into the mix was a good idea?

  • SpikesOtherDog@ani.social · 2 months ago

    I work in judicial tech and have heard questions about using AI transcription tools. I don’t believe AI should be used in this kind of high-risk area. The ones asking whether AI is a good fit for court transcripts can be forgiven, because all they see is the hype; but if the ones responding greenlight a project like that, there will be some incredibly embarrassing moments.

    My other concern is that the court would have to run the service locally. There are situations where a victim’s name or other information is redacted. That information should not be on an OpenAI server, and it should not be regurgitated back out when the AI misbehaves.
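    To make the point concrete: even with a locally run speech-to-text model, redaction has to happen on the machine before anything is stored or shared. A minimal sketch of that step, assuming the transcript is already a local string (the `redact` helper and the sample names are hypothetical illustrations, not any real court software):

    ```python
    import re

    def redact(transcript: str, protected_terms: list[str]) -> str:
        """Replace each protected term (e.g., a victim's name) with a
        placeholder before the text ever leaves the local machine."""
        for term in protected_terms:
            transcript = re.sub(re.escape(term), "[REDACTED]",
                                transcript, flags=re.IGNORECASE)
        return transcript

    # Hypothetical transcript produced by a locally hosted model
    transcript = "The witness stated that Jane Doe arrived at 9 PM."
    print(redact(transcript, ["Jane Doe"]))
    # → The witness stated that [REDACTED] arrived at 9 PM.
    ```

    The key design point is ordering: redaction runs before any upload or retention step, so the protected term never reaches a third-party server in the first place.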

    • FatCrab@lemmy.one · 2 months ago

      Don’t court stenographers already use tailored voice models and voice-to-text transcription?

      • SpikesOtherDog@ani.social · 2 months ago

        I don’t get too technical with the court reporter software. They have their own license and receive direct support from their vendor. What I have seen is that there is an interpreting layer between the stenographer machine and the software, literally called “magic” by the vendor, that works a bit like predictive text. In that situation, the stenographer is actively recording and interpreting the results.

    • ladicius@lemmy.world · 2 months ago · edited

      This is the AI plan every healthcare entity worldwide will adopt.

      No joke. They are desperate for shit like this.

  • ShareMySims@sh.itjust.works · 2 months ago

    Errors and hallucinations are definitely serious concerns, but my biggest concern would be privacy. If my GP is using AI, I no longer see my medical information as private, and that is unacceptable.

  • FigMcLargeHuge@sh.itjust.works · 2 months ago

    If anyone needs to know the state of AI transcription, just turn on closed captioning for your local TV channel. It’s atrocious, and I am sorry that people who need closed captioning are subjected to that.

  • sgibson5150@slrpnk.net · 2 months ago

    Years ago, I worked in a tech role at a medical transcription company. It hadn’t occurred to me until now that AI would render those jobs irrelevant. This used to be an area where women in particular could make decent money after a bit of training, with opportunities for advancement into medical coding and even hospital administration.

    I worked with some good people. Hope they landed on their feet.