As a medical doctor, I extensively use digital voice recorders to document my work. My secretary does the transcription. As a cost-saving measure, the process is soon to be replaced by AI-powered transcription, trained on each doctor's voice. As I understand it, the model created is not stored locally and I have no control over it whatsoever.

I see many dangers, as the model is trained on biometric data and could possibly be used to recreate my voice. Of course I understand that there are probably other recordings of me on the Internet, enough to recreate my voice, but that's beside the point. Also, this question is about educating them, not a legal one.

How do I present my case? I'm not willing to use a non-local AI to transcribe my voice. I don't want to be perceived as a paranoid nutcase. Preferably, I want my bosses and colleagues to understand the privacy concerns and dangers of using a "cloud solution". Unfortunately they are totally ignorant of the field of technology, and the explanations/examples need to translate to the layperson.

  • DontMakeMoreBabies@kbin.social · 11 months ago

    You’re going to lose this fight. Admin types don’t understand technology and, at this point, I imagine neither do most doctors. You’ll be a lone voice in the minority because your concerns aren’t concrete enough, and ‘AI is so cool. I mean, it’s in the news!’

    Maybe I’m wrong, but my organization just went full ‘we don’t understand AI so don’t use it ever,’ which is the other side of the same coin.

    • FlappyBubble@lemmy.ml (OP) · 11 months ago

      I understand the fight will be hard, and I’m not getting into it unless I can present something they will understand. I’m definitely in a minority, both among the admin staff and my peers, the doctors. Most are totally ignorant of the privacy issue.