“A reasonably good clone can be created with under a minute of audio and some are claiming that even a few seconds may be enough.” Mom wary about answering calls for fear her voice will be cloned for a future virtual kidnapping scam.

  • BassTurd@lemmy.world · 50 points · 1 year ago

    Unless I know who you are, I’m not answering your call. 90% of the time it’s a robot and the other 10% can leave a voicemail.

    • Coliseum7428@kbin.social · 6 points · 1 year ago

      I have had calls from numbers similar to my own, and seen caller IDs for people who aren’t in my contacts. I haven’t picked them up, but the temptation to do so was there.

    • radix@lemm.ee · 6 points · 1 year ago

      Isn’t a voicemail worse for detecting deepfakes, since the fake doesn’t have to listen and respond dynamically?

      • BassTurd@lemmy.world · 21 points · 1 year ago

        I’m not personally concerned about getting duped by a deep fake. I just don’t want to talk to any robots, solicitors, or scammers.

    • bluekieran@lemmy.world · 2 points · 1 year ago

      You might know the number. My wife used to live in Kenya, and she renamed her “Mum”/“Dad” contacts after her parents once got a call from her stolen phone saying she’d been arrested and they needed to send money for bail.

    • 857@fedia.io · 1 point · 1 year ago

      I’ll go one further - unless it’s my doc, my wife, or my boss, I’m neither answering the call nor listening to the voicemail. That’s what easily skimmable voicemail transcription is for…

      I don’t love the privacy implications of transcribed voicemail, ofc, but it’s better for my own privacy/threat model than answering the phone to robots, scammers, and the rest. It’s also a hell of a lot better for my mental health than listening to them.

  • chamaeleon@kbin.social · 19 points · 1 year ago

    Real kidnappers will not be happy about this as deepfakes become more prevalent and ransom calls get ignored more and more. Do they have a union that can go on strike to raise awareness about this?

    • Johnny Utah@lemmy.dbzer0.com · 1 point · 1 year ago

      As awful as it sounds, this needs to be set up between family members. Agree on a phrase or code word to check and make sure they are who they say they are. This is already common with alarm system monitoring companies: they have to make sure the intruder isn’t the one answering the phone and saying it’s a false alarm.
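
      If you want to take the code-word idea one step further, here’s a rough, purely illustrative sketch of the same shared-secret check in Python (the function names and the example phrase are made up, not anything standard): the phrase itself is never written down anywhere, only a salted hash, so a stolen phone or notebook doesn’t leak it.

      ```python
      import hashlib
      import hmac
      import os

      # Illustrative only: a family "code phrase" kept as a salted hash,
      # so the phrase itself never sits in plain text on anyone's device.

      def enroll(code_phrase: str) -> tuple[bytes, bytes]:
          """Pick a random salt and keep only the salted hash of the agreed phrase."""
          salt = os.urandom(16)
          digest = hashlib.pbkdf2_hmac(
              "sha256", code_phrase.strip().lower().encode(), salt, 100_000
          )
          return salt, digest

      def verify(claimed_phrase: str, salt: bytes, digest: bytes) -> bool:
          """Check a phrase spoken back over the phone against the stored hash."""
          candidate = hashlib.pbkdf2_hmac(
              "sha256", claimed_phrase.strip().lower().encode(), salt, 100_000
          )
          return hmac.compare_digest(candidate, digest)

      if __name__ == "__main__":
          salt, digest = enroll("purple giraffe pancakes")       # made-up example phrase
          print(verify("purple giraffe pancakes", salt, digest))  # True
          print(verify("send the ransom now", salt, digest))      # False
      ```

      In practice the human version (just asking for the phrase) is what matters; the point of the sketch is only that the check rests on a shared secret, not on anything the caller’s voice alone can prove.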

  • RFBurns@lemmy.world · 4 points · 1 year ago

    The ‘hostage-taker’ will never be able to duplicate my family’s grammar and sentence structure quirks, so I won’t care how it “sounds”…

    • Flying Squid@lemmy.world · 5 points · 1 year ago

      Not even necessarily dead actors. They used AI to bring young Luke Skywalker back in The Book of Boba Fett. And it was not great, but it was serviceable. Now give it 10 years.

  • Raphael@lemmy.world · 5 up, 5 down · 1 year ago

    I’m pro-AI, but any technology that can lead to the creation of deepfakes must be explicitly banned.

    Naturally, we’re already talking about criminals, but you combat this issue the same way you combat school shootings: by banning the root of the issue and actively prosecuting anyone who dares acquire it illegally.

  • solarview@lemmy.world · 2 up, 9 down · 1 year ago

    Perhaps there should be government-controlled licenses for some technologies, like there are for gun ownership? Although there are probably all sorts of ways that could be circumvented. Not sure how best to control this, though.

    • Gray@lemmy.world · 11 up, 1 down · 1 year ago

      Ah yes, because making something illegal stops criminals from using it. Problem solved.

    • Barbarian@sh.itjust.works · 8 up, 2 down · 1 year ago

      Basically impossible.

      It’s against the ToS to use tools like TeamViewer to run tech support scams, for example, but people do it anyway. You can’t legislate against criminals: if they’re already breaking the law, why would they care about another law?

      The only way forward here is enforcement. There needs to be better coordination between governments to track down and prosecute those running the scams. There has been a lot of pressure on India, for example, to clean up its very lax cybercrime enforcement, but it’s very much an uphill battle.

    • SmashingSquid@notyour.rodeo · 4 points · 1 year ago

      Not really comparable to guns: making it harder to get a physical object is much different from preventing people from downloading software. Even 3D-printed guns require equipment and knowledge to make use of the download.