• adhocfungus@midwest.social · 15 points · 2 hours ago

    This is obviously funny, but I think the end result will be a bit sad. Spammers will start (or have already started) using similar AI programs to cold-call people, then transfer to a human scammer once they’ve got a live one. Eventually we’re just going to be heating the Earth so that invisible chatbots can have conversations no human will ever hear.

  • rumba@lemmy.zip · 21 points · 4 hours ago

    Sir, after the latest round of training, the new LLM hallucinates all the time, talking about nonsense that never happened, and every time you ask it a question it gets preoccupied with the first answer it comes up with and won’t take any more input.

    Wait, I have an idea…

  • .Donuts@lemmy.world · 27 points · 6 hours ago (edited)

    “Daisy” is claimed to be indistinguishable from a real person, fooling scammers into thinking they’ve found perfect prey thanks to its ability to engage in “human-like” rambling chat, the biz claims.

    lmao, okay. This will work for maybe a week, and then they’ll smarten up.

    • Hubi@feddit.org · 17 points · 4 hours ago

      I’ve watched an 18-minute video of an Indian scammer talking to an answering machine that just randomly repeated one of maybe 10 pre-recorded voice lines. This one is most likely much better and will steal hours of their time.

    • illi@lemm.ee · 4 points · 4 hours ago

      Idk, the elderly are prone to rambling on and sometimes not making much sense. Perfect use of AI if you ask me.

  • Cris@lemmy.world · 9 points · 4 hours ago (edited)

    Once again, one of the rare few areas where AI isn’t a completely shitty solution: tasks that are worthwhile and important, but that require labor no one is willing to pay for.

    Others include translation, transcription, and image descriptions. Things people won’t put resources into but that should happen anyway.

    The only problem is that these have nothing to do with why massive companies are investing in this tech. If AI didn’t enable the equivalent of money laundering for intellectual labor, the billionaires wouldn’t give a shit.

  • NeoNachtwaechter@lemmy.world · 10 points · 6 hours ago

    The Grandma-Honeypot, LOL.

    But I don’t believe in it. If I were the scammer, I would have maybe 2 or 3 of these lengthy talks with “her”, but afterwards I would recognize her and avoid her from then on.

      • NeoNachtwaechter@lemmy.world · 2 points · 3 hours ago

        That “fleet” would need many significant differences; when you recognize different people, it’s because many of their features differ. Even if your grandma talks to you in a different tone, you would still recognize her easily.

        • nyan@lemmy.cafe · 2 points · 3 hours ago

          Depends on how much effort the average scammer puts into remembering the prospective victims that don’t bite. My guess is that they don’t waste too many brain cells on that.