• haruki@programming.dev · 3 points · 1 year ago

      It’s sad to see it spit out text from the training set without any actual knowledge of the date and time. It would be more awesome if it could call time.Now(), but that’d be a different story.
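The "call time.Now()" idea is essentially what tool calling does: the host application, not the model, runs the clock query and feeds the result back into the conversation. A minimal sketch in Go, where the tool registry and the tool name `current_time` are hypothetical, not any particular vendor's API:

```go
package main

import (
	"fmt"
	"time"
)

// tools is a hypothetical registry mapping tool names to functions the
// host runs on the model's behalf. Instead of relying on dates memorized
// from training data, a time question is answered by actually calling
// time.Now() at request time.
var tools = map[string]func() string{
	"current_time": func() string {
		return time.Now().UTC().Format(time.RFC3339)
	},
}

// callTool dispatches a tool request (e.g. one emitted by the model)
// and returns the tool's output, or false if no such tool exists.
func callTool(name string) (string, bool) {
	fn, ok := tools[name]
	if !ok {
		return "", false
	}
	return fn(), true
}

func main() {
	if out, ok := callTool("current_time"); ok {
		// The result would be spliced back into the model's context.
		fmt.Println("tool result:", out)
	}
}
```

The point of the indirection is that the model only has to produce the tool's name; the fresh, correct value comes from ordinary code.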

      • Blackmist@feddit.uk · 10 points · 1 year ago

        If you ask it today’s date, it actually does that.

        It just doesn’t have any actual knowledge of what it’s saying. I asked it a programming question as well, and each time it would make up a class that doesn’t exist. I’d tell it the class doesn’t exist, and it would go “You are correct, that class was deprecated in {old version}”. It wasn’t. I checked. It knows what the excuses look like in the training data, and just apes them.

        It spouts convincing sounding bullshit and hopes you don’t call it out. It’s actually surprisingly human in that regard.

        • tjaden@lemmy.sdf.org · 5 points · 1 year ago

          It spouts convincing sounding bullshit and hopes you don’t call it out. It’s actually surprisingly human in that regard.

          Oh great, Silicon Valley’s AI is just an overconfident intern!

          • ram@lemmy.ca · 2 points · 1 year ago

            Oh great, Silicon Valley’s AI is just a major tech executive!

        • panCatQ@lib.lgbt · 1 point · 1 year ago

          They’re mostly large language models. I’ve trained a few smaller models myself; they generally spit out the next word depending on the last word. Another thing they’re incapable of is spontaneous generation: they heavily depend on the question, or a preceding string! But most companies are already portraying it as AGI!
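The "next word depending on the last word" idea, taken literally, is a bigram model: count which word follows which, then always continue from the last word seen. A toy sketch in Go (the corpus and function names are made up for illustration; real LLMs condition on far more than one word, but the "needs a preceding string" limitation shows up the same way):

```go
package main

import (
	"fmt"
	"strings"
)

// buildBigrams counts, for each word in the corpus, how often each
// possible next word follows it.
func buildBigrams(corpus string) map[string]map[string]int {
	words := strings.Fields(corpus)
	counts := make(map[string]map[string]int)
	for i := 0; i+1 < len(words); i++ {
		if counts[words[i]] == nil {
			counts[words[i]] = make(map[string]int)
		}
		counts[words[i]][words[i+1]]++
	}
	return counts
}

// nextWord returns the most frequent continuation of the last word seen.
// With no preceding word there is nothing to condition on, so the model
// returns "" — it cannot generate spontaneously.
func nextWord(counts map[string]map[string]int, last string) string {
	best, bestN := "", 0
	for w, n := range counts[last] {
		if n > bestN {
			best, bestN = w, n
		}
	}
	return best
}

func main() {
	counts := buildBigrams("the cat sat on the mat the cat ran")
	// "cat" follows "the" twice, "mat" only once, so "cat" wins.
	fmt.Println(nextWord(counts, "the"))
}
```

Calling `nextWord(counts, "")` returns the empty string, which is the toy version of the dependence on a preceding prompt described above.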

    • panCatQ@lib.lgbt · 1 point · 1 year ago

      Well, obviously a language model is trained on old data; Google has been web-scraping data to provide this!