• 0 Posts
  • 154 Comments
Joined 1 year ago
Cake day: June 11th, 2023

  • Let me tell you something, folks. Scott Walker, he’s like a giant, stinking pile of shit. It’s unbelievable. Nobody’s seen anything like it before. You walk past it, and you know it’s bad right away. Everyone says it. People are talking about it. It’s huge, just sitting there, doing nothing, and it stinks—worse than anyone thought. And guess what? He thinks it’s good! Can you believe it? He’s out there, pretending like everything’s fine, while people can’t even stand to be around him. Total disaster, folks. Total mess. We’re going to clean it up, because that’s what we do. We clean up the mess left by people like Scott Walker, the human pile of shit. Believe me.




  • They’re supposed to be good at transformation tasks: language translation, create x in the style of y, replicate a pattern, etc. LLMs are outstandingly good at language transformation tasks.
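    A minimal sketch of what a “create x in the style of y” request looks like in practice. The helper name, message layout, and the commented-out client call are my own illustration of the common chat-message convention, not anything from a specific product:

    ```python
    # Toy sketch: building a "rewrite x in the style of y" prompt as a chat
    # message list. The helper and roles follow the common chat-API convention;
    # the function name is hypothetical.

    def build_style_transfer_messages(text: str, style: str) -> list[dict]:
        """Return chat messages asking a model to rewrite `text` as `style`."""
        return [
            {"role": "system", "content": "You rewrite text in a requested style."},
            {"role": "user", "content": f"Rewrite the following in the style of {style}:\n\n{text}"},
        ]

    messages = build_style_transfer_messages("It is raining today.", "a pirate")

    # These messages would then go to whatever chat-completion endpoint you use,
    # e.g. (illustrative only, model name is an assumption):
    # from openai import OpenAI
    # reply = OpenAI().chat.completions.create(model="gpt-4o-mini", messages=messages)

    print(messages[1]["content"])
    ```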

    Using an LLM as a fact-generating chatbot is actually a misuse. But they were trained on such a large dataset and have such a large number of parameters (175 billion!?) that they perform passably in that role… which is, at its core, filling in the call+response pattern of a conversation.
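    To make the “filling in a call+response pattern” point concrete, here’s a deliberately crude toy (nothing like a real LLM’s internals, and the data is made up for illustration): it always completes the pattern with whichever trained response best matches the prompt, with no notion of truth anywhere in the loop.

    ```python
    # Toy "pattern filler": picks the trained response whose prompt overlaps
    # most with the incoming prompt. No facts, no truth, just pattern matching.

    TRAINING_PAIRS = [
        ("what is the capital of france", "The capital of France is Paris."),
        ("what is the capital of spain", "The capital of Spain is Madrid."),
    ]

    def pattern_fill(prompt: str) -> str:
        """Return the trained response whose prompt shares the most words with ours."""
        words = set(prompt.lower().split())
        best = max(TRAINING_PAIRS, key=lambda pair: len(words & set(pair[0].split())))
        return best[1]

    # A familiar prompt pattern-matches to a correct-looking answer...
    print(pattern_fill("what is the capital of France"))   # → The capital of France is Paris.
    # ...but an unfamiliar one still gets *an* answer, confidently wrong:
    print(pattern_fill("what is the capital of Portugal"))  # → The capital of France is Paris.
    ```

    The failure mode in the last line is the whole point: the pattern gets completed whether or not the completion is factually right.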

    At a fundamental level it will never generate factually correct answers 100% of the time. That it generates correct answers more than 50% of the time is actually quite a marvel.