

also, holy fuck their post history is essentially nothing but unsubtle dogwhistles and pro-AI garbage
how about you go fuck yourself
The only thing LLMs typically are bad at
is everything. including summarizing research since it’s pretty fucking obvious you didn’t read shit. now fuck off
somehow it got even worse
Google appears to have faked AI output in a commercial set to run during the Super Bowl. The ad shows a business owner using Gemini to write a website description, but the text portrayed as generated by AI has been available on the business’s website since at least August 2020.
also they doubled down on the bad stat
The ad originally had Gemini present copy stating that Gouda accounts for “50 to 60 percent of the world’s cheese consumption” — which is not true. Google later edited the commercial to take out the stat, while the business owner also removed it from their website.
[…]
But Google maintained that the website description was written by Gemini all along. In addition to showing Gemini “generate” the description in the commercial, Google Cloud apps president Jerry Dischler said on X that the Gouda stat was “not a hallucination,” adding that “Gemini is grounded in the Web.”
also also they later doubled down on lying that Gemini wrote the whole page? it’s… really embarrassing that Google’s marketing team doesn’t know about web archives
standard “fuck off programming.dev” ban with a side of who the fuck cares. deepseek isn’t the good guys, you weird fucks don’t have to go to a nitpick war defending them, there’s no good guys in LLMs and generative AI. all these people are grifters, all of them are gaming the benchmarks they designed to be gamed, nobody’s getting good results out of this fucking mediocre technology.
this is utterly pointless and you’ve taken up way too much space in the thread already
It sounds to me like you have a very clear bias, and you don’t care at all about whether or not what they said is actually true or not, as long as the headlines about AI are negative
oh no, anti-AI bias in TechTakes? unthinkable
also:
So in that thinking, Wikipedia is not open source, if the editor used a proprietary browser?
fucking no! how in fuck do you manage to misunderstand LLMs so much that you think the weights not being reproducible is at all comparable to… editing Wikipedia from a proprietary browser??? this shit isn’t even remotely exotic from an open source standpoint — it’s a binary blob loaded by an open source framework, like how binary blob modules taint the Linux kernel (you glided right past this reference when our other poster made it, weird that) or how loading a proprietary ROM in an open source emulator doesn’t make the ROM open source. the weights being permissively licensed doesn’t make them open source (or really make any sense at all) if the source literally isn’t available.
my fucking god how have you missed the point this hard. fuck off
fuck off promptfan
off you fuck
what if none of it’s good, all of it’s fraud (especially the benchmarks), and having a favorite grifter in this fuckhead industry is just too precious
as the other poster pointed out: Iron Man 2 came out in 2010, and Musk’s personality cult was already in full swing since his PR team at the time did everything they could to associate him with the first film. I believe Charlie when he says his friend and fellow sci-fi author knew an asshole when he saw one.
my bad, I was working on the awful.systems psychic energy collector and it must have backlashed
fuck yes. I don’t think my brain knows what to do with the happy/excited neurotransmitters that news caused it to produce
this feels like a pattern too — so many naturally divergent or non-standard (from the perspective of a white American who thinks they own the English language) elements of writing are getting nonsensically trashjacketed as telltale signs that a text must be generated by an LLM. see also paully g trashjacketing “delve” for purely racist reasons and the authors of the Nix open letter having the accusation of LLM use leveled at them by people who didn’t read the letter and didn’t want anyone else to either.
React has entered the chat (don’t try talking to it yet though, it has to “asynchronously” load every individual UI element in the jankiest way possible)
I ask for the dumbest things like “decrease the padding on the sidebar by half” because I’m too lazy to find it
this is so much slower (in both keystrokes and raw time, not to mention needing to re-prompt) and much more expensive than just going into the fucking CSS and pressing the 3 buttons needed to change the padding for that selector, and the only reason why this would ever be hard is because they’re knee-deep in LLM-generated slop and they can’t find fucking anything in there. what a fucking infuriating way to interact with a machine.
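for the record, the “hard” version of this task is one value in one rule, something like this (selector name and numbers made up, since we obviously can’t see their stylesheet):

    /* before: whatever the sidebar rule currently says */
    .sidebar { padding: 16px; }

    /* after: “decrease the padding on the sidebar by half” */
    .sidebar { padding: 8px; }

no prompt, no round trip, no wondering what else the model decided to touch while it was in there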
it’s amazing how intensely these assholes want to end Wikipedia and pollute all other community information sources beyond repair. it feels like it’s all part of the same strategy:
It is not for those easily duped by what they find in their minds or online.
fuck I hate when my own brain dupes me into getting on the internet
courtesy of 404media: fuck almighty it’s all my nightmares all at once including the one where an amalgamation of the most aggressively mediocre VPs I’ve ever worked for replaces everything with AI and nobody stops them because tech is fucked and the horrors have already been normalized
it’s turning out the most successful thing about deepseek was whatever they did to trick the worst fossbro reply guys you’ve ever met into going to bat for them