

I’m @froztbyte more or less everywhere that matters
(I’ll read the rest of it later, as my brain boots and the day’s bullshit allows time)
merely from seeing the domain and author, probably all of it - casey newton’s got a real bad case of access syndrome, and keeps writing fluff/puff pieces uncritically amplifying tons of bayfucker nonsense
(it’s even beyond the usual levels of what one may refer to as useful idiot)
…I did it again. I looked.
oof.
others have said the bits that matter already, but for my part: what in the fuck kind of post is this
Google appears to have faked AI output in a commercial set to run during the Super Bowl.
google? lying in their autoplag ads? a fine tradition
… that the Gouda stat was “not a hallucination,” adding that “Gemini is grounded in the Web.”
it’s astounding how much this shows a lack of knowledge/grok regarding their own goddamn operations on the current web
Look, I get your perspective, but zooming out there is a context that nobody’s mentioning
I’m aware of that yeah, but it’s not a field I’m actively engaged in atm and not likely to be any time soon either (from no desire to work in it follows no desire to wade through the pool of scum). but also not really the place to be looking for insight. it is the place wherein to ridicule the loons and boosters
we should expect somebody to eventually demonstrate that the Transformers paradigm sucks
been wondering whether that or the next winter will get here first.
If you want to critique something, critique the gradient worship
did that a couple of years ago already, part of why I was already nice and burned out on so much of this nonsense when midjourney/stablediffusion started kicking around
it’s like whenever Chinese folks do anything the rest of the blogosphere goes into panic
[insert condensed comment about mentality of US/SFBA-influenced tech sector (and, really, it is US specifically; eurozone’s a somewhat different beast), american exceptionalism, sinophobia, and too-fucking-many years of “founder” stories]
it really is tedious though, yeah. when it happens, I try to just avoid some feeds. limited spoons.
but I’m so fucking tired of mathlessness
as you know, the bayfucker way (for getting on close to 20y now) is to get big piles of money and try to outspend your competition. why bother optimising or thinking about things if you can just throw another 87345243 computers at the problem? (I do still agree with you, but see above re desire and intent)
re the open source thing: it’s a wider problem than just that, and admittedly I’m peeved about it from this larger scope. I didn’t expound on it in my previous comment because (as above) largely not really the place. that said, soapbox:
there’s a thing I’ve been noticing as a creeping trend lately. I call it “open source veneer”, which is still a bit imprecise[0] but I think you’ll get what I mean. it’s the phenomenon of shit like this. of “projects” on github that are no more than a fancy readme and some “contributors” and whatnot, but no actual code (or ability to make full use of what is provided). of companies that build “open source” and then as soon as something (usually VC-/“earnings”-related decisions) happens, the entire project gets deeply buried (links disappear off main sites, leaving product/service only), actively hobbled (“oh you want to set this up yourself? glhf gfy”, done in oh so many ways[1]), or often even entirely disappeared[2]
[0] - still working through the thought, should probably write about it soon
[1] - backend codebases lagging because “not feature priority”, entirely missing documentation, wholly missing key sections of code which are “conveniently” left out, etc etc; examples off the top of my head: zotero, signal, firefox weave for a while. there’s plenty more if you look
[2] - been noticing this especially frequently with some security stuff, but it’s hardly the only example set
shot:
majority of it on the actual training (hardware, …)
chaser:
And that’s (supposedly) only $6M for Deepseek.
After experimentation with models with clusters of thousands of GPUs, High Flyer made an investment in 10,000 A100 GPUs in 2021 before any export restrictions. That paid off. As High-Flyer improved, they realized that it was time to spin off “DeepSeek” in May 2023 with the goal of pursuing further AI capabilities with more focus.
So where is the lie?
your post is asking a lot of questions already answered by your posting
[to the tune of Fort Minor’s Remember The Name]
10% senseless, 20% post
15% concentrated spirit of boast
5% reading, 50% pain
and a 100% reason to not post here again
literally begging people to relearn the terms shareware and freeware
couldn’t ask a bitcoin a question and get answers like you do with prompts! checkmate, atheists
“is this ideological project which has directly incentivised burning books and harming atypicals the same as the fascist projects which did the same? the answer may surprise you!”
weirdly early for the revisionist PR to start, though, they’re barely done setting shit on fire
the build artifact is distributed MIT-licensed, that’s substantially different (and intentionally subversive). there is no reproducibility. which, you know, hint hint nudge nudge that thing that I already said
I realize that outsourced thinking is why you want LLMs, but it clearly still doesn’t help. maybe you should try the old brainmeat. just stop huffing your farts first, those are bad for you
you do know that you don’t have to be a pliant useful idiot like this, right? doing the free “open source” PR repetition (when it’s none of that)? shit’s more like shareware (if even that - it certainly doesn’t have the same spiritual roots as shareware; for them it’s some shit thrown over the wall to keep the rabble quiet)
(it’d be nice if we could popularise something like how the kernel will go “tainted”, but unfortunately the entire fucking llm field already is, so we’d need a stronger word)
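(aside, for the unfamiliar: the kernel version of this is literally just a bitmask of “reasons to distrust this kernel”, exposed at /proc/sys/kernel/tainted. a minimal sketch of reading it, assuming a Linux box; only the handful of taint bits I’m reasonably sure of are decoded here)

```python
# minimal sketch: read the kernel taint bitmask and decode a few known bits
# (assumes Linux; only bits I'm reasonably confident about are listed)
from pathlib import Path

TAINT_BITS = {
    0: "a proprietary module was loaded",
    12: "an out-of-tree module was loaded",
    13: "an unsigned module was loaded",
}

def taint_reasons(path: str = "/proc/sys/kernel/tainted") -> list[str]:
    value = int(Path(path).read_text().strip())
    return [reason for bit, reason in TAINT_BITS.items() if value & (1 << bit)]

if __name__ == "__main__":
    reasons = taint_reasons()
    if not reasons:
        print("not tainted (at least by the bits checked here)")
    for r in reasons:
        print("tainted:", r)
```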
yeah, it’s a common refrain. has a very simple rebuttal too
sounds a bit like an xy question imo, and a good set of example answers would depend on the y part of the question - the whatever-it-is that (if my guess is right) your friend is actually looking to know/find
“AI” is branding, a marketing thing that a cadaverous swarm of ghouls got behind in the upswing of the slop wave (you can trace this by checking popularity of the term in the months after deepdream), a banner with which to claim to be doing something new, a “new handle” to use to try to anchor anew in the imaginations of many people who were (by normal and natural humanity) not yet aware of all the theft and exploitation. this was not by accident
there are a fair number of good machine learning systems and companies out there (and by dint of hype and market forces, some end up sticking the “AI” label on their products, because that’s just what this deeply fucked capitalist market incentivises). as other posters have said, medical technology has seen some good uses, there’s things like recommender[0] and mass-analysis system improvements, and I’ve seen the same in process environments[1]. there’s even a lot of “quiet and useful” forms of this that have been getting added to many daily-use systems and products all around us: reasonably good text extractors as a baseline feature in pdf and image viewers, subject matchers to find pets and friends in photos, that sort of thing. but those don’t get the headlines and the silly valuation insanity that much of the industry is in the midst of
[0] - not always blanket good, there’s lots of critique possible here
[1] - things like production lines that can use correlative prediction for checking on likely faults
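(to make [1] a bit more concrete, a toy sketch of what I mean by correlative prediction - the sensor readings and fault labels below are entirely made up for illustration, a real line would obviously want real historical data and proper validation)

```python
# toy sketch of correlative fault prediction on a production line
# (synthetic sensor readings and fault labels, purely illustrative)
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# pretend history: rows are production runs, columns are sensor readings
# (say vibration, temperature, motor current)
X = rng.normal(size=(500, 3))
# pretend faults correlate mostly with vibration and temperature
y = ((1.5 * X[:, 0] + X[:, 1] + rng.normal(scale=0.5, size=500)) > 1.0).astype(int)

model = LogisticRegression().fit(X, y)

# new batch of readings -> flag the runs that look likely to fault
new_readings = rng.normal(size=(5, 3))
fault_probability = model.predict_proba(new_readings)[:, 1]
for i, p in enumerate(fault_probability):
    print(f"run {i}: p(fault) = {p:.2f} -> {'check this one' if p > 0.5 else 'looks fine'}")
```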
come on don’t you like waiting 1s+ for every single action you ever want to take? it’s the hot new thing
ah yes, content from the well-known community mod Cursid Meier
1990s dialup: barely only once
it’s above-baseline among the tpots (at least relative to other areas I’ve observed it)
as I read it, it’s an attempted reference to economies of scale, under the thesis “AI silicon will keep getting cheaper because more and more people will produce it” as the main underpinning for how they’ll improve their unit economics. which, y’know, great! that’s exactly what people like to hear about manufacturing and such! lovely! it’s only expensive because it’s the start! oh, the woe of the inventor, the hard and expensive path of the start!
except that doesn’t hold up in any reasonable manner.
they’re not using J Random GPU, they’re using top-end purpose-focused shit that’s come into existence literally as co-evolution feedback from the fucking industry that is using it. even in some hypothetical future where we do just suddenly have a glut of cheap model-training silicon everywhere, imo it’s far far far more likely to be an esp32 situation than a “yeah this gtx17900 cost me like 20 bucks” situation. even the “consumer high end” of “sure your phone has a gpu in it” is still very suboptimal for doing the kind of shit they’re doing (even if you could probably make a great cursed project out of a cluster of phones doing model training or whatever)
falls into the same vein of shit as “a few thousand days” imo - something that’s a great soundbite, easily digestible market speak, but if you actually look at the substance it’s comprehensive nonsense