

Nothing good is allowed to happen ever again.
Please stop giving the concentration camp a cutesy name.
For anyone else who just went “Hang on–,” no, Judge Judy is alive.
(Absolutely nobody is protected by this. It’s just spying on adults “for the children.”)
Theft.
‘How stupid does he think we are?’ asks professional simpleton who promoted him.
Absolutely return to a big pile of MP3s. Just not for this silly bullshit reason. Control, privacy, and reliability matter. The carbon footprint of charging your phone does not.
Jesus Christ, it’s like wagging a finger over using a Game Boy Advance to play Game Boy Color games. It’s 5v instead of 3.3v! The letterboxing wastes power on black pixels! You’re gonna burn through those AAs slightly faster!
I think one serving of tofu has a bigger carbon footprint than a whole day of streaming video.
Konrad Zuse invented the first proper computer.
Alan Turing later invented computing.
This distinction is why computer science exists.
From the “no shit” files.
‘Will the next big trend also happen to rely on GPUs?’
Almost certainly, just because there are so many of them. It’s an absurd amount of parallel computing power to throw at… whatever. Personally, I would bet on training different forms of neural network. Nothing so general that people call it “AI,” because the cult of MBAs will sour on that label. But running any model smaller than a gigabyte seems to be quick and efficient - and there are a lot of hard problems with short answers where good guesses work well enough.
Unless you meant ‘will the next big trend also inflate GPU prices,’ in which case, ha ha, no. They’re already wildly oversold. New ones aren’t that much better than old ones. Secondhand datacenter GPUs will suppress new bulk sales for a good while.
Tone policing is trolling, even when it’s not spat at polite and impersonal criticism. It’s a no-effort, all-purpose dismissal you’re liable to repeat in response to this comment, because the nature of bad faith is that there is no right answer, and engaging with arguments would be hard.
CUDA-on-AMD has happened half a dozen times, but it keeps getting shut down over legal fears.
CUDA is Nvidia’s worst anticompetitive abuse.
Doctor Rockso, hand sliding down: “I do AI.”
Serious question: is ChatGPT worse at ending stories than Stephen King?
Yes, he’s been saying ‘it’s the whole thing!’ from two years ago until this very headline… ignoring whatever doesn’t fit his monomania. DeepSeek gets an offhand reference and is not investigated further. Nvidia’s wild overvaluation doesn’t affect AMD, who was not invited, despite making goddamn near the same product. Every company he’s rambling about could disappear, and the tech would still be a big fucking deal.
Ed seems to think the tech itself is a bubble. It’s hard not to reflect on the aforementioned ‘people doubted the internet!’ crap, when he’s not just saying Pets.com and Webvan are unsustainable… ‘it’s the entirety of the web trade.’ Like it’s as empty as NFTs.
Meanwhile, you can download a thing and it does the thing. A startling variety of perverts have demonstrated that diffusion models work. That shit’s gonna follow the same trajectory as CGI, from ‘ugh, they used computers’ to ‘holy shit how’d they do that.’ Language models are a lot dumber than we’d hoped, but the fact they work at all is borderline miraculous, and they will half-ass anything you ask for.
None of this is going to make anyone a trillionaire. But knowing that is a long way from becoming a professional hater.
Ed embodies how anti-AI sentiments slide from “justifiable venting backlash” to “identitarian circlejerk.” And he’s putting it all up-front with a headline that concludes ‘yes I am over-reaching, my motte has no bailey, do not offer benefit of the doubt.’ Every post is the same rant about a few companies spending more than they make. He’s forgotten how to talk about anything else. He refuses to tolerate nuance regarding the underlying technology, as distinct from marketroid figureheads making shit up.
No kidding there’s a bubble. When it pops - the tech’s not going anywhere. Local models work fine. Pick any benchmark and they’re only a year behind the behemoths trying to justify their own scale. ‘What’s the next word?’ is the silliest fucking approach to a neural network, and it’ll still do anything you ask. Not perfectly! Sometimes, not even passably. And yet it moves.
Movie studios will in fact use video generators to describe CGI into existence. It’s already happened and nobody noticed. Audio filters will let two voice actors play a whole cast, none of whom sound like a specific human being. Music generators can spit out a three-minute song in four minutes. Bespoke subject, arbitrary genre, finished recording. Four minutes. Nitpicking the quality is kinda missing the point.
Little of that shows up as economic activity because the most interesting applications are local. For some reason nobody has even tried to sell local models. They just hand ‘em out. What people do with them seems to involve a lot of pornography… allegedly a major driver of tech adoption since shortly after the printing press. When corporate studios seek experts in generating video that would be impractical to film legally, I suspect they won’t look too closely at new hires’ resumes.
Generative AI Has No Business Model If It Can’t Do Software As A Service
Local models work fine.
Even these specific doomed companies could slash their VC-powered spending if they let customers run things on their own hardware. This whole thing only happened because consumer gaming cards are accidental supercomputers. Any of these companies could sell the open-weight models they are currently giving away. It’s just software. Products don’t have to be services.
These companies pursued scale to avoid competition, and it’s mostly worked. But they still got scooped by DeepSeek, by several orders of magnitude. The expected quality for any amount of training keeps going up. Half this money was spent on engineering greater efficiency. The other half was spent doing more of the more-efficient training. See: Jevons paradox.
If the hyperscalers cratered tomorrow, none of the weirdos tweaking published models would disappear. That decentralized infrastructure is noncommercial.
If Nvidia disappeared the day after… CUDA is not required for matrix algebra. Their monopoly on this whole mess was already a criminal scheme, fifteen fucking years ago. And Microsoft has demonstrated that quantization shenanigans make CPUs viable, if not competitive.
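For anyone wondering what “quantization shenanigans” means in practice: it’s shrinking model weights from 32-bit floats down to small integers, so the whole thing fits in ordinary RAM and chews through CPU-friendly integer math. A minimal sketch of symmetric int8 quantization, with NumPy standing in for any real inference stack (function names are mine, not from any particular library):

```python
import numpy as np

def quantize_int8(w):
    # Symmetric per-tensor quantization: map float32 weights onto
    # the int8 range [-127, 127] with a single scale factor.
    # Storage drops 4x versus float32; real stacks go lower still.
    scale = np.abs(w).max() / 127.0
    q = np.round(w / scale).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    # Recover approximate float weights for comparison.
    return q.astype(np.float32) * scale

w = np.array([0.5, -1.27, 0.03, 1.0], dtype=np.float32)
q, s = quantize_int8(w)
w_hat = dequantize(q, s)
# The rounding error per weight is at most half a quantization
# step (scale / 2) - small enough that big models mostly shrug.
```

The point of the trick isn’t the rounding itself; it’s that int8 (or smaller) weights fit in cache-friendly memory and run on plain CPU vector instructions, no CUDA required.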
I started writing this newsletter with 300 subscribers, and I now have 67,000 and a growing premium subscriber base.
But you’d never do anything for clicks. Like… re-write the same frothing opinion piece, three times a month, for two years. This man spent three thousand words, in this fifteen-thousand-word article, railing against tech-bro ‘people doubted the internet!’ crap. The existence of success doesn’t prove some new thing will succeed. There. Done.
No need for a wall of text, over and over and over.
Have a PNG version.
And another.
Have another.
Team Rocket coded.
The legal battle over arbitrary exclusion is a difficult fight by innocent victims.
Not having backups is a confession by morons with nobody to blame but themselves.
These two things can coincide.