There is no need for this many data centers.
LLMs as currently implemented are a scourge on humanity, and what will happen to all these data centers once they're no longer needed?
I can't wait until OpenAI, NVIDIA and Microsoft all go belly up.
But the negatives are spiraling out of control. Pollution, energy use, and the amplification of structural social problems like wealth stratification, authoritarianism, media manipulation...
With great power comes great responsibility, and we're living in an era in which our culture has shifted dramatically towards accepting immoral, short-sighted, and reckless behaviour.
Certainly commensurate with the price. It's up to the companies to bring the cost under the price.
AFAICT, fears of the marginal costs of LLM inference being high are dramatically overblown. The "water" concerns are outlandish, for one: a day of moderately heavy LLM usage consumes on the order of one glass of water, compared to a baseline consumption of about 1000 glasses/day for a modern human. And per acre, a data center's water usage is roughly the same as agriculture's.
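For what it's worth, here's the back-of-envelope arithmetic behind that claim. All the figures (glass size, query count, per-query water estimate) are assumptions for illustration, not measured values:

```python
# Back-of-envelope check of the "one glass per day" claim.
# Every number below is an assumed round figure, not a measurement.
GLASS_ML = 250                 # assumed size of "a glass" in mL
queries_per_day = 300          # assumed "moderately heavy" usage
water_ml_per_query = 1.0       # assumed on the order of ~1 mL per query

llm_glasses = queries_per_day * water_ml_per_query / GLASS_ML
baseline_glasses = 1000        # total human footprint, per the comment above

print(f"LLM usage: ~{llm_glasses:.1f} glasses/day")
print(f"Share of baseline: {llm_glasses / baseline_glasses:.2%}")
# → LLM usage: ~1.2 glasses/day
# → Share of baseline: 0.12%
```

Tweak the per-query figure up or down an order of magnitude and the share still stays well under a few percent of the baseline.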
I don't think anyone agrees on a single number for the water consumption: the higher estimates include a lot of wider externalities (such as the cost of training), while the lower estimates ignore them.
At what cost? See discussion here. And who bears the burden of that cost?
Sure, you can look away from the child labor behind the latest iPhones, or the lithium mines for those same phones and for electric cars destroying pristine tropical jungles and entire ecosystems; many folks do so very comfortably. Then some others don't.
You participate in $the_thing so surely you must support $the_thing, right?
I would get value from stealing, but I don't steal from people. The question isn't whether it has value to some people; it's whether the value to some people outweighs the costs imposed on others.
It’s no more useful than when Google and Stack Overflow were at their peak! All I want is to find docs. The coding performance is lackluster, oversold and underdelivered. Everything else in gen AI is dystopian.
The compute will find a use case; if the AI bubble bursts I'm sure all the excess capacity will be rerouted to crypto again. But also, there's still plenty of usage in chatbots or image / video generation, I'm not convinced that will just stop.
> Between long COVID and ai, nobody will be able to make fizzbuzz in Java, let alone code a frontend by hand.
I've been doing front-end stuff since getting free trials/demos of Dreamweaver and of the mac equivalent of Visual Basic* on a magazine cover CD with pocket money while in high school in the 90s.
IMO, the stuff you need on your CV as a front-end developer is much less productive than what we had back in the late 90s. Well, except for localisation (while Unicode technically existed back then, support for it seemed to be minimal) and version control. Everything else feels like a regression that has only been partially compensated for by hardware and network speed improvements.
If anything, AI will let us go back to actually performant systems, because the AI doesn't need to show off how many years of experience it has with Gorebyss-on-Arvados (or whatever other buzzword bingo you want to insert here).
Or think critically. Or write proper emails. Or a multitude of other things. Why bother when you can outsource everything to the computer? If this trend continues, it's gonna be interesting to see how people evolve in 10 or 15 years.
It may not seem like it now, but that's because a big chunk of the software industry is making money by introducing friction and preventing automation, because the user interface that sits between a person and some outcome they desire makes for a perfect marketing channel.
It kind of isn't? If I read your comment and rather than taking the time to think about what you said and respond to you I simply prompted one of the many tools to "write a comment that disagrees with TeMPOraL" something would be lost.
And the point of the computer is not to replace me everywhere it can. Also, automating something is one thing. It requires deliberate actions. Outsourcing is another thing.
I'm glad you wrote it up. Thanks! But I feel like the folks behind the HTML5 spec and the comprehensive test suite deserve the lion's share of the credit for this (very neat) achievement.
Most projects don't have a detailed spec at the outset. Decades of experience have shown that trying to build a detailed spec upfront does not work out well for a vast class of projects. And many projects don't even have a comprehensive test suite when they go into production!
Having a comprehensive spec and test suite is an absolute requirement; without one, all you've got is vibe-testing and LGTM feels. As shown by the OP, you can throw away the code and regenerate it from the tests and specs. Our old manual code is now the new machine code.
In the image they showed for the new one, the mechanic was checking a dipstick...that was still in the vehicle.
I really hope everyone is starting to get disillusioned with OpenAI. They're just charging you more and more for what? Shitty images that are easy to sniff out?
In that case, I have a startup for you to invest in. It's a bridge-selling app.
Sure but there are only a couple leading providers worth considering for coding at least, and there will be consolidation once investment pulls back. They may find a way to collude on raising prices.
Where switching will be easier is with casual chat users, plus API consumers already using substandard models for cost efficiency. But there will also always be a market for state-of-the-art quality.
>GPT‑5.2 sets a new state of the art across many benchmarks, including GDPval, where it outperforms industry professionals at well-specified knowledge work tasks spanning 44 occupations.
We built a benchmark tool that says our newest model outperforms everyone else.
Trust me bro.