Man, I really don't miss working in C++. Used to be my daily driver until I ended up in C# land. I understand why C++ is the way it is, I understand why it's still around and the purposes it serves, but in terms of the experience of using the language... I wouldn't want to go back.
A year ago I swapped out a 5800X for a 5800X3D to get more stable frame rates in Counter-Strike 2. It made a sizable difference, especially to 1% lows, so these large caches can clearly be a big boon. Granted, it's also obvious the game is poorly optimized; the gains look less significant for most other titles.
A bit optimistic I'd say. It's put some software engineering within reach of some people who couldn't do it prior. Where 'some' might be a lot, but still far from all.
I was thinking the other day of how things would go if some of my less tech savvy clients tried to vibe code the things I implement for them, and frankly I could only imagine hilarity ensuing. They wouldn't be able to steer it correctly at all and would inevitably get stuck.
Someone needs to experiment with that actually: putting the full set of agentic coding tools in the hands of grandma and recording the outcome.
It's still going to take a knowledgeable person to steer an LLM. The point is that code written entirely by humans is finished as a concept in professional work—if you're writing it yourself you're not working efficiently or employing industry best practice.
I think it's dramatic to say it's the end of hand-written code. That's like saying it's the end of bespoke suits. There are scenarios where carefully hand-written and reviewed code is still going to have merit - for example, the software for safety-critical systems such as space shuttles and stations, or core logic within self-driving vehicles.
Basically, when every single line needs to be reviewed extremely closely, the time taken to write the code is not a bottleneck at all - and with AI you would actually gain a bottleneck: the time spent stripping out the superfluous code it produces.
And my intuition is that the line between those two kinds of programming - call them careful and careless programming, to coin an amusing terminology - may not move as far as some think, and it definitely won't shrink to zero.
The code lets you shoot yourself in the foot in a lot more ways than a spec does, though. Few people would make specs that include buffer overflows or SQL injection.
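To make that concrete, here's a minimal sqlite3 sketch (table and names invented for illustration) of the kind of foot-gun code allows but a spec almost never would:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("INSERT INTO users VALUES ('alice')")

def find_user_unsafe(name):
    # String interpolation: input becomes part of the SQL itself.
    return conn.execute(
        f"SELECT name FROM users WHERE name = '{name}'"
    ).fetchall()

def find_user_safe(name):
    # Parameterized query: input is treated as data, never as SQL.
    return conn.execute(
        "SELECT name FROM users WHERE name = ?", (name,)
    ).fetchall()

print(find_user_unsafe("' OR '1'='1"))  # [('alice',)] - injection matches every row
print(find_user_safe("' OR '1'='1"))    # [] - no such user
```

A spec would just say "look up a user by name"; the injection only exists at the code level.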
That is akin to saying if you aren't using an IDE you are not working efficiently or employing industry best practice, which is insane when you consider people using Vi often run rings around people using IDEs.
AI usage is a useless metric, look at results. Thus far, results and AI usage are uncorrelated.
I keep hearing anecdata that suggest significant to huge productivity increases—"a task that would have taken me weeks now takes hours" is common. There is currently not a whole lot of research that supports that, however:
1) there hasn't been a whole lot of research into AI productivity period;
2) many of the studies that have been done (the 2025 METR study, for example) are both methodologically flawed and dated, not taking into account the latest frontier models;
3) corporate transitions to AI-first/AI-native organizations are nowhere near complete, making companywide productivity gains difficult to assess.
However, it isn't hard to find stories on Hackernews from devs about how much time generative AI has saved them in their work. If the time savings is real, and you refuse to take advantage of it, you are stealing from your employer and need to get with the program.
As for IDEs, if you're working in C# and not using Visual Studio, or Java and not using JetBrains, then no—you are not working as efficiently as you could be.
More broadly the dimension of time is always a problem in gamedev, where you're partially inching everything forward each frame and having to keep it all coherent across them.
It can easily, and often does, lead to messy Rube Goldberg machines.
There was a game AI talk a while back, I forget the name unfortunately, but as I recall the guy was pointing out this friction and suggesting additions we could make at the programming language level to better support that kind of time spanning logic.
This is more evident in games/simulations but the same problem arises more or less in any software: batch jobs and DAGs, distributed systems and transactions, etc.
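A toy sketch of what I mean (names and systems are made up): each system inches its slice of state forward every frame, and coherence across frames hinges on the order you call them in.

```python
class World:
    def __init__(self):
        self.position = 0.0
        self.velocity = 0.0
        self.target = 10.0

def ai_system(world, dt):
    # Decides velocity from where the entity currently is.
    world.velocity = 1.0 if world.position < world.target else 0.0

def physics_system(world, dt):
    # Integrates one frame's worth of movement.
    world.position += world.velocity * dt

world = World()
dt = 1.0 / 60.0
for frame in range(3):
    # Swap these two calls and behavior shifts by one frame -
    # the classic source of subtle cross-frame incoherence.
    ai_system(world, dt)
    physics_system(world, dt)
```

Multiply this by dozens of systems all reading each other's partially-updated state and you get the Rube Goldberg effect.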
This is what Rich Hickey (Clojure's author) has termed "place-oriented programming": the focus is on mutating memory addresses and having to synchronize everything, while failing to model time as a first-class concept.
I’m not aware of any general purpose programming language that successfully models time explicitly; Verilog might be the closest to that.
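For anyone unfamiliar with Hickey's framing, here's a minimal illustrative sketch (not any real library) of the alternative: instead of mutating a place, each tick derives a new immutable snapshot, so time becomes an explicit index into history.

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class State:
    tick: int
    position: float

def step(state, velocity, dt):
    # Pure transition: old snapshots stay valid and observable.
    return replace(state, tick=state.tick + 1,
                   position=state.position + velocity * dt)

history = [State(tick=0, position=0.0)]
for _ in range(3):
    history.append(step(history[-1], velocity=1.0, dt=0.5))

# Any past moment can still be inspected or compared - nothing
# is ever overwritten, so there's nothing to synchronize.
print(history[1].position)  # 0.5
```

This is roughly the epochal model Clojure's atoms/refs give you; the tradeoff is memory and the discipline of keeping transitions pure.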
> I’m not aware of any general purpose programming language that successfully models time explicitly
Step 1, solve "time" for general computing.
The difficulty here is that our periods are local out of both necessity and desire; we don't fail to model time as a first class concept, we bring time-as-first-class with us and then attempt to merge our perspectives with varying degrees of success.
We're trying to rectify the observations of Zeno, a professional turtle hunter, and a track coach with a stopwatch when each one has their own functional definition of time driven by intent.
> There was a game AI talk a while back, I forget the name unfortunately, but as I recall the guy was pointing out this friction and suggesting additions we could make at the programming language level to better support that kind of time spanning logic.
Sounds interesting. If it's not too much of an effort, could you dig up a reference?
A lot of folks wax sympathetic for the employees who've been laid off. But rare is the company which grows large and doesn't develop a lot of entropy in the process: hiring beyond its needs, bloating, and mismanaging resources.
Does the company owe a living to those people that it doesn't actually benefit from having on board? Sometimes it sounds like people think so.
1. It's more a matter of respect in the process than in the act. People are notified out of nowhere, irrespective of performance, and they need to quickly change many plans. Do that to a company and it's "unprofessional". The double standards are real.
2. Given the economic conditions, I am more sympathetic. Normally a large severance would be good reassurance that they'd land on their feet, but I see more and more devs (especially game devs) going through year-long gauntlets just to find something not as good. Tim, in comparison, will manage.
This will be an unpopular take, but I mostly agree with it. Always remember that you are not entitled to a job just because you need it to live. Always make sure you stay sharp and prepared for the worst case.
If you're not entitled to a job, then neither is anyone in the capital class entitled to own shit. They own the best houses and all the means of production - they're not entitled to any of that either.
Many of them, no. Tim's been in the weeds, but most billionaires inherited millions and had the money make money for them. They start in positions with more money than many will see in a full career.
A job is a mutually beneficial agreement between two parties. Either side can generally sever the agreement if it's not viewed as beneficial.
Owning things like houses and companies is more about the compact between people and the government. People are entitled to "own shit", because that's how our government is set up.
The topic of AI triggers people in various ways - anxiety and uncertainty about the future, frustration with excessive hype, and endless debate between people on each side of the fence.
It will calm down once the dust starts to settle and there's some kind of consensus on how the chips have fallen.
Also there is an irony that talking about being sick of talking about AI is still talking about AI.