> It's like going 70 mph in a sleepy subdivision because a road sign on the interstate says you can go 70 there.
> Trump is taking a law that says "You can do X if Y" and saying "I can do X"
I think it's more like going 70 mph downtown because there's a sign saying "if on an interstate you can do 70 mph" -- the "if on an interstate" is pretty important there!
Do you have examples of deadlocks/livelocks you've encountered using SERIALIZABLE? My understanding was that the transaction will fail on conflict (and should then be retried by the application - wrapping existing logic in a retry loop can usually be done without _too_ much effort)...
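A minimal sketch of such a retry loop, in Python. `SerializationFailure` here is a hypothetical stand-in for whatever your driver raises on a serialization conflict (e.g. SQLSTATE 40001 in PostgreSQL); the key point is that the whole transaction body is re-executed from the top on each retry:

```python
import random
import time


class SerializationFailure(Exception):
    """Stand-in for the DB driver's serialization-conflict error
    (e.g. SQLSTATE 40001 in PostgreSQL)."""


def run_serializable(txn_fn, max_retries=5, base_delay=0.01):
    """Run txn_fn, retrying on serialization conflicts.

    txn_fn must contain the *entire* transaction (begin..commit),
    so that a retry re-runs it from the start against fresh state.
    Backoff is exponential with jitter to avoid retrying in lockstep.
    """
    for attempt in range(max_retries):
        try:
            return txn_fn()
        except SerializationFailure:
            if attempt == max_retries - 1:
                raise  # give up; surface the conflict to the caller
            time.sleep(base_delay * (2 ** attempt) * random.random())
```

The jittered backoff matters in practice: if two conflicting transactions retry immediately, they can keep colliding with each other ("lockstep" retries), which looks a lot like a livelock even though each individual attempt fails cleanly.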
Haven’t kept history from the bug tracker back that far, but we definitely hit some pretty awful issues in prod trying to solve race issues with “serialisable”. Big older codebases end up with surprising data access patterns.
I guess I'd say -- I think you're right that you shouldn't (ideally) be able to trigger true deadlocks/livelocks with just serializable transactions + an OLTP DBMS.
That doesn't mean it won't happen, of course. The people who write databases are just programmers, too. And you can certainly imagine a situation where you get two (or more) "ad-hoc" transactions that can't necessarily progress when serializable but can with read committed (ad-hoc in the sense of the paper here: https://cacm.acm.org/research-highlights/technical-perspecti...).
I’m not sure they were _introduced_ by switching to serialisable, but it meant some processes started taking long enough that the existing possibilities for deadlocks became frequent instead of extremely rare.
I think it's a bit hard to say that this is definitively true: people have always been interested in running linear algebra on computers. In the absence of NVIDIA some other company would likely have found a different industry and sold linear algebra processing hardware to them!
It's pretty interesting that consumer GPUs started to really be a thing in the early 90s and the first Bitcoin GPU miner appeared around 2011. That's only 20 years. That caused a GPU and ASIC gold rush. The major breakthroughs around LLMs started to snowball in the academic scene right around that time. It's been a crazy and relatively quick ride in the grand scheme of things. Even this silicon shortage will pass and we'll look back on this time as quaint.
Of course you are right, but in addition they wouldn't have even made them if GPUs hadn't made ML on CPU so relatively incapable. Competition drives a lot of these decisions, not just raw performance.
I'm not missing the point. If you recall your computer architecture class, there are many vector processing architectures out there. Long before there was Nvidia, the world's largest and most expensive computers were vector processors. It's inaccurate to say "gaming built SIMD".
You are missing the point - it's an economic point. Very little R&D was put into said processors. The scale wasn't there. The software stack wasn't there (because the scale wasn't there).
No one is suggesting gaming chips were the first time someone thought of such an architecture or built a chip with it. They are suggesting the gaming industry produced the required scale to actually do all the work which led to that hardware and software being really good, and useful for other purposes. In chip world, scale matters a lot.
The Cray-1, which produced half a billion USD in revenue in today's dollars, at a time when computing was still science fiction, did not demonstrate scale? I just can't take you in good faith because there has never been a time when large scale SIMD computing was not advanced by commercial interests.
In this context scale = enough units/revenue to spread fixed costs.
I'll take your word on lifetime revenue numbers for Cray 1.
So yes, in today's dollars, $500 million of lifetime revenue - maybe 60-70 million per year, today's dollars - is not even close to the scale we are seeing today. Even 10 years ago Nvidia was doing ~$5 billion per year (almost 100x your number) and AMD a few billion (another 60-70x, ish).
Even if you meant $500m in annual (instead of lifetime) revenue, Nvidia was 10x that in 2015. Add AMD's GPU revenue, which was a few billion that year, and it's more like 17x.
That's a large difference in scale: at the low end 17x, and at the high end 170x. Gaming drove that scale. Gaming drove Nvidia to have enough to spend on CUDA. Gaming drove Nvidia to have enough to produce chip designs optimized for other types of workloads. CUDA enabled ML work that wasn't possible before. That drove Google to realize they needed to move away from ML on CPU if they wanted to be competitive.
You don't need any faith, just understand the history and how competition drives behavior.
Even when you build cool things it's respectful not to plant them in HN comments :)
I think the usual solution to this is to talk about cool stuff you've done that is only incidentally relevant to the product you're selling. For example, some detail on how you built a technical system or solved a problem, etc...
Converge towards what though... I think the level of testing/verification you need to have an LLM output a non-trivial feature (e.g. Paxos/anything with concurrency, business logic that isn't just "fetch value from spreadsheet, add to another number and save to the database") is pretty high.
In this new world, why stop there? It would be even better if engineers were also medical doctors and held multiple doctorate degrees in mathematics and physics and also were rockstar sales people.
Not at all. I don't even know why someone would be incentivized by promoting Nvidia outside of holding large amounts of stock. Although, I did stick my neck out suggesting we buy A6000s after the Apple M series didn't work. To 0 people's surprise, the 2xA6000s did work.
Go has a critical mass that Swift clearly doesn't (i.e. there are many, many companies who have net profits of >$1bn and write most of their server software in Go).
Additionally Google isn't selling Go as a product in the same way as Apple does Swift (and where Google does publish public Go APIs it also tends to use them in the same way as their users do, so the interests are more aligned)...
> Additionally Google isn't selling Go as a product in the same way as Apple does Swift
Hmm, Apple isn't selling Swift as a product either; it's literally what they needed for their own platform, much like how GOOG needed Go for their server works.
I suspect that Mozilla being the primary developer and sponsor for many years actually meant that compatibility with all major platforms was prioritised; Mozilla obviously cares about stuff working on Windows, and runs lots of builds on Windows + I imagine a number of Firefox developers own (if not daily-drive) a Windows machine for testing Windows-specific stuff!
I call out Windows because I think generally software people go for Mac > Linux > Windows (although Mac > Linux may be slowly changing due to liquid glass).
Is Liquid Glass really that bad? I left Mac years ago due to other annoyances. It was my daily driver for a decade and change, but I couldn't get used to the iOSification and the dependence on Apple cloud services for most new features. When I started with Mac OS X Jaguar it was just a really good commercial UNIX. It got even better with Tiger and Leopard.
But in the later years I spent every release looking at new fancy features I couldn't use because I don't use Apple exclusively (and I don't use iOS at all, too closed for me). So almost no features that appealed to me, while some part of the workflow I did use usually broke.
While I did hate the 'flat' redesign after Mavericks that on its own was not really a deal-breaker though. Just an annoyance.
I'm kinda surprised Liquid Glass is so bad that people actually leave over it. Or is it more like the last drop?
No, but every release of MacOS has a noisy minority declaring it, or some features of it, as the end of Macs. Some people will genuinely hate it in the way that nothing can be universally loved, some people will abandon Macs over it, most people don't feel strongly about it at all.
Maybe there's some people out there that love it, even.
I can barely tell the difference between the Mac I use that's been upgraded, and the Mac that hasn't due to its age, because I'm not spending my time at the computers staring at the decor. The contents of the application windows is the same.
I don’t like it, but I think the claims of mass exodus are unlikely.
It feels a lot like the situation when Reddit started charging for their API: Everywhere you looked you could find claims that it was the end of Reddit, but in the end it was just a vocal minority. Reddit’s traffic patterns didn’t decline at all.
Liquid Glass really is that bad. Not because the visual design is especially bad (not my cup of tea but it's okay); but because all of macOS is now incredibly janky. Even Spotlight is a janky mess now with lots of broken animations.
It's unfinished. For example, the more rounded windows would require that scrollbars and other widgets be more inset, and things like that. The system doesn't seem to handle this automatically, so many apps look broken, even Apple's first-party ones.
> If the median UK salary is >£35,000 I really wonder how you arrive at the conclusion that missing a flight will set you back "years or decades"...
OK, now take that figure and deduct tax, housing, food, utilities and so on - how much of it do you think is disposable/saveable? Then take the typical cost of a last-minute replacement flight and compare those two numbers.