Yokolos's comments | Hacker News

You clearly haven't met a lot of your average PC or phone user then. Most people don't care about getting the newest and best thing. If a thing still works, they'll use it until it doesn't anymore, however long that is. You have no idea the kinds of PCs I saw people using when I worked as a technician. People just don't have an interest in getting new tech unless they're forced to, because they largely aren't interested in tech. They're interested in document processing, watching videos, listening to music and dealing with their pictures. And they don't care how old the device is they're doing it on.

In addition, they don't want to spend money on it. They'd rather spend money on things they actually care about. Festivals, clubs, vacations, a new TV, a car, restaurants, whatever. Your average non-tech person is happy if they don't have to spend anything on gadgets for 10 years.


My mum was still happily on some 8-year-old iPhone, I'm not even sure which one it was, and then got really annoyed that she had to upgrade just because her banking apps stopped updating and wouldn't log in anymore. It's just pure and complete e-waste.


Somebody I know had asthma while she lived in Jakarta. It went away when she moved to Europe. I really liked Jakarta, but the air quality is one of the reasons why I won't go back again.


At the very least, you can usually still get the data off of them. Most SSDs I've encountered with defects failed catastrophically, rendering the data completely inaccessible.


Any given TBW/DWPD values are irrelevant for unpowered data retention. AFAIK, nobody gives retention values in their datasheets, and I'm wondering where their numbers are from, because I've never seen anything official. At this point I'd need to be convinced that the manufacturers even know themselves internally, because it's never been mentioned by them and it seems to be outside the intended use cases for SSDs.


Example: https://www.ssd.group/wp-content/uploads/2022/07/PM1733-25-S...

> Component Design Life 5 years

> TBW 14 PB for 7.68 TB drives

> Data Retention 3 months

And then 2.7.3 explains that this number applies for 40 °C ambient, not the operating/non-operating range (up to 85 °C).
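
For what it's worth, the TBW and design-life figures quoted above work out to roughly one drive write per day. A quick back-of-the-envelope check (my own arithmetic in Python, not taken from the datasheet):

  # Rough sanity check of the endurance figures quoted above:
  # DWPD = TBW / (capacity * rated service life in days)
  tbw_tb = 14_000          # 14 PB of rated writes, expressed in TB
  capacity_tb = 7.68       # 7.68 TB drive
  life_days = 5 * 365      # 5-year component design life

  dwpd = tbw_tb / (capacity_tb * life_days)
  print(f"~{dwpd:.2f} drive writes per day")   # ~1.00 DWPD

The 3-month retention figure is listed separately in the same document, independent of these endurance numbers.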


That's literally just the JEDEC spec.


Point being?


Give it a month or two and it might be cheaper to get the bus.


I bought 32GB of DDR5 SODIMM last year for 108€ on Amazon. The exact same product that I bought back then is now 232€ on Amazon. I don't like this ride.


Yeah, similar for me. I bought 64 gigs of DDR5 laptop RAM about a year ago; it ended up costing about $190. Now the exact same listing is going for $470. https://a.co/d/fJH1GkW

I guess I'm glad I bought when I did; didn't realize how good of a deal I was getting.


It's crazy how good the writing in that game is and how applicable it is today. It makes me wonder who wrote it and what their influences were.


According to Wikipedia, the writing was (mostly?) by Brian Reynolds [1]. There's an interview [2] which is linked to by the Wikipedia page about the game [3].

[1]: https://en.wikipedia.org/wiki/Brian_Reynolds_(game_designer)

[2]: https://web.archive.org/web/20050221102608/http://pc.ign.com...

[3]: https://en.wikipedia.org/wiki/Sid_Meier%27s_Alpha_Centauri#I...


Seems like a distinction without a difference to me?

> And aside from a license renewal snafu in 1980, the device made no waves until its existence was shared with the local newspaper—it wasn’t a secret, just unpublicized.


I think "secret nuclear device" and "license renewal" are kind of conceptually incongruous, even if only at a surface level.


The license renewal has nothing to do with anything. It's not related to the incident where its existence was revealed. This is about "secret" vs "unpublicized", not "secret" vs "license renewal".


In this context "secret" implies they didn't tell the government. Merely not publicising an internal project is totally normal and doesn't warrant "secret project!!"


I guess that depends on how hysterically you read the word secret (including projecting hysterics on others using it). But at work we have a lot of secret projects. Basically everything is given a project code name until it’s public, and if you work in R&D you are told not to discuss your work on such projects, either outside the company with friends or inside the company with people who don’t work in R&D. That is the closest to the definition of secret I can imagine. And it sounds like this nuclear lab was in a similar category.

If someone freaks out about it, it’s because they think you’re abusing normal, run of the mill product development secrecy, whether to develop a product that shouldn’t exist or to hide a practice that is never intended to be public and is just called secret to avoid scrutiny from an interested public (who, in this hypothetical scenario, feel that they have a right to be interested — think research into dangerous pathogens next to an unprotected public aquifer).


Is there a penalty for discussing the secret projects? Like if your manager/director/VP knew you were talking specifics without being authorized, what would happen?

It sounds like there is no penalty to the nuclear labs except, if you blab to the wrong person, it’s going to stir up trouble.


I’ve never heard of anything more happening than being reminded not to do that (pretty much the only time it happens is when someone is talking with product support and lets slip that a feature or product they’re working on will solve a complaint about an existing product). I’m sure you’d be fired if it was thought you did it intentionally to spread knowledge of the secret, though.

I guess in this case the question comes down (for me) to whether employees at this lab were asked by their managers not to tell friends and acquaintances what they worked on. Even if not with an explicit threat of harm, asking someone not to tell something is pretty much exactly what asking them to keep it a secret means.


Yeah, but is this internal project one where we kidnap homeless people and torture them in the name of science, or is it because we spent $50,000 to make a new logo? Some secrets are meant to be kept. Others are meant to be blown wide open. Others... others just are, and nobody needs to know. Posting that a particular woman's a slut is a shitty thing to do on Facebook, but if one of my male friends is feeling extra lonely and ready to end it all, there's a date or two I could set him up on.


I wouldn't be so sure. I've seen analyses making the case that this new phase is unlike previous cycles and DRAM makers will be far less willing to invest significantly in new capacity, especially into consumer DRAM over more enterprise DRAM or HBM (and even there there's still a significant risk of the AI bubble popping). The shortage could last a decade. Right now DRAM makers are benefiting to an extreme degree since they can basically demand any price for what they're making now, reducing the incentive even more.

https://www.tomshardware.com/pc-components/storage/perfect-s...


The most likely direct response is not new capacity, it's older capacity running at full tilt (given the now higher margins) to produce more mature technology with lower requirements on fabrication (such as DDR3/4, older Flash storage tech, etc.) and soak up demand for these. DDR5/GDDR/HBM/etc. prices will still be quite high, but alternatives will be available.


> produce more mature technology ... DDR3/4

...except the current peak in demand is mostly driven by the build-out of AI capacity.

Both inference and training workloads are often bottlenecked on RAM speed, and trying to shoehorn older/slower memory tech in there would require a non-trivial amount of R&D to go into widening the memory bus on CPUs/GPUs/NPUs, which is unlikely to happen - those are in very high demand already.


Even if AI stuff really does need DDR5, there must be lots of other applications that would ideally use DDR5 but can make do with DDR3/4 if there's a big difference in price.


I mean, AI is currently hyped, so the most natural and logical assumption is that AI drives these prices up primarily. We need compensation from those AI corporations. They cost us too much.


It is still an assumption.


> The shortage could last a decade.

Do we really think the current level of AI-driven data center demand will continue indefinitely? The world only needs so many pictures of bears wearing suits.


The pop culture perception of AI just being image and text generators is incorrect. AI is many things, they all need tons of RAM. Google is rolling out self-driving taxis in more and more cities for instance.


Congrats on engaging with the facetious part of my comment, but I think the question still stands: do you think the current level of AI-driven data center demand will continue indefinitely?

I feel like the question of how many computers are needed to steer a bunch of self-driving taxis probably has an answer, and I bet it's not anything even remotely close to what would justify a decade's worth of maximum investment in silicon for AI data centers, which is what we were talking about.


Data center AI is also completely uninteresting/non-useful for self-driving taxis, or any other self-driving vehicle.


Do you know comparatively how much GPU time training the models which run Waymo costs compared to Gemini? I'm genuinely curious, my assumption would be that Google has devoted at least as much GPU time in their datacenters to training Waymo models as they have Gemini models. But if it's significantly more efficient on training (or inference?) that's very interesting.


My note is specifically about operating them. For training the models, it certainly can help.


A decade is far from indefinitely.


AI is needed to restart feudalism?


No, the 10% best-case return on AI won't make it. The bubble is trying to replace all human labor, which is why it is a bubble in the first place. No one is being honest that AGI is not possible with this kind of tech. And scale won't get them there.


There's not a difference between "consumer" DRAM and "enterprise" DRAM at the silicon level, they're cut from the same wafers at the end of the day.


Doesn't the same factory produce enterprise (i.e. ECC) and consumer (non-ECC) DRAM?

If there is high demand for the former due to AI, they can increase its production to generate higher profits. This cuts the production capacity of consumer DRAM and leads to higher prices in that segment too. Simple supply & demand at work.


Conceptually, you can think of it as "RAID for memory".

A consumer DDR5 module has two 32-bit-wide buses, each of which is implemented using, for example, 4 chips that each handle 8 bits in parallel - just like RAID 0.

An enterprise DDR5 module has a 40-bit-wide bus implemented using 5 chips. The memory controller uses those 8 additional bits to store the parity calculated over the 32 regular bits - so just like RAID 4 (or RAID 5, I haven't dug into the details too deeply). The whole magic happens inside the controller, the DRAM chip itself isn't even aware of it.
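
As a toy illustration of the RAID-4 analogy, here's a minimal sketch in Python (hypothetical, plain XOR parity only; real DDR5 ECC controllers use a SECDED-style code, not simple XOR):

  # Toy model: 4 data bytes (one per x8 chip) plus 1 parity byte on the 5th chip.
  def write_word(data: bytes) -> bytes:
      assert len(data) == 4
      parity = 0
      for b in data:
          parity ^= b                 # XOR of all data bytes, like RAID 4
      return data + bytes([parity])

  def check_word(stored: bytes) -> bool:
      data, parity = stored[:4], stored[4]
      recomputed = 0
      for b in data:
          recomputed ^= b
      return recomputed == parity     # mismatch means the word was corrupted

  word = write_word(bytes([0xDE, 0xAD, 0xBE, 0xEF]))
  print(check_word(word))                              # True
  print(check_word(bytes([word[0] ^ 1]) + word[1:]))   # False: flipped bit detected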

Given the way the industry works (some companies do DRAM chip production, it is sold as a commodity, and others buy a bunch of chips to turn them into RAM modules) the factory producing the chips does not even know if the chips they have just produced will be turned into ECC or non-ECC. The prices rise and fall as one because it is functionally a single market.


That makes sense, thank you.


At the silicon level, it is the same.

Each memory DIMM/stick is made up of multiple DRAM chips. ECC DIMMs have an extra chip for storing the error-correcting parity data.

The bottleneck is with the chips and not the DIMMs. Chip fabs are expensive and time consuming, while making PCBs and placing components down onto them is much easier to get into.


Got it now, thanks!


Yes, but if new capacity is also redirected to be able to be sold as enterprise memory, we won't see better supply for consumer memory. As long as margins are better and demand is higher for enterprise memory, the average consumer is screwed.


Does it matter that AI hardware has such a shorter shelf life/faster upgrade cycle? Meaning we may see the ram chips resold/thrown back into the used market quicker than before?


Is there still a difference? I have DDR5 registered ECC in my computer.


I mean, the only difference we care about is how much of it is actual RAM vs HBM (to be used on GPUs) and how much it costs. We want it to be cheap. So yes, there's a difference if we're competing with enterprise customers for supply.

I don't really understand why every little thing needs to be spelled out. It doesn't matter. We're not getting the RAM at an affordable price anymore.


Anytime somebody is making a prediction for the tech industry involving a decade timespan I pull out my Fedora of Doubt and tip my cap to m’lady.


Maybe we'll get ECC by default in everything with this?


A LOT of businesses learned during Covid they can make more money by permanently reducing output and jacking prices. We might be witnessing the end times of economies of scale.


The idea is someone else comes in that's happy to eat their lunch by undercutting them. Unfortunately, we're probably limited to China doing that at this point as a lot of the existing players have literally been fined for price fixing before.

https://en.wikipedia.org/wiki/DRAM_price_fixing_scandal


It seems more likely that someone else comes in and either colludes with the people who are screwing us to get a piece of the action, or gets bought out by one of the big companies who started all this. Since on the rare occasions companies get caught they only get a weak slap on the wrist, paying a fraction of what they made in profits (basically just the US demanding its cut), I don't have much faith things will improve any time soon.

Even China has no reason to reduce prices much for memory sold to the US when they know we have no choice but to buy at the prices already set by the cartel. I expect that if China does start making memory they'll sell it cheap within China and export it at much higher prices. Maybe we'll get a black market for cheap DRAM smuggled out of China though.


I think in part it is a system-level response to the widespread just-in-time approach of those businesses' clients. A just-in-time client is very "flexible" on price when supply is squeezed. After that back and forth, I think we'll see a return to some degree of supply buffering (warehousing) to dampen the supply-level/price shocks in the pipelines.


I thought that, too, but then the Nexperia shitstorm hit, and it was as if the industry had learned nothing at all from the COVID shortages.


In that case it's far simpler - even IF they wanted to meet the demand, building more capacity is hideously expensive and takes years.

So, it would happen even with the best intentions and no conspiracies. The AI boom already hiked GPU prices; memory was next in line.


It's not disabled in the sense many people are thinking. The codecs just aren't installed by default. The hardware is present and still functional. You just have to use software that directly supports HEVC or buy your own HEVC license on the Microsoft Store for $1 to get system-wide hardware-accelerated HEVC codecs.


The hardware acceleration is disabled in the driver. Even using VLC, you won't have acceleration for HEVC.


That seems like the opposite of what the quoted Reddit post says:

>those with newer machines needed to either have the HEVC codec from the Microsoft Store removed entirely from [Microsoft Media Foundation] or have hardware acceleration disabled

From this it sounds like it's been disabled at a lower level, but Windows still expects it to be there and so fails to decode streams unless hwaccel is disabled.


Even on Linux?


Linux doesn't use the same drivers as Windows.


Is it confirmed that it is being disabled through drivers?


It doesn't work in Windows, but does work in Linux.


I don't understand why people downvote questions like this rather than just answer the question. It's a perfectly reasonable question imo given that it's not clear how this feature is being disabled. It appears that most of this is based on reddit speculation and the OEMs don't provide a definitive answer.

Meta: recently it seems like the community has been way too loose with the downvote button, but I'm not sure if I'm just noticing it more because it's getting on my nerves, or if there has actually been a change in behavior.


There has been a change in behavior in the past few years. In fact, it used to be that you had to submit a certain number of threads before you could comment (and thus vote) as an HN member. This actually kept the community on the more intelligent, factual, and serious side. Now it's not so serious.

This used to be the only place that I could visit to get away from Reddit behavior. It seems like the more obscure a social gathering is, the less Eternal September it suffers.


> Meta: recently it seems like the community has been way too loose with the downvote button, but I'm not sure if I'm just noticing it more because it's getting on my nerves, or if there has actually been a change in behavior.

The term "orange reddit" feels more and more like reality as time goes on.


Sure it's not

> "a semi-noob illusion, as old as the hills."?

:)

https://news.ycombinator.com/newsguidelines.html


Why does anyone care about downvotes? Is there somewhere I can cash in my karma points here for something actually valuable?

