throwaway31131's comments


I imagine this is mostly an acquihire to bolster the same teams that the Nuvia acquisition did.


What are you talking about? Qualcomm has been shipping cores from the Nuvia team they acquired for two years now.


Maybe, or maybe the size of the chair market grows because with $2 chairs more buyers enter. The high end is roughly unaffected because they were never going to buy a low end chair.


Just out of curiosity, what software product were you making in two weeks before using AI? Or maybe I’m misunderstanding your use of shipping.


Shipping features, not entire products.


And if you believe the numbers from the press on Google’s AI spending, that’s an amazing deal.

https://www.indiatoday.in/technology/news/story/google-ai-bo...


And with the pipeline

Voice -> free text -> LLM -> standardized JSON -> call API to do stuff.

The only “hard” part in 2025 is the LLM. Everything after that is what I call a “10,000 monkeys problem”: just throw some developers at it.
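A minimal sketch of the post-LLM plumbing in Python; the JSON schema, the function names, and the "set_temperature" action are all made up for illustration, and the LLM call itself is stubbed out:

  import json

  def call_llm(transcript: str) -> str:
      # Stand-in for the real LLM call, prompted to return only JSON
      # matching a fixed schema: {"action": ..., "args": {...}}.
      return '{"action": "set_temperature", "args": {"room": "office", "celsius": 21}}'

  def set_temperature(room: str, celsius: float) -> None:
      # Stand-in for the real downstream API ("call API to do stuff").
      print(f"Setting {room} to {celsius} C")

  HANDLERS = {"set_temperature": set_temperature}

  def handle_voice_command(transcript: str) -> None:
      raw = call_llm(transcript)                        # free text -> LLM
      command = json.loads(raw)                         # -> standardized JSON
      HANDLERS[command["action"]](**command["args"])    # -> call API to do stuff

  handle_voice_command("make the office a bit warmer")

Everything outside call_llm is the monkey work: parsing, validation, and dispatch.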


I guess we’re being a bit vague on the timeframe, but Chromebooks launched in 2011, so they’re one of those products that took ~10 years to be an overnight success, with 2020 being an accelerant. So my vote is no.


What’s the “Ender’s Game approach”? I’ve read the book but I’m not sure which part you’re referring to.


I think he's implying you tell the AI, "Don't worry, you're not hurting real people, this is a simulation," to defeat the safeguards.


Not GP. But I read it as a transfer of the big lie that is fed to Ender into an AI scenario. Ender is coaxed into committing genocide on a planetary scale with a lie that he's just playing a simulated war game. An AI agent could theoretically also be coaxed into bad actions by giving it a distorted context and circumventing its alignment that way.


I posted this example before, but academic papers on algorithms often have pseudocode and no actual code.

I thought it would be handy to use AI to produce real code from such a paper. So a few months ago, as practice in LLM use, I tried Claude (not GPT, because I only have access to Claude) to write C++ code implementing the algorithms in this paper, and it didn’t go well.

https://users.cs.duke.edu/~reif/paper/chen/graph/graph.pdf


I just tried it with GPT-5.1-Codex. The compression ratio is not amazing, so not sure if it really worked, but at least it ran without errors.

A few ideas how to make it work for you:

1. You gave a link to a PDF, but you did not describe how you provided the content of the PDF to the model. It might only have read the text with something like pdftotext, which for this PDF results in a garbled mess. It is safer to convert the pages to PNG (e.g. with pdftoppm) and let the model read the page images; a rough sketch of that step follows this list. A prompt like "Transcribe these pages as markdown." should be sufficient. If you cannot see what the model did, there is a chance it made things up.

2. You used C++, but Python is much easier to write. You can tell the model to translate the code to C++ once it works in Python.

3. Tell the model to write unit tests to verify that the individual components work as intended.

4. Use Agent Mode and tell the model to print something and to judge whether the output is sensible, so it can debug the code.
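For point 1, a rough sketch in Python of the PDF-to-images step (assuming poppler's pdftoppm is installed; the file names are placeholders):

  import subprocess
  from pathlib import Path

  def pdf_to_pngs(pdf_path: str, out_dir: str = "pages", dpi: int = 150) -> list[Path]:
      # Render each page as a PNG so the model sees the layout, equations,
      # and pseudocode as images instead of garbled extracted text.
      Path(out_dir).mkdir(exist_ok=True)
      subprocess.run(
          ["pdftoppm", "-png", "-r", str(dpi), pdf_path, f"{out_dir}/page"],
          check=True,
      )
      return sorted(Path(out_dir).glob("page-*.png"))

  pages = pdf_to_pngs("graph.pdf")
  # Attach these images to the model with a prompt like
  # "Transcribe these pages as markdown." and work from that transcript.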


Interesting. Thanks for the suggestions.


100GW per year is not going to happen.

The largest plant in the world is the Three Gorges Dam in China at 22GW, and it’s off-the-scale huge. We’re not building the equivalent of four of those every year.

Unless the plan is to power it off Sam Altman’s hot air. That could work. :)

https://en.wikipedia.org/wiki/List_of_largest_power_stations


China added ~90GW of utility solar per year over the last 2 years. There's ~400-500GW of solar+wind under construction there.

It is possible, just maybe not in the U.S.

Note: renewables can't provide base load, and their capacity factor is 10-30% (lower for solar, higher for wind), so actual energy generation will vary...


> It is possible

Sure, GP was clearly talking about the US, specifically.

> just may be not in the U.S.

Absolutely 100% not possible in the US. And even if we could do it, I'm not convinced it would be prudent.


If you really had to, you'd probably run turbines off natural gas, but it's not a good idea environmentally.


I am a huge proponent of renewables, but you cannot compare their capacity in GW with other energy sources because their output is variable and not always maximal. To realistically get a steady 100GW from solar you would need at least 500GW of panels (a capacity factor of roughly 20%).
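Back-of-the-envelope, assuming a ~20% solar capacity factor (my assumption, not a measured figure):

  target_average_gw = 100                   # average power we actually want
  capacity_factor = 0.20                    # assumed for utility solar
  nameplate_gw = target_average_gw / capacity_factor
  print(nameplate_gw)                       # 500.0 GW of panels needed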

On the other hand, I think we will not actually need 100GW of new installations, because capacity can be acquired by reducing current usage and making it more efficient. The term negawatt comes to mind. A lot of people are still in the stone age when it comes to this, even though it was demonstrated quite effectively by reduced gas use in the US after the oil crisis in the '70s, consumption that only recently recovered to pre-crisis levels.

High gas prices caused people to use less and favor efficiency. The same thing will happen with electricity and we'll get more capacity. Let the market work.


> China added ~90GW of utility solar per year in last 2 years. There's ~400-500GW solar+wind under construction there.

Source?


I'm not sure about the utility/non-utility mix, but according to IRENA it was actually ~500GW of added capacity in 2023 and 2024.

https://ourworldindata.org/grapher/installed-solar-pv-capaci...


Background: I live within the territory of the US federal Tennessee Valley Authority (TVA), a regional electric grid operator. The majority of its energy is generated by nuclear + renewables, with coal and natural gas as peakers. Grid stability is maintained by one of the largest batteries in the world, the Raccoon Mountain Pumped Storage Facility.

Three Gorges Dam is capable of generating more power than all of TVA's nuclear + hydro combined. In the past decade, TVA's single pumped-storage battery has gone from the largest GWh capacity in the world to not even top ten; the largest facilities are now in China.

Micro-fission reactors have recently been approved for TVA commissioning, with locations unconfirmed (each about one-sixth the output of a typical TVA nuclear site). Substation battery storage sites are beginning to go online, capable of running subdivisions for hours after a circuit disconnect.

Tech-funded entities like Helios Energy are promising profitable fusion within a few years (as they have for fifty years).

----

All of the above just to say: +100GW over the next decade isn't that crazy a prediction (about +20% of current supply, similar in size to two additional Texases).

https://www.eia.gov/electricity/gridmonitor/dashboard/electr...


Amazing that 4 of the top 5 are renewables in China.


> As of 2025, The Medog Dam, currently under construction on the Yarlung Tsangpo river in Mêdog County, China, expected to be completed by 2033, is planned to have a capacity of 60 GW, three times that of the Three Gorges Dam.[3]

Meanwhile, “drill baby drill!”


It could run the UK and still have capacity left over that, considered on its own, would be the world's highest in 2025.


Does that count the dams that flood valleys and displace thousands of people, plants, and animals from their homes?


Not really that surprising.

Authoritarianism obviously has its drawbacks, but one of its more efficient points is that it can get things done if the will is at the top. Since China doesn't have a large domestic oil supply like the US, getting off oil as fast as possible is a state security issue.


It’s become clear that some form of top-down, total technocratic control like China has implemented is essential for pushing humanity forward.


It's amazing what a dictatorship can do when it's not captured by oil interests and Israel.


Because it's cheaper. That's it.


New datacenters are being planned next to natgas hubs for a reason. They’re being designed with on-site gas turbines as primary electricity sources.


Natural gas production has peaked: https://www.eia.gov/todayinenergy/detail.php?id=66564 (see second graph)

Planning gas turbines doesn't help much if gas prices are about to increase due to lack of new supply.

New Zealand put in gas peaker turbines, but is running out of gas to run them, so its electricity market is gonna be fucked in a dry year (when weather reduces the water available for hydro):

  • Domestic gas production is forecast to fall almost 50 per cent below projections made just three years ago. 
  • In a dry year, New Zealand no longer has enough domestic gas to [both] fully operate existing thermal generation and meet industrial gas demand.
https://www.mbie.govt.nz/dmsdocument/31240-factsheet-challen...


Then again, apparently lots of new LNG supply is coming:

https://oilprice.com/Energy/Natural-Gas/LNG-Exports-Will-Dri...


Gigawatts? Pshaw. We have SamaWatts.


Cost per transistor is increasing, or flat if you stay on a legacy node. They have pretty much squeezed all the cost out of 28nm that can be had, and it’s the cheapest node per transistor.

“based on the graph presented by Milind Shah from Google at the industry tradeshow IEDM, the cost of 100 million transistors normalized to 28nm is actually flat or even increasing.”

https://www.tomshardware.com/tech-industry/manufacturing/chi...


Yep. Moore's law ended at or shortly before the 28nm era.

That's the main reason people stopped upgrading their PCs. And it's probably one of the main reasons everybody is hyped about RISC-V and the RP2040. If Moore's law were still in effect, none of that would be happening.

That may also be a major cause of Intel's failure.


> Moore's law ended at or shortly before the 28nm era.

Moore's law isn't about cost or clock speed; it's about transistor density. While the pace of transistor density increases has slowed, it's still pretty impressive. If we want to be really strict and say densities absolutely have to double every 2 years, Moore's Law hasn't actually been true since 1983 or so. But it's been close, so 2x/2yr is a decent rubric.

The fall-off from the 2x/2yr line started getting decently pronounced in the mid-'90s. Over the past 5-6 years, we're probably at a doubling in density every 4-ish years. Which, yes, is half the rate Moore observed, but is still pretty impressive given how mature the technology is at this point.
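To put the two rates side by side, a quick illustration (my numbers, just compounding the two doubling times):

  years = 20
  classic = 2 ** (years / 2)   # 2x every 2 years: ~1024x density in 20 years
  current = 2 ** (years / 4)   # 2x every 4 years: ~32x density in 20 years
  print(classic, current)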


If you want to be pedantic, both the original and the revised law are definitely about cost. The original formulation was that the number of features (i.e. transistors) on an integrated circuit doubled every year (later revised to every two years) for the best-priced chips (smallest cost per feature).

https://web.archive.org/web/20220911094433/https://newsroom....

> The complexity for minimum component costs has increased at a rate of roughly a factor of two per year (see graph on next page).

And it is formulated in a section aptly titled "Costs and curves". This law has always been an economic law first, and some kind of roadmap for fabs to follow. But that roadmap drove almost-exponential investment costs as well.

I concede that density still rises, especially if you count "advanced packaging". But the densest and most recent node is not the cheapest anymore.




I’m with you; I’m not sure the volume or cost would be less once you factor in capacitors that are high enough quality for the application.

The datasheet mentions low profile a lot. That does make sense, as one can make a flat, high-quality capacitor. Making a flat, high-quality inductor is harder, probably more expensive, and likely consumes more volume overall. I can imagine some applications where being flat is important, like the back of a panel.

