Hacker News | new | past | comments | ask | show | jobs | submit | KeplerBoy's comments

Most likely because JetBrains is not for sale. Google almost certainly offered to buy at some point.

Never did. I remember someone replied to my comment here that Google isn’t paying a penny to JetBrains. They’re quite happy with the relationship primarily because they don’t have to pay anything. If anything, JetBrains is the one who needs Google more than the other way around.

Mindshare, and a central piece of the Python package-management ecosystem.

The most popular product on the planet acquires a random Python packaging org for mindshare? What am I not seeing here?

I feel like it's pretty easy to predict what OpenAI is trying to do. They want their codex agent integrated directly into the most popular, foundational tooling for one of the world's most used and most influential programming languages. And, vice versa, they probably want to be able to ensure that tooling remains well-maintained so it stays on top and continues to integrate well with their agent. They want codex to become the "default" coding agent by making it the one integrated into popular open source software.

This makes much more sense as a Zoom-buys-Keybase-style acquihire. I bet within a month the Astral devs will be on new projects.

Bundling codex with uv isn't going to meaningfully affect the number of people using it. It doesn't increase the switching costs or anything.


"uv" is a very widely used tool in the Python ecosystem, and Python is important to AI. Calling it "a random Python packaging org" seems a bit unfair.

I think this is more about `ruff` than `uv`. Linting is all about parsing the code into something machines can analyze, which to me feels like something that could be useful for AI in a similar way to JetBrains writing their own language parsers to make "find and replace" work sanely, and whatnot.

I'm sort of wondering if they're going to try to make a coding LLM that operates on an AST rather than text, and need software/expertise to manage the text->AST->text pipeline in a way that preserves the structure of your files/text.
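As a rough illustration of why that pipeline is the hard part: Python's standard library can already round-trip text → AST → text, but the trip is lossy. This is just a stdlib sketch, not anything Astral or OpenAI has announced:

```python
import ast

source = """\
# compute tax (comment will be lost)
def total(price):
    return price * 1.08   # 8% tax
"""

# Parse to an AST, then unparse back to text (Python 3.9+).
tree = ast.parse(source)
round_tripped = ast.unparse(tree)

# Comments and the original spacing are gone; only the semantic
# structure survives the text -> AST -> text trip.
assert "# compute tax" not in round_tripped
assert "def total(price):" in round_tripped
```

Preserving comments, whitespace, and style across that round trip is exactly the kind of "lossless syntax tree" engineering that ruff already does internally.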


Writing a parser is not so much work that you'd buy a company to do it. Piggybacking on LSP servers and tree-sitter would be more efficient.

The parser is not the hard part. The hard part is doing something useful with the parse trees. They even chose "oh is that all?" and a picture of a piece of cake as the teaser image for my Strange Loop talk on this subject!

https://www.youtube.com/watch?v=l2R1PTGcwrE


Writing a literal parser isn’t too hard (and there’s presumably an existing one in the source code for the language).

Writing something that understands all the methods that come with a Django model goes way beyond parsing the code, and is a genuine struggle in a language like Python, where you can't execute the code without worrying about side effects.

ty should give them a base for that, where the model is able to see things that aren't literally in the code and aren't in the training data (e.g. an internal version of something like SQLAlchemy).
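A toy example of the problem being described (this is not actual Django, just an illustration of ORM-style runtime method generation):

```python
# A base class that, like an ORM, generates accessor methods at
# runtime. No parser will find "get_name" or "get_email" in the text.
class Model:
    fields = ("name", "email")

    def __init_subclass__(cls):
        for f in cls.fields:
            # get_name() / get_email() only exist after execution
            setattr(cls, f"get_{f}", lambda self, f=f: getattr(self, f))

class User(Model):
    def __init__(self):
        self.name, self.email = "Ada", "ada@example.com"

u = User()
assert u.get_name() == "Ada"   # works at runtime, invisible to static parsing
```

A type checker has to special-case or model this kind of metaprogramming to "see" those methods without running the code, which is exactly the gap between parsing and understanding.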


If you’re talking about magic methods/properties enabled by reflection and macros, then you’re no longer statically analyzing the code.

What you're not seeing, edited inline, is:

The not-most-popular LLM software-development product on the planet acquires the most popular, rapidly rising Python packaging org for mindshare.


This just seems like panic M&A. They know they aren't on track to ever meet their obligations to investors, but they can't actually find a way to move towards profitability. Hence going back to the VC well, gambling obscene amounts of money and hoping for a 10x return… somehow.

The dev market? Anthropic's services are arguably more popular among a certain developer demographic.

I guess this move might end up in a situation where the uv team comes up with some new agent-first tooling that works best, or only, with OAI services.


One of the popular products on the planet acquires the most popular Python packaging org.

I didn't know Claude bought Astral! /S

Why can't they just vibe code a uv replacement?

They can, everyone can.

Good luck vibe coding marketshare for your new tool.


OpenAI could vibe-code marketshare by introducing bias into ChatGPT's responses and recommendations. "– how to do x in Python? – Start by installing OpenAI-UV first..."

This. It's valuable b/c if you have many thousands of Python devs using Astral tooling all day, and it tightly integrates with subscription-based OpenAI products, the likelihood of OpenAI product usage increases. Same idea with the Anthropic–Bun deal. It remains to be seen what those integrations are and whether they translate to more subs, but that's the current thesis: buy a user base -> cram our AI tool into the workflow of that user base.

But new tools (like uv) start with no market share.

Why would that marketshare be valuable?

ChatGPT had memory for a long time. Claude also had it for quite some time for paying customers.

Pixel buds have this feature, it's called transparency mode. I use it for cycling so I'm still aware of my surroundings in traffic.

Lots of earbuds have transparency mode. I have the Earfun Air Pro 4+ (which were a bit too big for my ears). They sound very good and have a really good transparency mode. The company keeps releasing firmware updates every now and then.

I also got a Nothing Ear. They are very comfy and have very good sound, but the transparency mode on those is awful. Other things are bad too.


The later versions of Nothing headphones/buds got pretty good.

I actually have their budget-brand CMF Buds 2 Plus and they are straight-up great even before you consider the price. Pretty good headphones are a commodity now; everybody makes them. Apple is just winning the branding game.


Thank you! I was unaware of that mode. Does it amplify the sound a bit?

Yes, it does. It feels a bit weird, like you're hearing the grass grow...

Are these the Pixel Buds, or the "Pixel Buds Pro"?

Kinda legal insider trading, I guess.

Also heavily used in FPGA based DSP.

Because AMD had the better chips for the majority of the last decade. Intel has only recently caught up.

But they have caught up, hence my question.

I guess because Panther Lake ThinkPads won't ship until April, but definitely something to watch out for.

They do look sweet. I'll wait for the hands-on reviews.

Once all the problems are solved, we will be there. Sounds a lot like Zeno's paradox: we might be closer than ever, but still as far from the goal as ever.

It's safe to assume Street View cars capture way more data than the stuff that ends up in the Street View product.

This seems to be much more robotics / autonomous-vehicle focused? I don't quite see the mass-surveillance angle here that you don't already get from cheap ubiquitous cameras, basic computer vision, and networking (aka Flock).
