digitalPhonix's comments | Hacker News

> Our North Star is ‘1 engineer, 1 month, 1 million lines of code.’

A guiding metric on lines of code. How could this possibly go wrong?


The important metric is how many tests need to be written to trust those lines of code (in one month, and more ludicrously by one engineer).

Maybe the plan is not testing because they think that Rust is "safe" and automatic translation is smarter than a typical Microsoft employee.


> automatic translation is smarter than a typical Microsoft employee.

not really a high bar is it...


Delusional rockstar engineers might really have this kind of opinion of their LLMs and their colleagues.

With less sarcasm, how bad is Microsoft as a workplace? The source of this kind of naively arrogant proclamation can be anything from a single rotten apple, to a critical mass of unchecked idiots in the same business unit, to normal for the company culture, to a top-down mandate.


There’s no info on what the AI does? Or where the AI runs? Or anything really?

Is it just Mozilla testing the waters with the announcement?


Their announcement does not reveal much; perhaps signing up will reveal more. But I am hesitant to do that since I don't even know if I want this feature.

On the other hand, are they even listening to their users or are they just adding AI to everything?


Yeah!

> "I scraped every single restaurant in Greater London"

How hard is that now? I assumed Google was very protective of that data.


I would be interested in how this was done as well. She mentioned she was using a free tier from Google, so maybe the data is not protected.
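
(Purely speculative sketch of how a scrape like this might look on the free tier - I have no idea what she actually used. It assumes the legacy Places "Nearby Search" endpoint; a real scrape would tile Greater London into many grid cells and page through the results. Everything here, including the key placeholder, is made up.)

    /* Speculative sketch: one Nearby Search request over a single grid
       cell in central London, using libcurl (link with -lcurl).
       YOUR_API_KEY is a placeholder. */
    #include <stdio.h>
    #include <curl/curl.h>

    int main(void)
    {
        curl_global_init(CURL_GLOBAL_DEFAULT);
        CURL *curl = curl_easy_init();
        if (!curl)
            return 1;

        const char *url =
            "https://maps.googleapis.com/maps/api/place/nearbysearch/json"
            "?location=51.5074,-0.1278"   /* lat,lng of the cell centre */
            "&radius=1500"                /* metres */
            "&type=restaurant"
            "&key=YOUR_API_KEY";

        curl_easy_setopt(curl, CURLOPT_URL, url);
        CURLcode rc = curl_easy_perform(curl);  /* JSON body goes to stdout */
        if (rc != CURLE_OK)
            fprintf(stderr, "request failed: %s\n", curl_easy_strerror(rc));

        curl_easy_cleanup(curl);
        curl_global_cleanup();
        return rc == CURLE_OK ? 0 : 1;
    }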


> No one (yes, yes, I'm sure like 3 people) want a list of all results, unordered or ordered by something useless like name

That's not what the author was suggesting (or indeed, what they built). They were trying to untangle the positive-feedback bias that showing up first in the rankings creates.

I think there's probably a lot more to untangle, but as a first pass it's super cool!
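
(To make that feedback loop concrete, here's a toy simulation - my own construction, not the article's method: five identical restaurants, exposure that halves with each rank position, and visits that feed straight back into the ranking. Whichever one happens to start on top stays on top.)

    /* Toy rich-get-richer ranking simulation. All restaurants have
       identical "quality"; only position determines the outcome. */
    #include <stdio.h>
    #include <stdlib.h>

    #define N 5          /* restaurants */
    #define ROUNDS 1000  /* search sessions */

    int main(void)
    {
        double score[N] = {0};
        int order[N];
        for (int i = 0; i < N; i++) order[i] = i;

        for (int r = 0; r < ROUNDS; r++) {
            /* Re-rank by accumulated score (simple selection sort). */
            for (int i = 0; i < N; i++)
                for (int j = i + 1; j < N; j++)
                    if (score[order[j]] > score[order[i]]) {
                        int t = order[i]; order[i] = order[j]; order[j] = t;
                    }

            /* Position bias: probability of a visit halves per rank. */
            double p = 0.5;
            for (int i = 0; i < N; i++, p *= 0.5)
                if ((double)rand() / RAND_MAX < p)
                    score[order[i]] += 1.0;  /* visit feeds back into rank */
        }

        for (int i = 0; i < N; i++)
            printf("rank %d: restaurant %d, visits %.0f\n",
                   i + 1, order[i], score[order[i]]);
        return 0;
    }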


It's the feigned surprise and the sort of attitude that Google is doing something malicious, or that it's a subterfuge. Starting with a bolded "Google Maps Is Not a Directory. It’s a Market Maker." and finishing with, e.g.:

> the most important result isn’t which neighbourhood tops the rankings - it’s the realisation that platforms now quietly structure survival in everyday urban markets.

For any service like this, _of course_ ranking is at the core of it. A more honest article could have started there, eg "since you can't display all results, and doing so is useless to everyone, the heart of these products is their ranking algorithm and choices. Let's examine Google's."


A tone of breathless wonder is now the coin of the realm. Quality research and interesting analysis gets the same treatment as everything else, because that's what gets clicks and responses. Dinging an individual article for this is arbitrary and capricious.

Don't hate the player, hate the game. I hate the game too, fwiw.


Still a lie though. If you don't know / aren't familiar with a ranker, the author is priming you through the entire article to believe Google is doing something wrong or malicious by ranking the results, rather than doing the same thing search engines have been doing for 30 years. Whether their ranker is good or bad (and for whom) is a separate question.

Including, of course, that the way many popular chain restaurants got there is by making food a lot of people like.


That's what lesuorac is saying. The SEC found he violated the rules for a publicly traded company... and then could do absolutely nothing to enforce them.


Glass transition temperature I think


> Well it turns out there is one customer who really really hates hybrids, and only wants to use ML-KEM1024 for all their systems. And that customer happens to be the NSA. And honestly, I do not see a problem with that.

Isn’t the problem (having only read a little about the controversy) that the non-hybrid appears to be strictly worse, except for the (~10%) decrease in transmission size; and that no one has articulated why that’s a desirable tradeoff?

On the face of it, I don't see a problem with the choice existing (in both directions, that is). I expect smarter people than me to have reasons one way or the other, but I haven't seen anyone articulate a concrete use case where saving that bandwidth actually makes a difference.

> There is no backdoor in ML-KEM, and I can prove it. For something to be a backdoor, specifically a “Nobody but us backdoor” (NOBUS), you need some way to ensure that nobody else can exploit it, otherwise it is not a backdoor, but a broken algorithm

Isn’t a broken algorithm also a valid thing for NSA/whoever to want?

Them saying they want to use it themselves doesn’t actually mean much?


Actually, thinking about this a bit more - saying that there's no "Nobody but us backdoor" to prove there's no backdoor is a poor argument.

As an example - if there's a weakness that affects 50% of keys (replace with whatever hypothetical number), the NSA can make sure it doesn't use those affected keys but still retain the ability to decrypt 50% of everyone else's communications. And using the entropy analysis from this post, that would only require 1 bit hidden in the parameters, which is clearly within the entropy budget.
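
(A toy sketch of that hypothetical, with entirely made-up routines - nothing here refers to any real property of ML-KEM. The point is just that whoever knows the secret predicate can resample their way out of the weak class, while everyone else keeps weak keys half the time.)

    #include <stdbool.h>
    #include <stdlib.h>

    typedef struct { unsigned char bytes[32]; } keypair_t;

    /* Stand-in for real key generation. */
    static keypair_t generate_keypair(void)
    {
        keypair_t kp;
        for (size_t i = 0; i < sizeof kp.bytes; i++)
            kp.bytes[i] = (unsigned char)rand();
        return kp;
    }

    /* The hypothetical secret predicate: "weak" for ~50% of keys. */
    static bool key_is_in_weak_class(const keypair_t *kp)
    {
        return (kp->bytes[0] & 1) != 0;
    }

    /* The party that knows the predicate just resamples (~2 draws on
       average) until it lands outside the weak class. */
    static keypair_t generate_safe_keypair(void)
    {
        keypair_t kp;
        do {
            kp = generate_keypair();
        } while (key_is_in_weak_class(&kp));
        return kp;
    }

    int main(void)
    {
        keypair_t kp = generate_safe_keypair();
        return kp.bytes[0] & 1;  /* always 0: the weak class was avoided */
    }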


Very nice writeup!

> Years are calculated backwards

How did that insight come about?


Thanks.

I was fortunate enough to be programming on an ARM-based device, which meant that the expression (x * 4 + 3) strongly stood out to me as highly inefficient: a 2-cycle prologue for the more important division. On x64 computers those two operations collapse into a single 'LEA' assembly instruction (which I wasn't aware of at the time), so others using that type of computer might not have felt this step needed any simplification.
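
(A minimal illustration of the difference, using a hypothetical helper name - this is not code from the repository:)

    #include <stdio.h>
    #include <stdint.h>

    /* Hypothetical helper, not taken from the repository. */
    static uint32_t scale_and_bias(uint32_t x)
    {
        /* x86-64 compilers typically fold this into one instruction:
               lea eax, [rdi*4 + 3]
           AArch64 typically needs two dependent instructions instead:
               lsl w0, w0, #2
               add w0, w0, #3
           i.e. a 2-cycle prologue before the division that follows it
           in the full algorithm. */
        return x * 4 + 3;
    }

    int main(void)
    {
        printf("%u\n", scale_and_bias(25));  /* prints 103 */
        return 0;
    }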

I tried everything under the sun to get rid of these steps. The technique noted in the article of using the year 101 BC was for a long time my strongest candidate; you can view the implementation of that attempt at the link below [1].

An epoch of 101 BC still meant that there was extra work required to re-normalise the timeline after the century calculation, but it was only a single addition of 365 in the calculation of `jul`. The explanation of how this works is probably a whole blog post in itself, but now that this algorithm has been discarded it's not worth the time to explain it fully.

I had also developed the year-modulus-bitshift technique by that time, but it couldn't yet be integrated cleanly with any of my algorithm attempts. My plan was to simply document it as an interesting but slower concept.

I don't know what sparked the idea of going backwards, other than immersing myself deeply in the problem in my spare time for about a month. It finally came to me one evening. I thought it was only going to save 1 cycle, but it also meant the year-modulus-bitshift could be utilised, and the entire thing fit together like a glove: the time saving jumped from 20% to 40%.

[1] https://github.com/benjoffe/fast-date-benchmarks/blob/218356...


> Following these [other] incidents and a series of reports on photographers, a U.S. federal judge has temporarily barred Homeland Security agents from using riot control weapons on journalists in the Chicago area.

Why temporarily?


I think air traffic is probably the most resilient group - I’m surprised no one answered!

IFR was designed long before GPS and, for the most part, GPS has been shoehorned into the “old” system. VORs around the country are still “primary” for navigation; airways are still primarily defined around VOR radials; and approach plates to large airports have plenty of non-GPS precision approaches. (Some smaller GA-only airfields that recently got IFR approaches might be WAAS/GPS only.)

Losing GPS might increase workload for some sectors (en-route sectors that won't be able to clear aircraft direct to waypoints), but probably not TRACON, which is vectoring aircraft onto pre-defined approaches.

If you pick a random commercial flight on your favourite flight tracker and check its route, 99% of the waypoints on it are defined as VOR intersections, not GPS coordinates. (The remaining 1% are likely en-route waypoints and not in the departure/approach area.)

Also, the instrument proficiency requirements for pilots require multiple approach types to be logged every 6 months, so they are definitely capable of flying non-GPS approaches.

