
I'm starting to wonder though whether companies even have the in-house competence to compare the options and price this risk correctly.

>Now a good company would concentrate risk on their differentiating factor or the specific part they have competitive advantage in.

Yes, but one differentiating factor is always price, and you don't want to lose all your margins to some infrastructure provider.


Software companies have higher margins, so these decisions are lower stakes. Unless going on-premises helps the bottom line of the company's main product, these decisions don't really matter in my opinion.

Think of a ~5000-employee startup. Two scenarios:

1. If they win the market, they capture something like ~60% margin.

2. If that doesn't happen, they just lose: the VC funding runs out and they exit.

In this dynamic, infrastructure costs don't change the bottom line of profitability. The risk involved in rolling out their own infrastructure, however, can threaten the existence of their main product itself.


I'm not disputing that there are situations where it makes sense to pay a high risk premium. What I'm disputing is that price doesn't matter. I get the impression that companies are losing the capability to make rational pricing decisions.

>Unless going on-premises helps the bottom line of the company's main product, these decisions don't really matter in my opinion.

Well, exactly. But the degree to which the price of a specific input affects your bottom line depends on your product.

During the dot-com era, some VC-funded startups (such as Google) decided to avoid Windows servers, Oracle databases and the whole super-expensive scale-up architecture that was the risk-free, professional option at the time. If they hadn't taken this risk, they might not have survived.

[Edit] But I think it's not just about cloud vs on-premises. A more important question may be how you're using the cloud. You don't have to lock yourself into a million proprietary APIs and throw petabytes of your data into an egress jail.
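
To make that last point concrete, here's a minimal sketch (hypothetical names, nobody's actual stack) of keeping provider-specific calls behind one thin seam, so code and data can move between clouds or on-premises without a rewrite:

    from abc import ABC, abstractmethod

    class BlobStore(ABC):
        @abstractmethod
        def put(self, key: str, data: bytes) -> None: ...

        @abstractmethod
        def get(self, key: str) -> bytes: ...

    class LocalStore(BlobStore):
        """In-memory stand-in; an S3- or GCS-backed class would implement
        the same two methods against the provider's SDK."""
        def __init__(self) -> None:
            self._blobs = {}

        def put(self, key: str, data: bytes) -> None:
            self._blobs[key] = data

        def get(self, key: str) -> bytes:
            return self._blobs[key]

    store: BlobStore = LocalStore()
    store.put("report.csv", b"a,b\n1,2\n")
    assert store.get("report.csv") == b"a,b\n1,2\n"

Application code that depends only on BlobStore can switch providers by swapping one class instead of touching every call site.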


Valuable real-world engineering skills also play a role.

But most importantly, there is the pull that companies running on-premise infrastructure have on the best talent.


Beyond GPUs themselves, you also have other costs such as data centers, servers and networking, electricity, staff and interest payments.

I think building and operating data center infrastructure is a high-risk, low-margin business.


Perhaps it depends on the nature of the tech debt. A lot of the software we create has consequences beyond a particular codebase.

Published APIs cannot be changed without causing friction on the client's end, which may not be under our control. Even if the API is properly versioned, users will be unhappy if they are asked to adopt a completely changed version of the API on a regular basis.

Data that was created according to a previous version of the data model continues to exist in various places and may not be easy to migrate.

User interfaces cannot be radically changed too frequently without confusing the hell out of human users.
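
A toy sketch of the API and data points above (all names hypothetical): old-shape records are migrated on read, and the published v1 contract is kept alive as a thin adapter over the new model rather than being changed under the client's feet:

    # Stored data may predate the current schema; migrate on read
    # instead of rewriting everything at once.
    STORE = {
        "a1": {"schema_version": 1, "name": "Ada Lovelace"},      # old shape
        "b2": {"schema_version": 2, "full_name": "Alan Turing"},  # new shape
    }

    def load_record(record_id: str) -> dict:
        rec = dict(STORE[record_id])
        if rec["schema_version"] == 1:
            rec["full_name"] = rec.pop("name")  # field renamed in v2
            rec["schema_version"] = 2
        return rec

    def api_v1_get_user(record_id: str) -> dict:
        # Old published contract: translate back so v1 clients see no change.
        return {"name": load_record(record_id)["full_name"]}

    def api_v2_get_user(record_id: str) -> dict:
        return {"full_name": load_record(record_id)["full_name"]}

    assert api_v1_get_user("a1") == {"name": "Ada Lovelace"}
    assert api_v2_get_user("b2") == {"full_name": "Alan Turing"}

None of this removes the debt, but it contains the friction: old clients and old data keep working while the internals move on.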


>What I've come to realise is that the power of having a bash sandbox with a programming language and API access to systems, combined with an agentic harness, results in outrageously good results for non technical users. It can effectively replace nearly every standard productivity app out there - both classic Microsoft Office style ones - and also web apps.

I very much doubt that tinkering with a non-repeatable, probabilistic process is how most non-technical users will routinely use software.

I can imagine power users taking this approach to _create_ or extend productivity tools for themselves and others, just like they have been doing with Excel for decades. It will not _replace_ productivity tools for most non-technical users.


Maybe the idea is that associating a coin with something/anything that has momentum will make some people believe that the coin could take off along with the thing.

I doubt it. People are pretty savvy when it comes to getting something more cheaply or for free.

From a consumer point of view, the best approach would be if developers had to sell their app in Apple's App Store (if Apple approves) and could optionally provide other purchasing options on top of that.

It would prevent fragmentation and give people a choice to pay up if they actually value Apple's extra protections (if any).


>but my expectation would be that judges in low-level courts will try their very best not to get noticed for setting any kind of precedent whatsoever.

Is there even such a thing as precedent in the German legal system?


There isn't.

Case law isn't directly normative in civil law traditions like Germany and France in the way it is in common law traditions like the U.S. and U.K. But court decisions that are deemed interesting do get picked up in journals, cited in academic literature, and cited by other judges in their own decisions. There is a herd psychology where judges and academic writers default to following the principles established in each other's decisions and writing rather than going against that consensus (unless their conviction is very strong and they are willing, depending on the gravity of the issue, to stake their reputations and careers on it).

I brushed over that distinction when I used the term “precedent”. In my mind it's pot-ay-toh, po-tah-toh.

>The rest of the world keeps ripping users off, and Apple's walled garden is as protected a thing as it gets.

This keeps getting repeated, but it's not actually my experience. Not even Apple believes it; otherwise they could avoid a lot of legal and regulatory trouble by giving users a choice: pay through Apple for an extra 30% protection fee.


I think Google Pay does not charge a fee.
