Hacker Newsnew | past | comments | ask | show | jobs | submit | headmelted's commentslogin

The fact this study even exists is a sign of something having gone very wrong IMHO.

The notion of tracking whether time spent on anything helps “prevent burnout” speaks volumes about how we view ourselves as consumables.

The whole culture we have emphasises trading the best years of your life just so you can (maybe) rest for a little while at the end of your life when your health is failing, which has always been really sad to me.


> The fact this study even exists is a sign of something having gone very wrong IMHO.

I agree, but for different reasons: The paper is an example of someone sending out surveys to collect self-reports and then writing a paper title as if they had performed a study. They did not. They just surveyed some college students and drew conclusions by running statistical analyses on the data until they got something that seemed significant.

It appears to have worked, though, as I’ve seen it shared across the internet by people assuming it’s robust proof of something.

This paper is very bad. The numbers in the abstract don’t even add up, which any reviewer should have caught. To be honest, this feels like an undergraduate-level assignment where students are asked to give a survey and do some statistical analyses. The students usually pick a topic close to their own lives (like Super Mario games) and then come up with some result by playing with their survey numbers until they find something.


This study reminds me of the types of projects I did when I took statistical psychology classes in undergrad. I was hoping to see data taken directly after participants had actually played the games in a controlled environment. Also, why focus on just Nintendo games?

Judging by the authors' affiliations and Nintendo-approved rhetoric, this does appear to be a shill.


> They just surveyed some college students and drew conclusions by running statistical analyses on the data until they got something that seemed significant.

Is this just cynicism, or based on anything? From reading the methods section, it doesn't appear that this is what happened.


From the paper:

> Methods:

> We used a mixed methods approach. First, qualitative data were collected through 41 exploratory, in-depth interviews (women: n=19, 46.3%; men: n=21, 51.2%; prefer not to disclose sex: n=11, 2.4%; mean age 22.51, SD 1.52 years) with university students who had experience playing Super Mario Bros. or Yoshi. Second, quantitative data were collected in a cross-sectional survey…

So interviews with a biased sample (students with experience playing the game) and then a survey.

Also, try adding up those n= numbers. They don’t sum to 41. The abstract can’t even get basic math or proofreading right.

If the body of the paper describes something different from the abstract, that’s another problem.

EDIT: Yes, I know the n=11 was supposed to be an n=1. Having a glaring and easily caught error in the abstract is not a good signal for the quality of a paper. This is on the level of an undergraduate paper-writing exercise, not a scientific study as people are assuming.


Seems like n=11 should have been n=1. Use 19, 21, and 1 as numerators over 41 and you end up with all the same percentages written in the abstract. A typo that should have been caught, but surely nothing more than that, and certainly not substantive enough to justify the claim below:

> This paper is very bad. The numbers in the abstract don’t even add up, which any reviewer should have caught.
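For what it's worth, the arithmetic is easy to verify; here's a minimal Python sketch (the group counts and percentages are from the abstract, with n=1 substituted for the typo'd n=11):

```python
# With n=1 for "prefer not to disclose sex" (instead of the abstract's
# typo n=11), the group counts sum to 41 and reproduce the reported
# percentages exactly.
counts = {"women": 19, "men": 21, "prefer not to disclose sex": 1}
total = sum(counts.values())
print(total)  # 41

pcts = {group: round(100 * n / total, 1) for group, n in counts.items()}
print(pcts)  # {'women': 46.3, 'men': 51.2, 'prefer not to disclose sex': 2.4}
```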


> A typo that should have been caught, but surely nothing more than that, and certainly not substantive enough to justify the claim below:

Such an obvious error should have been caught by the authors proofreading their own work, to be honest. Any reviewer would also have caught it when evaluating the sample size.

I find it strange that people are bending over backward to defend this paper and its obvious flaws and limitations.


It looks like "prefer not to disclose sex" was typoed and should be 1 instead of 11.

It does seem to be cynicism. They're convinced the authors "gave people surveys with a lot of questions and then tried to find correlations in the data", but nothing indicates they did more than the 9 questions (plus one more for sex as a control) the paper includes, and they restricted it to only Mario/Yoshi players. Ten questions is pretty short.

> and restricted it to only Mario/Yoshi players.

Do you not see the problem with drawing conclusions from a sample set that pre-selects for Mario/Yoshi players?

How do you think they’re determining that playing Mario/Yoshi prevents burnout if they only surveyed Mario/Yoshi players?

I really don’t understand all of the push to support this paper and disregard critiques as cynicism. The paper is not a serious study, or even a well written paper. Is it a contrarian reflex to deny any observations about a paper that don’t feel positive or agreeable enough?


I've critiqued it plenty in other comments, including that exact issue. However, that doesn't mean they "gave people surveys with a lot of questions" to p-hack, it seems like a study designed (albeit not well designed) to test one specific hypothesis. I see no reason to question that they did the methods as described in the paper, which were designed to test this very specific thing (they didn't even test "childlike wonder" in general, just self-reported Mario-induced childlike wonder), but their conclusions aren't supported by their data. If they were p-hacking as you accuse them of, why not have more questions? Why not survey non-Mario players too so there's a new variable to create significant results out of a null?

I agree, but hard work is nothing new. Did the average person throughout history have more leisure than we do? I doubt it. I'm uncertain how to think about burnout in this context. Did they have burnout and were forced to work through it? Were they better at pacing themselves? Maybe the type of work (mental rather than physical labor) or circumstances (working for a corporation) today are more conducive to burnout?

I don’t have any citations, but I don’t think that “work” was at all similar to what we do now. Early hominid work would have involved many different tasks throughout the day, such as tracking, hunting, cleaning, gathering, building, repairing, traveling, etc, right? Compare that to “do this one task 8-16 hours in a row,” and it does seem like a mode of work we would be particularly ill suited for. Orrrr maybe I’m wrong, I’m using general knowledge and inductive reasoning, so I would not be surprised to learn I’m off base here.

> Did the average person throughout history have more leisure than we do? I doubt it.

Recent anthropological and archaeological research is challenging the traditional view that ancient lives were "nasty, brutish, and short." Instead, it appears that many ancient peoples worked less than eight hours per day and frequently took time off for festivals or to travel long distances to visit friends and family. And unlike today, work usually had a more flexible rhythm where short periods of hard work were separated by long periods of light work and rest.


> Instead, it appears that many ancient peoples worked less than eight hours per day

This statement is technically correct if you let the word “many” do the heavy lifting and ignore the people doing the work (slaves, etc)

Claiming that average life in the past was easier is just false, though. If it was easier to shelter, feed, and clothe yourself in the past then those methods wouldn’t have disappeared. You’d be able to do them now if you wanted to. Easier than before, in fact, because you can walk to the store and buy some wood instead of chopping down trees by hand and letting them dry for a few seasons before building, and so on.


The average person took long periods off work to travel to visit faraway friends? Call me sceptical, because this is provably untrue for pretty much any period and place we have actual sources about.

Can you provide the specific research you are referring to?

I don’t know what research they saw, but the claim was mainstreamed by the popular book “Sapiens”. The author romanticized past life and made claims that life was leisurely until agriculture came along and made us all miserable as we toiled working the soil. Before that we supposedly relaxed all day as our food was easy to catch and we didn’t have to build anything because we were always on the move. There are some very obvious problems with that statement that will be easily spotted by anyone who has ever done any hunting or camping.

This is ridiculous, of course. Read Bret Devereaux’s recent series about peasant life.

I'm not sure how environmental factors play into this either. As a Gen-Xer, it often feels like the current late teens and early 20-somethings all have a crippling level of "anxiety" over what should be relatively simple human interaction, and this started well before COVID solidified this influence. Does this in general have an outsized effect on burnout?

I've felt true burnout twice in my life. The first time was after several years without taking any vacation and about 3 months of 60-80 hour weeks. I literally hit a wall and couldn't even open a project in front of the computer; I was in a haze and not safe to do much of anything. My brain was like, "nope!" More recently, a couple of years ago, it was more a general state of disillusionment about my career without a clear alternative than something I would consider disabling.


> Did the average person throughout history have more leisure than we do?

Unambiguously yes. This is well documented and impossible to ignore.

Marshall Sahlins described it best in Stone Age Economics, but reading Graeber will get you there, or Lévi-Strauss if you’re into the whole structural anthropology thing.


It's not about leisure time. It's about the meaning of work. In the past, the effects of your work were very direct: carry a shitload of stone from one place to another together with your cousin, build a house for you and your family. Nowadays it's all very abstract: have a useless Teams meeting with people you don't care about so that you can press buttons that maybe change some metrics you don't even understand. When was the last time you felt "I'm happy I built this"?

>Did the average person throughout history have more leisure than we do?

Yes. In the Middle Ages (and presumably in any agrarian society) people would work intensely for a few weeks and have most of the year free.


Except that is bullshit. They did work the other parts of the year too, just not the exact same agricultural work as in those few weeks.

That claim simply ignores everything it takes to keep animals alive year round, keep kids alive year round, make and repair tools, keep the house warm, make fabric, sew clothes, actually cook without modern tools, and so on and so forth.

Just because there is a rush period does not mean workers do nothing the rest of the time.


> The whole culture we have emphasises trading the best years of your life just so you can (maybe) rest for a little while at the end of your life when your health is failing, which has always been really sad to me.

Have you considered getting a job you like better?

You can also take sabbaticals. Or retire early.


Unrealistic for most working people with families.

Maybe, but then that calls into question the whole premise:

> The whole culture we have emphasises trading the best years of your life just so you can (maybe) rest for a little while at the end of your life when your health is failing, which has always been really sad to me.

If you value your family so much, you are effectively working for them, not for the little rest at the end of your life.


This is the problem of evil, right? Those human tribes who just chilled out after meeting the bare requirements of survival died off because some greedy assholes outcompeted them.

I'm only a casual follower of ancient human evolution and anthropology, but this doesn't mesh with my impression. Lots of human groups have been able to relax in relatively hospitable environments, over long spans of time.

They were overwhelmingly overpowered by those who took advantage of the fact they didn't have black powder, rifles, or western ships.

A few who managed to evade this past WWII took advantage of the fact that everyone was desperate to freeze things in place to avoid nuclear war; those are the fortunate few who are locked in place for the indefinite future.

Of course there's also the heart of Africa: with no great navigable waterways or geography for trade with Europe, North America, or Asia, no one gives much of a shit what they do.

------------- re due to throttling -------

>I don't think it happened because of evolutionary pressure on tribes as the previous poster claimed.

Not a necessary precondition; it can happen through cultural pressure (also something passed down through the generations in tribes). I don't recall the previous poster requiring that it happen through gene expression.


I don't think it happened because of evolutionary pressure on tribes as the previous poster claimed. Certainly that's not clear from the evidence. The human genotype was pretty well set by the time all that was happening, which means whatever evolutionary basis exists for "the problem of evil" had already acted, including on all the people living easy (or at least manageable) subsistence lifestyles for centuries previously.

> Not a necessary precondition; it can happen through cultural pressure (also something passed down through the generations in tribes). I don't recall the previous poster requiring that it happen through gene expression.

I feel it was implied in the vision of competing tribes, which hasn't really been how it works for a long time. But still, whatever the trait transmission mechanism, I don't think the supposed complete out-competing of non-conquest-oriented groups necessary for their hypothesis actually happened at scale. Humans content to "chill out" have persisted for all of recorded history.


> Of course there's also the heart of Africa: with no great navigable waterways or geography for trade with Europe, North America, or Asia, no one gives much of a shit what they do.

If this is your standard for a relaxing “chilled out” lifestyle then I’m afraid you’d be deeply disappointed if you saw the realities of living like this. In many places simply maintaining a consistent supply of food and drinkable water is nearly a full time job, and that’s with the various contentions of aid coming in.


>If this is your standard for a relaxing “chilled out” lifestyle then I’m afraid you’d be deeply disappointed if you saw the realities of living like this.

Not my standard, the standard presented by the previous poster, where getting food/water/shelter is "chilling" and doing that plus conquering etc is the "less chill" version.

I wasn't explaining why the heart of Africa is "chilled out." I was explaining why at least the initial waves of people with guns, who spent an inordinate amount of their "chill time" scheming on how to conquer others, didn't bother much with inner central Africa; thus even if those people were chilling, they were a bit safer from western ships and guns.

I don't think I ever made the claim that all of the heart of Africa is just chillin'. I'm explaining why there is the potential for people in some places to focus more on just eating, sheltering, and watering, and spend not as much time fighting against people who spend time on gunpowder and ships. All else equal, it should cost less time to eat and shelter than to do that plus other things, and by the standards here, that is the "chill" relative to doing all that plus worrying about conquering.

>>Those human tribes who just chilled out after meeting the bare requirements of survival died off because some greedy assholes outcompeted them

>If this is your standard for a relaxing “chilled out” lifestyle then I’m afraid you’d be deeply disappointed if you saw the realities of living like this.

What you've done is redefined chilling out, from what the OG poster had it at (basically food + shelter), and instead you're arguing against someone else that their original definition we were already working on is wrong.


> Those human tribes who just chilled out after meeting the bare requirements of survival died off because some greedy assholes outcompeted them.

The idea of tribes just “chilling out” to survive is a modern anachronism projected onto a romanticized past. We’re so disconnected from the realities of clothing, feeding, and sheltering ourselves without modern amenities that it’s hard to imagine what pre-industrial life was like. Thinking that “chilling out” was a viable path to survival is a symptom of that disconnectedness.


If you look at a tiger, for instance, they sleep 16 hours a day (or, for a closer animal, take a look at the night monkey). I realize a human isn't as powerful and doesn't have the same needs as a tiger, but I don't see why a (pre-historic) human has to work that much harder than a tiger merely to eat and reproduce and live long enough that enough survive to do that. A human can work smarter than a tiger, after all... surely we can "chill" as many hours a day as the tiger can.

> but I don't see why a (pre-historic) human has to work that much harder than a tiger merely to eat and reproduce and live long enough that enough survive to do that.

This is a baffling comparison.

A tiger can sleep outside wherever it wants. It has fur to stay warm. Its offspring are up and running quickly on their own. A tiger can chase down animals and eat them immediately, raw. A tiger can drink water from a stream without getting infections.

The list goes on and on and on. If you think it’s trivial to live off the land and find your own food and shelter, why do you suppose people aren’t doing it?

Have you ever seen videos or documentaries about people who live in the middle of nowhere in self sufficient manners? They’re not having a great time. It’s hard work. Their health declines and they suffer. Their clothes are tattered. They still use a lot of cast-offs and tools and other things that they can find or acquire from society.


There are a ton of studies showing many tribal subsistence societies worked a little less than a tiger[]. Here's one, but they've been trotted out lots of times.

As for meat, yeah, I've eaten lots of raw meat and seafood. Even better if you just caught it. It's not a lot more work, though, if one tribal member makes a fire; catching the meat is more intensive than throwing it on some hot rocks to char the outside. There are also a lot of places/climates on earth where you can survive with a shelter that costs no more than a very small fraction of your total time to build and maintain; this is where many of the tribes ended up.

Regarding the young, cubs stay with their mothers for 2-3 years, or about 20% of the life of a tiger. Tribal kids stayed strongly dependent on their parents until they were closer to 12, so a little longer than 20% of the lifespan of someone who survived long enough to mother/father a child (life expectancy was low in tribal times, but expected lifespan was much longer once you reached the age of reproduction). A win for the tiger, but not by a long shot.

>A tiger can drink water from a stream without getting infections.

Nah the tiger can also get infections.

I think you're conflating the fact that you wouldn't find it fun with the idea that they were working that much harder than industrial societies. Industrial societies get more for their work, but due to the economics it might actually cost you even more time to get to a relatively self-supporting subsistence level in some industrial societies: you would get arrested for being homeless, get arrested or kicked out for building a hut on your own land (you must spend a gazillion dollars on an up-to-code and permitted house), you'd get arrested for most forms of hunting, you'd have to pay to pick most wild-growing fruits, etc.

Overall the tiger provides a pretty useful comparison of time spent working, although the tiger (or night monkey, again if you prefer a closer animal) does appear to have worked slightly more depending on which study you go by.

[] https://www.pnas.org/doi/10.1073/pnas.1906196116


A recent HN thread I cannot seem to find discussed the idea that currently in the US work is the default state, and leisure exists to refuel for work. At other times in history, leisure was the default state and work existed to enable leisure. This context affects everything in life: e.g. a microwave frozen meal is excellent from the work viewpoint (time-to-value ratio), but if you enjoy cooking it's horrible from the leisure viewpoint.

At which time exactly was leisure "the default state"? The only way to have this is by having a slave-like class while the idle elite enjoy "leisure", or by living in a very low-density, calorie-rich environment, which doesn't last long or ends in wars (and being enslaved by the neighboring tribe, if you are from sub-Saharan Africa).

I think there is a growing online mix-up about "leisure" time in the past. 99% of people were farmers, and the farming season is 3-4 months a year. That doesn't mean they had 9 months to do whatever they wanted. The time off was technically not their job, but they were working on other survival tasks. If you consider re-roofing your shelter leisure time, then yeah, past people had more leisure time.

We have much more non-survival leisure time now.


My girlfriend and I were talking about this the other day. We both have full-time jobs and can only cook a “real meal” on the weekend now that WFH ended.

It sucks, I enjoy cooking and want to eat at least somewhat health conscious…


> We both have full-time jobs and can only cook a “real meal” on the weekend now that WFH ended.

Do you have extra-long hours and/or an extremely long (1+ hour) commute?

It’s common in my social circles for parents to work 8-5 or 9-6 and still cook weekday meals that are healthy. With some meal and grocery planning it’s not that hard, unless of course you have one of those 90+ minute commutes and a job that keeps you in the office until 8PM.

Unless your definition of “real meal” is something more than I’m thinking of, like something that requires hours of prep.

> It sucks, I enjoy cooking and want to eat at least somewhat health conscious…

There are a lot of healthy meal planning (ahead of time prep) or quick and easy recipes out there. It’s pretty easy to prepare a healthy meal with steamed vegetables and a warmed protein in 10 minutes. We can even make an entire healthy meal in 30 minutes start to finish after doing it for years.


More traditional “French” cuisine is not typically ready in 10-30 minutes when starting from scratch (or I’m just incredibly slow).

Cooking a full meal would at least take me an hour end-to-end. As a sibling comment mentioned, it’s more that when I finally get home (6:30 -7pm), I rarely have the energy to put in that kind of time.

So I end up making a quick pasta or other such dish that is ready in 30 minutes.


> More traditional “French” cuisine is not typically ready in 10-30 minutes when starting from scratch

I was responding to the part of your comment about not being able to eat healthy.

Cooking traditional French cuisine on weeknights is not the only way to have a healthy meal. Eating homemade French cuisine every weeknight would be a luxury for working class standards just about anywhere.


How many hours does your job and commute require?

I'd genuinely like to understand a job that is so time consuming that a person wouldn't be able to cook dinner. That doesn't seem ok to me.


Super normal. Let’s say, at the simplest, you take 30 mins to get ready to leave from waking up, 30 mins from front door to sitting at your desk, 30 mins to get to bed and sleep; that’s 2 hours of your 24 just kinda handling the bare functional minimum. Sleep for 8 and now you are left with 12 hours. Work plus breaks at work is probably 8-10 at best.

So OK, 3-5 hours left over for everything else, assuming perfect execution on the other parts. Do you have family or pets that need something? Do you have dishes and laundry and trash days and bills to pay? Do you want to watch TV, play a game, do any kind of hobby or learning? Are you sick? Do you have friendships? Are you tired from work being physically or mentally demanding? Do you need to exercise?

All of those things need to be handled in the same few “outside work” hours each day.


> that’s 2 hours of your 24 just kinda handling the bare functional minimum. Sleep for 8 and now you are left with 12 hours.

24 - 2 - 8 leaves you with 14 hours, not 12 hours.

Sounds pedantic, but 2 hours is a lot in the context of your argument that we only have a few hours per day to do anything.

This conversation gets repeated ad nauseam on social media, yet in the real world it’s common for people to operate fine on normal weekly work schedules. Back when I was still reading Reddit there was an endless stream of posts like this complaining that there was no time left to do anything after work. Every time the OP was asked where their time was going, it revealed one of two things: either they were taking way too long to go through the basic motions of life (e.g. 2-hour morning routines and 2-hour dinner prep every day with a 1-hour bedtime ritual), or they realized they actually had a lot of time but it was just disappearing somewhere and they couldn’t figure out where. The latter could almost always be traced to spending too much time on phones or in front of the TV.


Yeah that’s a correct point, bad mental arithmetic there.

There are a few other unrealistic things too, but they fall in the other direction. Like I think it’s almost impossible to spend only 30 mins to leave my front door, get in the car, park at work and get into the building, get all the way to my desk and actually be in work mode. When I used to commute it was more like an hour, in busy traffic.

I have lived a lot of my life not having enough time to cook dinner mainly because I have often had a part time job in addition to a full time job, and was studying for a career change. So for a few years I was just kinda spinning plates. So that’s another way people end up caught out for time.

> in the real world it’s common for people to operate fine on normal weekly work schedules

I think it’s common but also maybe not even the majority of people are this way? There’s no good reason that “40 hours of work plus an arbitrary commute time” is a functional pattern for most people.

I think we have a mix of people who find this totally fine and have some energy left over at the end of the day, with people who are fully drained by their jobs. It’s hard for each cohort to relate to the other.

For some people, almost all leisure time is lost in an impossible quest to relax/recharge “enough” for the next day/week of work. Sometimes that explains the phone use or TV patterns. It’s an attempt to rest (plus their attention-taking and holding techniques work better on us when we are tired). It’s hard to plan on cooking if you know you’ll be in that state.

I tend to believe If you can find the right work and the right hours for you it’s a huge improvement in your life, and if you are on the wrong pattern with those it’s very bad and leads to a spiral. A lot of us have to accept the wrong pattern to make enough money to live and retire and support family.


Not OP: the job is so soul- and mentally draining that you “can’t afford” to cook.

I should have clarified it, but you hit the nail on the head. I arrive home with little energy after a day in the office.

By the time I’m home it’s at least 6:30pm, usually a bit later. If I would work until 6:30 but from home instead of the office, I’d probably still be up for cooking.

Although you also need to get gym time in, family time, chores and other stuff…


I have the same. My commute is a 10-minute walk, I have no dependants, and I make a good salary, and I still find it impossible to cook; I'm just depleted after work. If I add exercise and some social interaction, then my time is spent recovering energy... It's probably a sign of burnout or of a bad job.

Have you considered cooking before work?

Brutal comment because I’m a random Internet AI:

You can adjust what “real meal” means for you so that cooking at home is possible. The hardest part is finding time together if schedules don’t line up.

For two weeks write down what you do with your time, and then evaluate it afterwards and decide if it was the best use.


Lol, fair enough, but I think this is a workaround rather than a solution.

Don’t think of it as a workaround, think of it as a startup or MVP as you work toward developing a full product.

> At other times in history, leisure was the default state and work existed to enable leisure

It wasn’t that long ago that a lot of hard work was necessary to even survive through the winter each year.

What times in history had leisure as the default state? When was life so much easier than it is right now? Where were all the food, shelter, clothing, and entertainment materials coming from during this time and why was it so much more efficient than today?


> It wasn’t that long ago that a lot of hard work was necessary to even survive through the winter each year.

Well, not all parts of the world have winters.


Every time this topic of historical leisure time comes up and people start bringing up problems with the theory, the goalposts start moving as fast as the conversation. Are we now only talking about people who didn’t live in areas with winters? Because those areas have different sets of problems including entirely different sets of insects, diseases, and predators that aren’t controlled by annual winters, among other things.

> At other times in history, leisure was the default state and work existed to enable leisure.

What times/places are you thinking of when you write this?


The WHOLE US!?

...I don't view myself as a consumable. I enjoy accomplishing. I do not enjoy burnout. I'm interested in ways to prevent it. It's really that simple.

I don't particularly find this survey compelling, but I also don't want to be judged as some vampiric capitalist just because I'd like to have more work bandwidth.


Especially since we're about to give birth to an entire species which is better suited to the task!

It's 1AM in San Francisco right now. I don't envy the person having to call Matthew Prince and wake him up for this one. And I feel really bad for the person that forgot a closing brace in whatever config file did this.


Agreed, I feel bad for them, but mostly because Cloudflare's workflows are so bad that you're seemingly repeatedly set up for really public failures. How does this keep happening without leadership's heads rolling? The culture clearly is not fit for their level of criticality.


> The culture clearly is not fit for their level of criticality

I don't think anyone's is.


How often do you hear of Akamai going down? And they host a LOT more enterprise/high-value sites than Cloudflare.

There's a reason Cloudflare has been really struggling to get into the traditional enterprise space and it isn't price.


A quick Google search turned up an Akamai outage in July that took Linode down, and two more in 2021. At that scale nobody's going to come up smelling like roses. I mostly dealt with Amazon crap at megacorp, but nobody who had to deal with our Akamai stuff had anything kind to say about them as a vendor.

At first blush it's getting harder to "defend" use of Cloudflare, but I'll wait until we get some idea of what actually broke. For the time being I'll save my outrage for the AI scrapers that drove everyone into Cloudflare's arms.


Was it a CDN or Linode failure?


The last place I heard of someone deploying anything to Akamai was 15 years ago in FedGov.

Akamai was historically only serving enterprise customers. Cloudflare opened up tons of free plans, new services, and basically swallowed much of that market during that time period.


> I don't envy the person having to call Matthew Prince

They shouldn't need to do that unless they're really disorganised. CEOs are not there for day to day operations.


> And I feel really bad for the person that forgot a closing brace in whatever config file did this.

If a closing brace takes your whole infra down, my guess is that we'll see more of this.


"In progress - Scheduled maintenance is currently in progress. We will provide updates as necessary. Dec 05, 2025 - 07:00 UTC"

No need. Yikes.


Claude offline too. 500 errors on the web and the mobile app has been knocked out.


I had to switch to Gemini for it to help me form a thought so I could type this reply. It's dire.


Seems like it. Claude just went offline and is throwing Cloudflare 500 errors on the web interface.


I was under the impression (admittedly from an article I read a couple of years ago) that the consensus within the company was pretty much always that robo-taxis were one man’s pipe dream.

Weren’t there also disclosure documents a couple of years ago when they were trying to license autopilot that said they believed internally they were at level 2 as opposed to 4/5? (I might be remembering this part wrong)


> I was under the impression (admittedly from an article I read a couple of years ago) that the consensus within the company was pretty much always that robo-taxis were one man’s pipe dream.

If robo-taxis were ready with the kind of economics outlined by Musk it would be financially irresponsible to actually sell the cars to others instead of just building a massive Tesla fleet and pivoting towards transportation services.

Tesla's still selling their cars? If so, then they're not robo-taxis.

Edit: The other option for Tesla would be selling the cars for a high enough premium to offset the lost taxi revenue. The fact that Tesla seems to be in a price war with other EV makers is not a promising sign for robo-taxis.


Everyone knows they’re at level 2. Level 4/5 is completely hands off, no supervision.

Not even their Supervised Full Self Driving does that


Robo-taxis certainly aren't just one man's dream. Whether or not they are possible in the next 50 years is another matter, but plenty of people want them and are willing to invest in developing them.


Right but when do I get my cheap Tesla?


> because FC runs on any hardware even without dedicated GPUs

Twenty years of memes disagrees wholeheartedly


You are mixing it up with Crysis?


I absolutely am! Doh!


It's fair, though - it was the original Crysis

There was a time when it took fairly impressive hardware. I think this was one of the first popular 64bit games, upgrading into it


> I think this was one of the first popular 64bit games, upgrading into it

I don't think so. I remember struggles and patches necessary to get it run when I moved to a 64 bit machine a few years after it came out and I wanted to replay it.


A trip down memory lane :) The patch for Far Cry to become 64bit:

https://www.anandtech.com/show/1677

They were technically beaten by Chronicles of Riddick, which shipped something on disc.

Looking back, this did little for performance. I suspect the memory limitations and the introduction of SMP around that time account for a lot of the warts we recall.


I think I remember seeing someone run Crysis in software on a 128core AMD Epyc and get a decent frame rate.


It’s great that this isn’t hurting them but it leaves out a lot that makes me a bit nervous about this being taken as advice.

They’re advocating deploying a binary as preferable to using docker, fair enough, but what about the host running the binary? One of the reasons for using containers is to wrap your security hardening into your deployment so that anytime you do need to scale out you have confidence your security settings are identical across nodes.

On that, the monolith talked about here can be hosted on a single VPS, again that’s great (and cheap!), but if it crashes or the hardware fails for any reason that’s potentially substantial downtime.

The other worry I’d have is that tying everything into the monolith means losing any defence in depth in the application stack - if someone does breach your app through the frontend then they’ll be able to get right through to the backend data-store. This is one of the main reasons people put their data store behind an internal web service (so that you can security group it off in a private network away from the front-end to limit the attack surface to actions they would only have been able to perform through a web browser anyway).
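To make the "hardening travels with the deployment" point concrete, this is roughly what it looks like in practice; a minimal Compose sketch (the image and service names are made up):

```yaml
# docker-compose.yml — illustrative: the restrictions ship with the spec,
# so every node that runs this file gets identical settings.
services:
  app:
    image: example/monolith:1.0   # hypothetical image
    read_only: true               # immutable root filesystem
    user: "10001:10001"           # never run as root inside the container
    cap_drop: [ALL]               # drop all Linux capabilities
    security_opt:
      - no-new-privileges:true    # block setuid privilege escalation
    tmpfs:
      - /tmp                      # writable scratch space only
    ports:
      - "127.0.0.1:8080:8080"     # only reachable via a local reverse proxy
```

A bare binary on a hand-configured VPS gets none of that for free; you'd have to reproduce it with systemd/firewall config on every host.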


>They’re advocating deploying a binary as preferable to using docker, fair enough, but what about the host running the binary? One of the reasons for using containers is to wrap your security hardening into your deployment so that anytime you do need to scale out you have confidence your security settings are identical across nodes.

There is no universe in which _increasing your attack surface_ increases your security.


Considering the vast majority of exploits are at the application level (SQLi, XSS, etc), putting barriers between your various applications is a good thing to do. Sure, you could run 10 apps on 10+ VMs, but it's not cost efficient, and then you just have more servers to manage. If the choice is between run 10 "bare metal" apps on 1 VM or run 10 containers on 1 VM, I'll pick containers every time.

At that point, why are we making a distinction when we do run 1 app on one VM? Sure, containers have some overhead, but not enough for it to be a major concern for most apps, especially if you need more than 1 VM for the app anyway (horizontal scaling). The major attack vector added by containers is the possibility of container breakout, which is very real. But if you run that 1 app outside the container on that host, they don't have to break out of the container when they get RCE.


The VM/container distinction is less relevant to this discussion than you might think; both Amazon ECS and fly.io run customer workloads in VMs (“microVMs” in their lingo).


I agree in principle but not in practice here.

If you’re using a typical docker host, say CoreOS, following a standard production setup, then running your app as a container on top of that (using an already hardened container that’s been audited), that whole stack has gone through a lot more review than your own custom-configured VPS. It also has several layers between the application and the host that would confine the application.

Docker would increase the attack surface, but a self-configured VPS would likely open a whole lot more windows and backdoors just by not being audited/reviewed.


You'd have to be utterly incompetent to make a self-configured VPS have more attack surface.

I have a FreeBSD server, three open ports: SSH with cert-login only, and http/https that go to nginx. No extra ports or pages for potentially vulnerable config tools.
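For reference, the entire inbound policy on a box like that fits in a few lines of pf; a sketch (interface name is an assumption):

```
# /etc/pf.conf — default-deny, allow only SSH and HTTP/HTTPS (sketch)
ext_if = "em0"                # assumed external interface
set skip on lo0
block in all                  # default deny inbound
pass out all keep state       # allow all outbound
pass in on $ext_if proto tcp to port { 22, 80, 443 } keep state
```

Nothing else is listening, so there's nothing else to attack at the network level.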


Given the huge number of wide open production Mongo/ES/etc. instances dumped over the years, I wager having heard of ufw puts you among the top 50% of people deploying shit.


This whole thread is incomprehensible to me.

I guess no one knows how to harden an OS anymore so we just put everything in a container someone else made and hope for the best.


I don’t think we need to be calling people incompetent over a disagreement.

Are you suggesting that not opening the ports to any other services means they’re no longer a vulnerability concern?

That would be.. concerning.


On the other hand. If by using containers it has become more feasible for your employees to use something like AppArmor, the end result may be more secure than the situation where the binary just runs on the system without any protection.
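For illustration, this is the shape of a trimmed AppArmor profile you might ship alongside the service (the binary path and data directory are made up):

```
# /etc/apparmor.d/usr.local.bin.myapp — illustrative confinement sketch
#include <tunables/global>

/usr/local/bin/myapp {
  #include <abstractions/base>

  network inet stream,        # allow TCP sockets only
  /usr/local/bin/myapp mr,    # map and read its own binary
  /var/lib/myapp/** rw,       # confined to its own data directory
  deny /etc/shadow r,         # explicitly deny sensitive files
}
```

The point being: the profile is a few lines, but almost nobody writes one for a bare binary, whereas container tooling applies a default profile without anyone asking.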


Containers don't really increase attack surface, it's all stuff provided by the OS anyway. Docker just ties it all together and makes things convenient.


> One of the reasons for using containers is to wrap your security hardening into your deployment so that anytime you do need to scale out you have confidence your security settings are identical across nodes.

This is false. Or do you think your host is secured just by installing Docker? And when you scale, how do you get additional hosts configured?

The truth is, when you use Docker you need to ensure not only that your containers are secure, but also your host (the system running your containers). And when you scale up and deploy additional hosts, they need to be just as secure.

And if you're using infrastructure as code and configuration as code, it does not matter if you are deploying a binary after configuring your system, or Docker.


Complexity is the criminal in any scenario. However, if we simply focus on a vanilla installation of docker, then the namespace isolation alone can be viewed as a step up from running directly on the os. Of course complexity means a vulnerability in the docker stack exposes you to additional risk, whereas a systemd svc running as a service account is likely to contain any 0day better.
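To be fair to the non-container side, a plain systemd unit can buy a surprising amount of that same isolation; a sketch (service name and binary path are made up):

```ini
# /etc/systemd/system/myapp.service — illustrative sandboxing without Docker
[Unit]
Description=Example monolith (hypothetical)

[Service]
ExecStart=/usr/local/bin/myapp
DynamicUser=yes              # throwaway unprivileged service account
ProtectSystem=strict         # read-only /usr, /etc, /boot
ProtectHome=yes
PrivateTmp=yes               # private /tmp namespace
NoNewPrivileges=yes
CapabilityBoundingSet=       # empty set: drop all capabilities
RestrictAddressFamilies=AF_INET AF_INET6

[Install]
WantedBy=multi-user.target
```

Underneath it's the same kernel primitives (namespaces, capabilities, seccomp) that Docker uses, just driven from the unit file.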


> They’re advocating deploying a binary as preferable to using docker, fair enough, but what about the host running the binary? One of the reasons for using containers is to wrap your security hardening into your deployment so that anytime you do need to scale out you have confidence your security settings are identical across nodes.

There are tools that make "bare metal" configuration reproducible (to varying degrees), e.g. NixOS, Ansible, building Amazon AMI images.
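For instance, the Ansible version of "identical security settings across nodes" is just a playbook applied to every host; a sketch (package, paths, and service name are assumptions):

```yaml
# playbook.yml — illustrative reproducible bare-metal deployment
- hosts: app_servers
  become: true
  tasks:
    - name: Install nginx as the front proxy
      ansible.builtin.package:
        name: nginx
        state: present

    - name: Deploy the app binary
      ansible.builtin.copy:
        src: ./build/myapp
        dest: /usr/local/bin/myapp
        mode: "0755"

    - name: Run it under systemd
      ansible.builtin.systemd:
        name: myapp
        state: started
        enabled: true
```

Every new node gets the same configuration because it's applied from the same source, which is the same property people attribute to container images.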


All of which would be better than what the post is advocating and I totally agree with this.


I never understood how one “breaches an app through the frontend”. SQLi messes with your data store, natively (no RCE). XSS messes with other users, laterally. But how does one reach from the frontend all the way through, liberally? Are people running JavaScript interpreters with shell access inside of their Go API services and call eval on user input? It’s just so far fetched, on a technical level.


Ahh yes, security through obscurity - if we make it so complex we can’t understand it then no one else can either, right?

The important thing is making walls indestructible, not making more walls. Interfaces decrease performance and increase complexity


Literally the entire guiding principle for security architecture for the past decade or even more has been that "there is no such thing as an indestructible wall".


I agree, perfection isn’t a realistic expectation. I also think effort spent building better defenses leads to fewer exploits over time than adding more of the same defenses. The marginal cost of bypassing a given defense is far lower than the initial cost to bypass a new defense


Literally no-one said that.

(Some of) the reasons why you would do this are explained (I thought clearly) above. None of this is security through obscurity.


That seems like the worst option. Everything up to the free tier would stay there forever with no way for you to ever request it to be deleted.


Turn on Advanced Data Protection before you rip up the key. Then it's all as good as deleted.


That’s a rather generous assumption.

Do Apple definitely not retain a key? If they don’t is the encryption quantum secure?


> Do Apple definitely not retain a key?

If this is the threat vector you’re worried about, you shouldn’t have had anything in iCloud (or any cloud for that matter) to begin with, rendering this debate completely moot.

