
What's the problem with decimals? They're all numbers. -4.5 degrees C is fine, isn't it? (The actual temperature right here right now). Where's the problem?


You don't even need decimals. Nobody who uses celsius gives a shit about the decimals. It's -4 or it's -5 and even that distinction is irrelevant.

Unless you're doing some kind of scientific calculation there's no need to think about decimals of celsius at all. Just like Fahrenheit users surely don't care whether it's 50 or 53 or whatever. It's around 50, that's all you need to know.


The only place I can imagine it mattering is cooking, and even there I probably would not be able to tell apart a steak at 56C, 57C and 58C...


It might make a difference for mashing when brewing beer but even that’s a crapshoot.


Yes, but then you might conceivably still measure temperature in degrees Réaumur, if it's a rather traditional brewer. Or so I was told by a Reliable Source(tm).


For sous vide, I will differentiate by 1 or 2 Fahrenheit degrees but I take your basic point.
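(For scale: a Fahrenheit degree is 5/9 of a Celsius degree, so a 1-2 F band is roughly 0.6-1.1 C. Quick sketch of the arithmetic, function name made up:)

    def f_delta_to_c(delta_f):
        # A temperature *difference* converts by the 5/9 factor only;
        # the +32 offset cancels out.
        return delta_f * 5.0 / 9.0

    print(f_delta_to_c(1))  # ~0.56 C
    print(f_delta_to_c(2))  # ~1.11 C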


You don't even need decimals. -45x10^-1. There, fixed.
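That's basically fixed-point: store tenths of a degree as an integer and only put the decimal point back for display. A minimal sketch (names made up):

    # Fixed-point temperatures: keep deci-degrees C as an int,
    # so -4.5 C is stored as -45.
    def to_deci(celsius):
        return round(celsius * 10)

    def format_deci(deci):
        return f"{deci / 10:.1f} C"

    t = to_deci(-4.5)         # -45
    print(t, format_deci(t))  # -45 -4.5 C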



