1. Unlike position and velocity, which are relative (there is no given "origin" for them, no way to say where a thing is or how fast it's moving except relative to other things), rotation is absolute. A thing is either rotating or not, regardless of its relation to other things. Objects that rotate "experience (centrifugal) forces as a result" or "require (centripetal) forces to hold them together" depending on how you choose to describe it. This is detectable: hook two weights together with a newton-meter in space and the newton-meter will read non-zero when the assemblage is rotating, zero when not. The reading tells you how fast it is rotating regardless of any external reference point. (An equivalent device to detect position or velocity is not possible, but it is for acceleration. A rough numerical sketch of what the newton-meter reading gives you follows point 2 below.)
2. Yes, everything "at rest" on earth is in fact rotating at the rate the earth rotates. If you stand on the equator at midday and do not rotate you will be standing on your head at midnight.
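Coming back to the newton-meter in point 1, here's the rough sketch I mentioned (my own back-of-the-envelope illustration, nothing more): for two equal masses joined by a tether and spun about the midpoint, the tether tension is exactly the centripetal force on each mass, so the reading alone pins down the spin rate.

    import math

    # Two equal masses m on a tether of total length L, spinning about the midpoint.
    # Each mass moves in a circle of radius r = L/2, and the tether tension supplies
    # its centripetal force:  T = m * omega**2 * r   =>   omega = sqrt(T / (m * r))
    def spin_rate_from_tension(T_newtons, m_kg, L_metres):
        r = L_metres / 2.0
        return math.sqrt(T_newtons / (m_kg * r))  # rad/s, no external reference needed

    # e.g. 1 kg masses on a 2 m tether with the newton-meter reading 10 N:
    print(spin_rate_from_tension(10.0, 1.0, 2.0))  # ~3.16 rad/s

No position or velocity appears anywhere in that formula, which is the point: the reading depends only on the rotation itself.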
>no way to say where a thing is except relative to other things
This is always true. The origin is just a thing that other things are relative to. It's just as possible to define an origin in the real world as it is on a piece of graph paper.
Thanks for this explanation. If I understand correctly then, the moon requires some centripetal force in order not to dissipate due to its rotation whereas e.g. my head or the Eiffel Tower do not because they are not subject to absolute rotation.
Indeed. The Eiffel tower and your head do both have some (extremely small) centripetal force compensating for their rotation along with the earth.
(You can break that down in different ways, i.e. use various choices of generalised coordinates to describe it, so exactly what constitutes "centripetal", "centrifugal", "gravitational", "tidal", etc. forces depends on that. I'm being pretty vague in how I describe it. Regardless, rotation is absolute, or in other words the equations of physics take a different form in a rotating frame of reference than in a non-rotating one.)
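To put a rough number on "extremely small" (my own back-of-the-envelope figure, not something from elsewhere in the thread): at the equator, the centripetal acceleration needed to ride along with the earth's rotation is only about a third of a percent of g.

    import math

    sidereal_day = 86164.0            # seconds for one full rotation of the earth
    omega = 2 * math.pi / sidereal_day
    r_equator = 6.378e6               # metres, equatorial radius

    a_centripetal = omega**2 * r_equator
    print(a_centripetal)              # ~0.034 m/s^2
    print(a_centripetal / 9.81)       # ~0.0035, i.e. about 0.35% of g

So the Eiffel tower's rivets and your neck really are supplying a (tiny) net inward force.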
Thanks for the clarification I completely mistook what you were saying. This is the fascinating bit for me then, that what’s happening with the moon’s rotation is also happening with everything else
You've prompted it by giving it the learning sequence from the post you're replying to, which somebody who needs the tutorial wouldn't be able to specify, and it's replied with a bunch of bullets and lists that, as a person with general programming knowledge but almost no experience writing raytracing algorithms (i.e. presumably the target audience here), I find to have zero value for learning the subject.
Perplexing how different our perspectives are. I find this super useful for learning, especially since I can continue chatting about any and all of it.
> claim to have been trained for single-digit millions of dollars
Weren't these smaller models trained by distillation from larger ones, which therefore have to exist in order to do it? Are there examples of near state of the art foundation models being trained from scratch in low millions of dollars? (This is a genuine question, not arguing. I'm not knowledgeable in this area.)
Both of those were frontier models at the time of their release.
Another interesting number here is Claude 3.7 Sonnet, which many people (myself included) considered the best model for several months after its release and was apparently trained for "a few tens of millions of dollars": https://www.oneusefulthing.org/p/a-new-generation-of-ais-cla...
It would help if TV manufacturers would clearly document what these features do, and use consistent names that reflect that.
It seems they want to make these settings usable without specialist knowledge, but the end result of their opaque naming and vague descriptions is that anybody who actually cares about what they see and thinks they might benefit from some of the features has to either systematically try every possible combination of options or teach themselves video engineering and try to figure out for themselves what each one actually does.
This isn't unique to TVs. It's amazing really how much effort a company will put into adding a feature to a product only to completely negate any value it might have by assuming any attempt at clearly documenting it, even if buried deep in a manual, will cause their customers' brains to explode.
"Filmmaker mode" is the industry's attempt at this. On supported TVs it's just another picture mode (like vivid or standard), but it disables all the junk the other modes have enabled by default without wading though all the individual settings. I don't know how widely adopted it is though, but my LG OLED from 2020 has it.
The problem with filmmaker mode is I don't trust it more than other modes. It would take no effort at all for a TV maker to start fiddling with "filmmaker mode" to boost colors or something to "get an edge", then everyone does it, and we're back to where we started. I just turn them off and leave it that way. Companies have already proven time and again they'll make changes we don't like just because they can, so it's important to take every opportunity to prevent them even getting a chance.
"Filmmaker mode" is a trademark of the UHD Alliance, so if TV makers want to deviate from the spec they can't call it "Filmmaker mode" anymore. There's a few different TV makers in the UHD Alliance so there's an incentive for the spec to not have wiggle room that one member could exploit to the determent of the others.
It's true that Filmmaker Mode might at some point in the future be corrupted, but in the actual world of today, if you go to a TV and set it to Filmmaker Mode, it's going to move most things to correct settings, and all things to correct settings on at least some TVs.
(The trickiest thing is actually brightness. LG originally used to set brightness to 100 nits in Filmmaker Mode for SDR, which is correct dark room behavior -- but a lot of people aren't in dark rooms and want brighter screens, so they changed it to be significantly brighter. Defensible, but it now means that if you are in a dark room, you have to look up which brightness level is close to 100 nits.)
Game mode being latency-optimized really is the saving grace in a market segment where the big brands try to keep hardware cost as cheap as possible. Sure, you _could_ have a game mode that does all of the fancy processing closer to real-time, but now you can't use a bargain-basement CPU.
Yup, it's great, at least for live action content. I've found that for Anime, a small amount of motion interpolation is absolutely needed on my OLED, otherwise the content has horrible judder.
I always found that weird: anime relies on motion blur for smoothness when panning/scrolling, and motion interpolation works as an upgraded version of that... until it starts to interpolate actual animation
On my LG OLED I think it looks bad. Whites are off and I feel like the colours are squashed. Might be more accurate, but it's bad for me. I prefer to use standard, disable everything and put the white balance on neutral, neither cold nor warm.
I had just recently factory reset my samsung S90C QDOLED - and had to work through the annoying process of dialing the settings back to something sane and tasteful. Filmmaker mode only got it part of the way there. The white balance was still set to warm, and inexplicably HDR was static (ignoring the content 'hints'), and even then the contrast seemed off, and I had to set the dynamic contrast to 'low' (whatever that means) to keep everything from looking overly dark.
It makes me wish that there was something like an industry standard 'calibrated' mode that everyone could target - let all the other garbage features be a divergence from that. Hell, there probably is, but they'd never suggest a consumer use that and not all of their value-add tacky DSP.
"Warm" or "Warm 2" or "Warm 50" is the correct white point on most TVs. Yes, it would make sense if some "Neutral" setting was where they put the standards-compliant setting, but in practice nobody ever wants it to be warmer than D6500, and lots of people want it some degree of cooler, so they anchor the proper setting to the warm side of their adjustment.
When you say that "HDR is static" you probably mean that "Dynamic tone-mapping" was turned off. This is also correct behavior. Dynamic tone-mapping isn't about using content settings to do per-scene tone-mapping (that's HDR10+ or Dolby Vision, though Samsung doesn't support the latter), it's about just yoloing the image to be brighter and more vivid than it should be rather than sticking to the accurate rendering.
What you're discovering here is that the reason TV makers put these "garbage features" in is that a lot of people like a TV picture that's too vivid, too blue, too bright. If you set it to the true standard settings, people's first impression is that it looks bad, as yours was. (But if you live with it for a while, it'll quickly start to look good, and then when you look at a blown-out picture, it'll look gross.)
“Filmmaker Mode” on LG OLED was horrible. Yes, all of the “extra” features were off, but it was overly warm and unbalanced as hell. I either don’t understand “Filmmakers” or that mode is intended to be so bad that you will need to fix it yourself.
Filmmaker is warm because it follows the standardized D6500 whitepoint. But that's the monitor whitepoint it is mastered against, and how it's intended to be seen.
TV makers always set their sets to a much higher color temperature by default because blue tones show off colors better.
As a result of both that familiarity and the better saturation, most people don't like filmmaker when they try to use it at first. After a few weeks, though, you'll be wondering why you ever liked the oversaturated neons and severely off brightness curve of other modes.
The whites in Filmmaker Mode are not off. They'll look warm to you if you're used to the too-blue settings, but they're completely and measurably correct.
I'd suggest living with it for a while; if you do, you'll quickly get used to it, and then going to the "standard" (sic) setting will look too blue.
The problem is that comparing to all the monitors I have, specifically the one in my Lenovo Yoga OLED that is supposed to be very accurate, whites are very warm in filmmaker mode. What's that about?
Your monitor is probably set to the wrong settings for film content. Almost all monitors are set to a cool white point out of the box. If you're not producing film or color calibrated photography on your monitor, there is no standard white temperature for PC displays.
It means that the colors should be correct. The sky on tv should look like the sky. The grass on tv should look like grass. If I look at the screen and then I look outside, it should look the same. HDR screens and sensors are getting pretty close, but almost everyone is using color grading so the advantage is gone. And after colors, don't get me started about motion and the 24fps abomination.
> It means that the colors should be correct. The sky on tv should look like the sky. The grass on tv should look like grass.
It is not as clear cut as you think and is very much a gradient. I could send 10 different color gradings of the sky and grass to 10 different people and they could all say it looks “natural” to them, or a few would say it looks “off,” because our expectations of “natural” looks are not informed by any sort of objective rubric. Naturally if everyone says it’s off the common denominator is likely the colorist, but aside from that, the above generally holds. It’s why color grading with proper scopes and such is so important. You’re doing your best to meet the expectation for as many people as possible knowing that they will be looking on different devices, have different ideas of what a proper color is, are in different environments, etc. and ultimately you will still disappoint some folks. There are so many hardware factors at play stacked on top of an individual’s own expectations.
Even the color of the room you’re in or the color/intensity of the light in your peripheral vision will heavily influence how you perceive a color that is directly in front of you. Even if you walk around with a proper color reference chart checking everything it’s just always going to have a subjective element because you have your own opinion of what constitutes green grass.
In a way, this actually touches on a real issue. Instead of trying to please random people and make heuristics that work in arbitrary conditions, maybe start from the objective reality? I mean, for a start, take a picture, and then immediately compare it with the subject. If it looks identical then that's a good start. I haven't seen any device capable of doing this. Of course you would need the entire sensor-processing-screen chain to be calibrated for this.
Everything I talked about above applies even more so now that you’re trying to say “we’ll make a camera capture objective colors/reality.” That’s been a debate about cameras ever since the first images were taken. “The truth of the image.”
There is no such thing as the “correct” or “most natural” image. There is essentially no “true” image.
I completely agree. Theoretically you could capture and reproduce the entire spectrum for each pixel, but even that is not "true" because it is not the entire light field. But I still think that we can look at the picture on the phone in our hand and at the subject just in front of us, and try to make them as similar as possible to our senses? This looks to me like a big improvement to the current state of affairs. Then you can always say to a critic: I checked just as I took the picture/movie, and this is exactly how the sky/grass/subject looked.
Well, I know what you mean, color is complicated. BUT, I can look at a hundred skies and they look like sky. I will look at the sky on the tv, and it looks like sky on the tv, not like the real sky. And sky is probably easy to replicate, but if you take the grass or leaves, or human skin, then the tv becomes funny most of the time.
> I will look at the sky on the tv, and it looks like sky on the tv, not like the real sky.
Well for starters you’re viewing the real sky in 3D and your TV is a 2D medium. Truly that immediately changes your perception and drastically. TV looks like TV no matter what.
I'm sure part of it is so that marketing can say that their TV has new putz-tech smooth vibes AI 2.0, but honestly I also see this same thing happen with products aimed at technical people who would benefit from actually knowing what a particular feature or setting really is. Even in my own work on tools aimed at developers, non-technical stakeholders push really hard to dumb down and hide what things really are, believing that makes the tools easier to use, when really it just makes it more confusing for the users.
I don't think you are the target audience of the dumbed-down part; the people paying them for it are. They don't need detailed documentation on those things, so why make it?
> It would help if TV manufacturers would clearly document what these features do, and use consistent names that reflect that.
It would also help if there was a common, universal, perfect "reference TV" to aim for (or multiple such references for different use cases), with the job of the TV being to approximate this reference as closely as possible.
Alas, much like documenting the features, this would turn TVs into commodities, which is what consumers want, but TV vendors very much don't.
I wonder if there's a video equivalent to the Yamaha NS-10[1], a studio monitor (audio) that (simplifying) sounds bad enough that audio engineers reckon if they can make the mix sound good on them, they'll sound alright on just about anything.
Probably not, or they don't go by it, since there seems to be a massive problem with people being unable to hear dialogue well enough to not need subtitles.
It was a real eye(ear?)-opener to watch Seinfeld on Netflix and suddenly have no problem understanding what they're saying. They solved the problem before, they just ... unsolved it.
My favorite thing about Kodi is an audio setting that boosts the center channel. Since most speech comes through that, it generally just turns up the voices, and the music and sound effects stay at the same level. It's a godsend. Also another great reason to have a nice backup collection on a hard drive.
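In case anyone wonders what "boosting the center channel" cashes out to, here's a rough sketch of the idea (my own illustration with made-up gain values, not Kodi's actual code): a 5.1-to-stereo downmix where the centre channel, which carries most dialogue, gets more gain than the conventional ~0.707.

    import numpy as np

    def downmix_5_1_to_stereo(fl, fr, fc, sl, sr, centre_gain=1.4):
        """Downmix 5.1 channels to stereo with an adjustable centre (dialogue) boost.

        A conventional downmix uses roughly 0.707 for the centre and surrounds;
        raising centre_gain above that lifts voices relative to music and effects.
        (The LFE channel is simply dropped here, as many downmixes do.)
        """
        fl, fr, fc, sl, sr = (np.asarray(x, dtype=float) for x in (fl, fr, fc, sl, sr))
        left = fl + centre_gain * fc + 0.707 * sl
        right = fr + centre_gain * fc + 0.707 * sr
        peak = max(np.abs(left).max(), np.abs(right).max(), 1.0)  # avoid clipping after the boost
        return left / peak, right / peak

    # Usage: pass equal-length float sample arrays in [-1, 1] for each channel.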
It's a similar thing to watching movies from before the mid-2000s (I place the inflection point around Collateral in 2004): after that you get overly dark scenes where you can't make out anything, while in anything earlier the night scenes let you clearly make out the setting, and the focused actors/props are clearly visible.
Watch An American Werewolf in London, Strange Days, True Lies, Blade Runner, or any other movie from the film era right up to the start of digital, and you can see that the sets are incredibly well lit. On film they couldn't afford to reshoot and didn't have an immediate view of how everything in the frame turned out, so they had to be conservative. They didn't have per-pixel brightness manipulation (dodging and burning were printing techniques that could technically have been applied per frame, but good luck doing that at any reasonable expense or in any reasonable amount of time). They didn't have hyper-fast color film stock they could use (ISO 800 was about the fastest you could get), and it was a clear downgrade from anything slower.
The advent of digital film-making, once sensors reached ISO 1600/3200 with reasonable image quality, is when the allure of the time/cost savings of not lighting heavily for every scene reared its ugly head, and by the 2020s you get the "Netflix look" from studios optimizing for "the cheapest possible thing we can get out the door" (the most expensive thing in any production is filming on location; a producer will want to squeeze every minute of that away, with the smallest crew they can get away with).
Reference monitor pricing has never been anywhere near something mere mortals could afford. The price you gave of $21k for 55” is more than 50% off the $1k+ per inch average I’m used to seeing from Sony.
If you account for the wastage/insurance costs using standard freight carriers that seems reasonable to me as a proportion of value. I’m sure this is shipped insured, well packaged and on a pallet.
Walmart might be able to resell a damaged/open box $2k TV at a discount, but I don’t think that’s so easy for speciality calibrated equipment.
My local hummus factory puts the product destined for Costco into a different sized tub than the one destined for Walmart. Companies want to make it hard for the consumer to compare.
Costco’s whole thing is selling larger quantities, most times at a lower per unit price than other retailers such as Walmart. Walmart’s wholesale competitor to Costco is Sam’s Club. Also, Costco’s price labels always show the per unit price of the product (as do Walmart’s, in my experience).
Often a false economy. My MIL shops at Sam's Club, and ends up throwing half her food away because she cannot eat it all before it expires. I've told her that those dates often don't mean the food is instantly "bad" the next day but she refuses to touch anything that is "expired."
My wife is the same way - the "best by" date is just a date they put for best "freshness". "Sell by" date is similar. It's not about safety.
My wife grew up in a hot and humid climate where things went bad quickly, so this tendency doesn't come from nowhere. Her whole family now lives in the US midwest, and there are similar arguments between her siblings and their spouses.
The ones I’m talking about were only subtly different, like 22 oz vs 24 oz. To me it was obvious what they were doing, shoppers couldn’t compare same-size units and they could have more freedom with prices.
There is no federal law requiring unit pricing, but NIST has guidelines that most grocery stores follow voluntarily. Nine states have adopted the guidelines as law.
I don't think that's correct. Prices for retail goods aren't usually even attached to the product in interstate commerce, and are shown locally on store shelving.
These exist, typically made by Panasonic or Sony, and cost upwards of 20k USD. HDTVtest has compared them to the top OLED consumer tvs in the past. Film studios use the reference models for their editing and mastering work.
Sony specifically targets the reference with their final calibration on their top TVs, assuming you are in Cinema or Dolby Vision mode, or whatever they call it this year.
There is! That is precisely how TVs work! Specs like BT.2020 and BT.2100 define the color primaries, white point, and how colors and brightness levels should be represented. Other specs define other elements of the signal. SMPTE ST 2080 defines what the mastering environment should be, which is where you get the recommendations for bias lighting.
This is all out there -- but consumers DO NOT want it, because in a back-to-back comparison, they believe they want (as you'll see in other messages in this thread) displays that are over-bright, over-blue, over-saturated, and over-contrasty. And so that's what they get.
But if you want a perfect reference TV, that's what Filmmaker Mode is for, if you've got a TV maker that's even trying.
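As a small illustration of how specific those specs get (the coefficients below are from the standards; the snippet itself is just my sketch): even the weights used to derive luma from RGB are pinned down exactly, and they differ between BT.709 and BT.2020.

    # Luma weights fixed by the standards, not tunable per vendor:
    #   BT.709:  Y' = 0.2126 R' + 0.7152 G' + 0.0722 B'
    #   BT.2020: Y' = 0.2627 R' + 0.6780 G' + 0.0593 B'
    def luma_bt709(r, g, b):
        return 0.2126 * r + 0.7152 * g + 0.0722 * b

    def luma_bt2020(r, g, b):
        return 0.2627 * r + 0.6780 * g + 0.0593 * b

    # The same nominal pure green contributes different luma under each spec:
    print(luma_bt709(0.0, 1.0, 0.0), luma_bt2020(0.0, 1.0, 0.0))  # 0.7152 vs 0.678

That kind of precision runs all the way through the specs mentioned above, down to white point and transfer function.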
The purpose of the naming is generally to overwhelm consumers and drive long term repeat buys. You can’t remember if your tv has the fitzbuzz, but you’re damn sure this fancy new tv in the store looks a hell of a lot better than your current tv, and they’re really pushing this fitzbuzz thing.
Cynically, I think it's a bit, just a little, to do with how we handle manuals today.
It wasn't that long ago that the manual spelled out everything in enough detail that a kid could understand it, absorb it, and decide he was going to dive in on his own and end up in the industry. I wouldn't have broken or created nearly as much without it.
But, a few things challenged the norm. For many, many reasons, manuals became less about the specification and more about the functionality. Then they became even more simplified, because of the need to translate it into thirty different languages automatically. And even smaller, to discourage people from blaming the company rather than themselves, by never admitting anything in the manual.
What I would do for a return to fault repair guides [0].
Another factor is the increased importance of the software part of the product, and how that changes via updates that can make a manual outdated. Or at least a printed manual: if they're pushing updates up to product launch it might not match what a customer gets straight out of the box, or any later production runs where new firmware is included. It would be somewhat mitigated if there were an onus to keep online/downloadable manuals updated alongside the software. I know my motherboard BIOS no longer matches the manual, but even then most descriptions are so simple they do nothing more than list the options with no explanation.
Going a level deeper, more information can be gleaned from how closely modern technology mimics kids' toys that don't require manuals.
A punch card machine certainly requires specs, and would not be confused with a toy.
A server rack, same, but the manuals are pieced out and specific, with details being lost.
You’ll notice anything with dangerous implications naturally wards off tampering near natively.
Desktop and laptop computers depending on sharp edges and design language, whether they use a touch screen. Almost kids toys, manual now in collective common sense for most.
Tablet, colorful case, basically a toy. Ask how many people using one can write bit transition diagrams for or/and, let alone xor.
We’ve drifted far away from where we started. Part of me feels like the youth are losing their childhoods earlier and earlier as our technology becomes easier to use. Being cynical of course.
That doesn't preclude clearly documenting what the feature does somewhere in the manual or online. People who either don't care or don't have the mental capacity to understand it won't read it. People who care a lot, such as specialist reviewers or your competitors, will figure it out anyway. I don't see any downside to adding the documentation for the benefit of paying customers who want to make an informed choice about when to use the feature, even in this cynical world view.
Why let a consumer educate themselves as easily as possible when it’s more profitable to deter that behaviour and keep you confused? Especially when some of the tech is entirely false (iirc about a decade ago, TVs were advertised as ‘360hz’ which was not related to the refresh rates).
I’m with you personally, but the companies that sell TVs are not.
I don't particularly like that, but even so, it doesn't preclude having a "standard" or "no enhancement" option, even if it's not the default.
On my TCL TV I can turn off "smart" image and a bunch of other crap, and there's a "standard" image mode. But I'm not convinced that's actually "as close to reference as the panel can get". One reason is that there is noticeable input lag when connected to a pc, whereas if I switch it to "pc", the lag is basically gone, but the image looks different. So I have no idea which is the "standard" one.
Ironically, when I first turned it on, all the "smart" things were off.
I'm not certain this is true. TVs have become so ludicrously inexpensive that it seems the only criteria consumers shop for are a bigger screen and a lower price.
TVs are on their way to free, and are thoroughly enshittified. The consumer is the product, so compliance with consumer preferences is going to plummet. They don't care if you know what you want; you're going to get what they provide.
They want a spy device in your house, recording and sending screenshots and audio clips to their servers, providing hooks into every piece of media you consume, allowing them a detailed profile of you and your household. By purchasing the device, you're agreeing to waiving any and all expectations of privacy.
Your best bet is to get a projector, or spend thousands to get an actual dumb display. TVs are a lost cause - they've discovered how to exploit users and there's no going back.
I just went through this learning curve with my new Sony Bravia 8 II.
I also auditioned the LG G5.
I calibrated both of them. It is not that much effort after some research on avsforum.com. I think this task would be fairly trivial for the hackernews crowd.
Agreed. And I’m not going to flip my TV’s mode every time I watch a new show. I need something that does a good job on average, where I can set it and forget it.
The 4th edition of Linear Algebra Done Right has a much improved approach to determinants themselves (still relegated to the end, where it should be). From the list of improvements:
> New Chapter 9 on multilinear algebra, including bilinear forms, quadratic forms, multilinear forms, and tensor products. Determinants now are defined using a basis-free approach via alternating multilinear forms.
The basis-free definition is really rather lovely.
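For anyone curious, the flavour of that definition (my paraphrase, sketched from memory rather than quoted from the book) fits in a few lines:

    % Let dim V = n.  The space of alternating n-linear forms on V is 1-dimensional.
    % Given T in L(V) and any nonzero alternating n-linear form \alpha, the map
    %   (v_1, \dots, v_n) \mapsto \alpha(T v_1, \dots, T v_n)
    % is again an alternating n-linear form, hence a scalar multiple of \alpha:
    \alpha(T v_1, \dots, T v_n) = (\det T)\,\alpha(v_1, \dots, v_n)
    % That scalar does not depend on the choice of \alpha, and it is taken as the
    % definition of \det T: no basis, no cofactor expansion in sight.

Multiplicativity and "invertible iff det ≠ 0" then fall out with very little work.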
"In 1193 (1981.), I submitted my first article [...] and in 1197 (1987.), I became a member"
Seems obviously wrong, or is that yet another dozenal notation, where what looks like the digit three is really a one? Because it should have been real easy to avoid mistakes like that for an entire decade by just remembering that 1190 = 1980 decimal (next time the decades and dozen-years align like that will be in 2040).
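For anyone who wants to check the arithmetic, a quick sketch (my own; it assumes ordinary 0-9 digits plus two extra for ten and eleven, which is what Python's base-12 parsing uses):

    # Python's int() parses base 12 out of the box (digits 0-9 then a, b).
    print(int("1193", 12))   # 1983: so "1193 (1981)" doesn't add up
    print(int("1197", 12))   # 1987: this one is internally consistent
    print(int("1191", 12))   # 1981: what 1981 decimal actually is in dozenal
    print(int("1190", 12))   # 1980: the decade/dozen-year alignment mentioned above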
> as it is inconsistent in language usage to write differently than to speak. we don’t speak big sounds, that’s why we don't write them either.
Of all fatuous nonsenses I've heard from design "geniuses" over the years, that might take the prize.
We don't look at spoken words, we listen to them. We add audible prosody (both pauses and intonation changes, in particular) to segment our speech. If we were to optimise our spoken language for lip-readers, we might very well choose to add some extra visible segmentation to compensate for the intonation being mostly undetectable.
You could validly claim that capital letters are superfluous given the presence of full stops (and I would disagree and we could debate that), but this argument that capital letters are bad because we don't speak "big sounds" is absurd.
Perhaps you could elevate the discussion by providing an actual argument against this view of string theory, which has indeed percolated through social media?
For my part, I know a little bit more than "shit" about physics but I know very little about string theory and know better than to have strongly held opinions about things I don't understand. I've heard quite a lot about the criticisms and would like to hear a defense of it.
I'd rather my government control the narrative my children are exposed to than Andrew Tate.
Edit: To expand, this is not just a flippant remark. People ignore Andrew Tate because he's so obviously, cartoonishly awful, but they are not the audience. It's aimed at children, and from personal experience its effect on a large number of them worldwide is profound, to the extent that I worry about the long term, generational effect.
Children will be exposed to narratives one way or another, and to want to (re)assert some control over that isn't necessarily just an authoritarian power play.
The targets to control are not children. They don't need to be controlled, from an intelligence point of view. Government's attention is not infinite, and between worries of losing power and worries about the wellbeing of children, one of the two is the winner, and it's not the children. If children's well-being was the priority, you would see other stuff being made.
This sort of makes sense if our governments are, on the whole, 'better' than Andrew Tate, for some definition of 'better'. But as the slide goes on there will be a tipping point where our governments are worse, meaning their surveilling me becomes problematic. Better to shout about it now than then.
Do you decline any responsibility in the moral upbringing of your children? I think you should be the one that decides how they interact with dubious content, not your government.
Counterpoint: Andrew Tate resonates with the younger generations because modern society (at least in the UK) appears to be an ever-growing middle finger to them and Tate promises a (fake, but believable) way out.
When your future looks like endless toil just so you can give half of the fruits of your labor to subsidize senile politicians/their friends (via taxes) and the other half to subsidize boomers (via rent), Tate's messaging and whatever get-rich-quick scheme he's currently hawking sounds appealing.
You can ban Tate but without solving the reason behind why people look up to him it's just a matter of time before another grifter takes his place.