Think back to Maya history, when the rulers kept astronomy knowledge secret to pretend they were gods with control over celestial objects. If expensive education and publishing access gives someone power, and free education and publishing becomes a threat to their authority, that's not a good testament to how they used their education advantage while they had it.
AI may be destroying truth by creating collective schizophrenia and backing different people's delusions, by being trained to create rage-bait for clicks, or by introducing vulnerabilities into critical software and hardware infrastructure. But if institutions feel threatened, their best bet is to become higher levels of abstraction, or to dig deeper into where they are truly needed - providing transparency and research into every angle, weakness, and abuse of AI models, then surfacing how to make education more reliably scalable when AI is used.
Nevada protects founders from shareholder lawsuits. So if someone defrauds or intends to defraud shareholders - they are more likely to prefer Nevada. To be fair, a lot of things can turn into a shareholder lawsuit in Delaware.
How do you do objective research without a data pipeline? Social media companies can use user privacy as an excuse to not share feeds that influence users. The first step to fixing the wrongs is transparency, but there are no incentives for big tech to enable that.
The election is a state at a moment in time. Before the election, to affect that state, propaganda machines target whatever is lower on the Maslow hierarchy of needs for the voters - pesky tariffs are a tiny issue compared to that boogeyman who wants to target your children. Someone's freedom is a pesky thing compared to that immigrant boogeyman going after your retirement savings. Once people have voted because they are scared for their life savings and their children, the elected can do whatever they want and target whomever they want with impunity for several years. Especially if they start building their own militia and threatening the judiciary.
This authoritarian model has proven very successful for anyone Putin and his apparatus have installed anywhere. Now it may be franchised even further.
It's depressingly funny too when you ask these people if they've ever been directly affected by said boogeyman: they'll say no, but they know someone who has. Meanwhile, you can ask them about healthcare, local government, and other matters that affect their daily life, and they'll swear the trans-immigrant boogeyman du jour has far more effect on their lives.
The article says the change is implemented by telling brands they don't need an Amazon barcode if they have a product barcode, while resellers still need an Amazon barcode. What happens if resellers decide to just not add the Amazon barcode and appear as brands?
"Annoying" is a thing of the past. Look at the evolution of ads and content placement. With social media advertising being pushed to trigger massive anxiety and societal schizophrenia on some topics, imagine what can be done with personalized AI (especially if the buyers are well-funded politicians, or state-backed malicious actors vying for territory, real estate, or natural resources where you live - the highest-margin opportunities).
At first, in retail, you had billboards and shelf space. The lower the quality of your product's ingredients (say, syrup bottled with soda water), the higher your margin, and the more you could afford to buy out shelf space in retail chains and keep any higher-quality competition out. Then you would use some of the extra profits to buy national ads, and you'd become a top holding for the biggest investors. That was the low-tech flywheel.
In the search engine world, the billboards were the margin-eating auction-based ad prices, and the shelf space became SEO on increasingly diluted and disinformative content filling the shelf-space side. In video advertising, rage-bait and conspiracy theories try to eat up the time available for top users.
AI advertising, if done right, can be useful, but the industry that asks for it intentionally asks for obtrusive and attention-hogging, not useful. The goal is always to push people to generate demand, not to sit there until they need something. Thus the repetition, the psychological experiments, the emotional warfare (surfacing or creating perceived deficiencies, then selling the cure). Now, if you understand that the parties funding AI expansion are not Procter & Gamble-level commercial entities but state and sovereign investors, you can forecast what the main use cases may be and how they will be approached. Especially if natural resources are becoming more profitable than consumer demand.
Those look like the monitors used in the F1 movie, which is strange, considering it was an Apple production and they maybe should have used Apple monitors for product placement. I guess it is a testament to Kuycon from Apple.
You should look at pictures of Apple's Pro Display XDR. The Kuycon monitor is an obvious rip-off of that in terms of styling, especially the ventilation on the back.
If one person thinks this way, many more do. This is typical in large organizations, especially government institutions, because the expense of running entire teams at massive cost for no reason is not borne by the team but by someone with a much larger budget who has more money than care, or completely wrong incentives (the more people I manage, the more important I am, type of orgs). This is organizational gangrene described from the inside, and partly how and why it happens. If you are leading an organization and reading this - figure out how to measure and prevent it.
Humans think this way. This isn't a cultural thing, it's human nature. We like positive people and dislike negative people. Ignoring the fact that political capital is a thing won't make it go away.
The goal is not to ignore human nature, but to build better tools for orgs to get feedback and act on it before it corrodes them from the inside. Government is the biggest of them all - fix this and maybe you can create government that works for you, instead of blowing taxpayer dollars like a leaky bucket. Humans in an organization are like cells or organs in a body. Every country, team, and organization iterates on a proper nervous system for its body.
imo it's a cultural thing specific to organizations that are raking in money, as many tech companies are. The less actual competitive pressure there is, the more everyone is pressured to just shut up and take their cut. Whether the cut is more or less than it could be is less important than just not rocking the boat.
Whereas if real existential need is on the line then people are incentivized to give a shit about the outcome more.
Tech is so rich in general that the norm is to just shut up and enjoy your upper-middle-class existence instead of caring about the details. After all, if this company blows up, there's another one on the way that will take most of you.
Not that this excludes the same behavior in industries that are less lucrative. There's cultural inertia to contend with, plus loads of other effects. But I have noticed that this attitude seems to spontaneously arise whenever a place is sufficiently cushy.
Also, this take doesn't (on its own) recommend one strategy or the other. Maybe it makes the most sense to go along with things, or to fight them for personal reasons uncorrelated with the economic ones. But it's good, I think, to recognize that the impulse is somewhat biased by the risk-reward calculation of a rich workplace. Basically, it's coupled to a sort of privilege.
There is a gap between thinking and action. I think social media, gaming, and online stimulation, currently designed to bombard and drain your thinking brain, leave nothing for the action you and your body need to take. Your brain only has so much chemistry to trigger neural activation, and we are blowing it on mental stress to the point where the body doesn't have any mental energy left to tackle real-world stress or handle real-world emotions.
Try an A/B test. Do days with zero screen stimuli - no TV, no phones, no online interaction. Go into the world to a cafe, or a common area with people and do stuff. See how you feel and what you feel up to. Vacations might be good and relaxing because you disconnect. Maybe do it without paying for it.
Make your own. Your mental health loves you when you come up with stuff. Humanity has generated so much content and none of us have the lifetime to consume it all, but that shouldn't stop you from making your own any chance you get.
Make music - you don't need an instrument if you can whistle.
Make stories - just say them to a recorder or your kids or write them down.
Make food experiments - nothing will please your taste-buds more than listening to them and iterating on ways to get better.
Make your own apps or experiences - with AI or by hand, your ideas may be surprising and worthwhile.
Knit or make your own clothes, toys, wearable tech.
Design your own 3D objects and maybe print them or animate them.
We know that workouts lead to endorphins for the body, but the brain version of that is not only enjoyable - it can sometimes also be scaled to be enjoyed by other humans. Don't go through life without trying your own things.