That code is not guaranteed to work. Declaring memset_v as volatile means that the variable has to be read, but does not imply that the function must be called; the compiler is free to compile the function call as "tmp = memset_v; if (tmp != memset) tmp(...)" relying on its knowledge that in the likely case of equality the call can be optimized away.
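For reference, the idiom under discussion looks roughly like this (a sketch; `scrub` is a made-up wrapper name, not from the original code):

```c
#include <string.h>

/* Volatile function pointer to memset: the volatile qualifier forces the
   pointer itself to be re-read at each use, but the C standard does not
   promise that the call made through it survives optimization. */
static void *(*volatile memset_v)(void *, int, size_t) = memset;

/* Hypothetical wrapper using the idiom. */
void scrub(void *buf, size_t len) {
    memset_v(buf, 0, len);
}
```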
Whilst the C standard doesn't guarantee it, both LLVM and GCC _do_. They document, as an implementation guarantee, that it will work, so they are not free to optimise it away.
The C committee gave you memset_explicit. But note that there is still no guarantee that information cannot leak. This is generally a very hard problem, as information can leak in many different ways once it has been copied by the compiler. Fully memory-safe languages (so "Safe Rust", but not necessarily real-world Rust) would offer a bit more protection by default, but then there are still side-channel issues.
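Where C23's memset_explicit (or Annex K's memset_s) isn't available, a common portable fallback writes bytes through a volatile-qualified pointer. This is a sketch (`secure_zero` is an illustrative name), and even it only keeps the stores from being deleted; it says nothing about other copies of the data:

```c
#include <stddef.h>

/* Illustrative fallback: under C's volatile rules these byte stores are
   observable side effects the optimizer may not delete. This does not
   address copies of the secret elsewhere (registers, spills, caches). */
static void secure_zero(void *p, size_t n) {
    volatile unsigned char *vp = (volatile unsigned char *)p;
    while (n--)
        *vp++ = 0;
}
```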
Because, for the 1384th time, they're pretending they can ignore what the programmer explicitly told them to do
Creating memset_explicit won't fix existing code. "Oh but what if maybe" is just cope.
If I do memset then free then that's what I want to do
And the way things go I won't be surprised if they break memset_explicit for some other BS reason and then make you use memset_explicit_you_really_mean_it_this_time
Your problem is not the C committee but your lack of understanding of how optimizing compilers work. WG14 could, of course, specify that a compiler has to do exactly what you tell it to do. And in fact, every compiler supports this already: in most cases even by default! Just do not turn on optimization. But this is not what most people want.
Once you accept that optimizing compilers do, well, optimizations, the question is what should be allowed and what not. Inlining "memset" and eliminating dead stores are both simply optimizations which people generally want.
If you want a store not to be eliminated by a compiler, you can make it volatile. The C standard says this cannot be deleted by optimizations. The criticism of this was that later undefined behavior could "undo" it by "travelling in time". We made it clear in ISO C23 that this is not allowed (and I believe it never was) - against protests from some compiler folks. Compilers still do not fully conform to this, which shows the limited power WG14 has to change reality.
> Once you accept that optimizing compilers do, well, optimizations
Why in tarnation is it optimizing out a write through a pointer right before a function call that takes said pointer? Imagine it is any other function besides free; see how ridiculous that sounds?
It's been many years since C compilers started making pathological-but-technically-justifiable optimizations that work against the programmer. The problem is the vast sea of "undefined behavior" — if you are not a fully qualified language lawyer versed in every nook and cranny of the C standard, prepare to be surprised.
Many of us who don't like working under such conditions have just moved on to other languages.
Because it is a dead store. Removing dead stores does not sound ridiculous to me, nor to anybody who has used an optimizing compiler in the last decades.
The whole point of the optimizer is that it can detect inefficiencies by treating every statement as some combination of simple, fundamental operations. The compiler is not seeing "call memset() on pointer to heap", it's seeing "write of variable size" just before "deallocation". For some, optimizing that will be a problem, for others, not optimizing it will leave performance on the table.
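Concretely, the pattern at issue reduces to something like the following (a hypothetical cleanup routine, not code from the thread):

```c
#include <stdlib.h>
#include <string.h>

/* From the optimizer's point of view, the memset is a write of variable
   size immediately followed by deallocation of that same storage, i.e. a
   removable dead store. */
void discard_secret(char *secret, size_t len) {
    memset(secret, 0, len);  /* may be eliminated at -O1 and above */
    free(secret);
}
```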
There are still ways to obtain the desired behavior. Just put a call to a DLL or SO that implements what you need. The compiler cannot inspect the behavior of functions across module boundaries, so it cannot tell whether removing the call preserves semantics (the external function might, say, write the buffer's contents to a file), so it will not remove the call.
A modern compiler may also completely remove malloc / free pairs and move the computation to the stack. And I do not see what this has to do with C, it should be the same for most languages. C gives you tools to express low-level intent such as "volatile", but one has to use them.
I think the CSS support for that has finally landed, though it means targeting a pseudo-element instead. It's been a year, so support is probably good enough that you don't care if just the animation doesn't happen.
Note that the transition to `auto` in that post relies on `interpolate-size`, which has yet to land in Firefox or Safari, and neither shows movement right now.
Isn't accessibility outside of the scope of Wayland, whose purpose is to composite application buffers, and deliver input events?
Something like a screen reader needs to talk to an app and query the toolkit for the contents of a window in a semantic way - that's a toolkit feature not a compositor one.
What does this matter for blind people who want to use Linux?
All that matters to them is that it's super complicated and nobody wants to work with the tech to make screen readers work on Wayland.
To my knowledge, X11 didn't offer a comprehensive accessibility API either - there's no Linux equivalent of stuff like MS Active Accessibility or MS UIA.
Even back then, Qt, GTK and everyone else offered their own API, and screen readers needed to integrate with every single one. This didn't really change under Wayland; only the sandboxing makes certain operations harder. But the accessibility story on Linux is not great, and never was.
The standard was Extended Window Manager Hints [0].
Above X11, implemented by GTK and everyone else. Right.
However... Wayland makes it impossible to implement EWMH. Which means the entire EWMH standard needs to be tossed, and everyone needs to make something new.
You can't even get the title of a window, under Wayland. That's private to that process tree.
Wayland requires accessibility to be implemented at the application level, not in the window manager. And that's guaranteed to make it always broken for a majority of use cases.
Parts of AT-SPI are impossible to implement under Wayland.
> Wayland has no concept of global coordinates or global key bindings. The protocol itself is designed around atomicity which is a nice concept, but is fundamentally in conflict to the need of assistive technologies to control the entire state of the desktop globally. As such, atspi methods like get_accessible_at_point are impossible in Wayland.
In RHEL I would never touch the system Python at all; I'd install whatever version I needed in a venv and configure any software I installed to use it. I learned the hard way never to mess with the system Python.
It's just the first two results from the top of Google.
Maybe the tool was improved in version 3.0, I'm running an older 2.x version. I will check it next time.
The versions I used had problems with:
- applying font sizes
- random loss/reset of settings
- the preview when editing
- font preview before selection
etc.
Both of those are from over a year ago? Going forward, I wouldn't consider that the "top" of any discussion.
The strange font sizes and settings resets were mostly fixed as part of the massive 2020 refactor [0]. There are still some minor inconsistencies between the two font editor panels, but they're being worked on.
Thankfully, you shouldn't have had any random setting changes since around the 2018 build.
The virus-infested computers caused by scam versions of Neopets are not dissimilar to Windows today.
Live internet popups you didn't ask for, live tracking of everything you do, new buttons suddenly appearing in every toolbar. All of it slowing down your machine.
Stardew Valley. Runs on everything, not just "viable" OSs, made by a single person, and easily competes with an entire genre of gaming to pay the author.
The upper bound of building a business on top of Stardew Valley appears to be https://www.patreon.com/pathoschild which makes under $400 per month after Patreon's cut. That's not enough to work as a single person full time let alone hiring a team.
A $400/mo Patreon does not exactly outweigh somewhere between 18-35 million sales on a single one of the platforms it supports. I would not call that the "upper bound".
That 18-35 million goes to the game's developer and probably not even a single penny goes to those building a business off of designing content on top of the game. That figure is irrelevant.
Taking Fortnite as an example, the relevant figure would be that creators on Fortnite can make over $10 million per year. Bringing up that Epic made a few billion dollars is irrelevant to what this conversation is about, which is games where it is financially viable to build content for them.
> which is games where it is financially viable to build content for them.
If that was your interpretation, then it would have been better to have mentioned it anywhere upthread. What we have, so far, is people talking about the gaming industry, and you calling it a monopoly. Nowhere before do we have a mention of third-party developers.
I got tripped up because the parent comment used the term "Fortnite store" when I think they meant "Epic Games Store", so I didn't make clear that the monopolization I was talking about was in monetizing content built on top of an existing game.
Wow. I guess Steam must be bankrupt and surviving off just four games. And I guess Epic and Steam just don't compete. And itch and GoG are just irrelevant with no market impact.
Sorry for the sarcasm, but gaming is not at a "choose between these two" level of monopolisation. And indies just won game-of-the-year awards! Things are just not monopolised.