Hacker News | ashikns's comments

I discovered it today and I'm in love! Thank you for maintaining a piece of simple joy for more than a decade <3

Because a novel is about creative output, whereas engineering is about understanding a lot of rules and requirements and then writing logic to satisfy them. The latter has a much more explicitly defined output.

Said another way, a novel is about the experience of reading every word of the implementation, whereas software is allowed to be a black box; the functional output is all that matters. No one is reading the assembly, for example.

We’re moving into a world where suboptimal code doesn’t matter that much because it’s so cheap to produce.


The lesson of UML is that software engineering is not a process of refining rules and requirements into logic. Software engineering is lucrative because it very much is a creative process.

Yeah, then you have the choice not to buy the locked-down hardware; you don't have a right to get open hardware FROM Google.

Of course there are no good options for open hardware, but that is a related but separate problem.


It's not a separate problem. Google is actively suppressing any possibility of open mobile hardware. They force HW manufacturers to keep their specs secret and make them choose between their ecosystem and any other, not both. There's a humongous conflict of interest, and they're abusing their dominant position.


> They force HW manufacturers to keep their specs secret

Spoken like someone who has never worked with any hardware manufacturer. They do not need reasons for that. They all believe their mundane shit is the most secret-worthy shit ever. They have always done this. This predates Google and will outlive it.


Often it is because they don't know their own devices. We got a dev board from Qualcomm once and the documentation was totally bogus.


Regulating this is the way to keep general computing from dying to fuel Google and Apple's profits.

People should have the right to run whatever software they like on the computing hardware they own. They should have the right to repair it.

The alternative is that everything ends up like smart TVs, where the options are "buy spyware-ridden crap" or "don't have a TV".


Given how antitrust is not really working right now, I would say this is debatable. Also, monopolies in the past were forced to do various things in order to keep their status for longer.


I worked on a similar system. The raw data from the field first goes to a cloud-hosted event queue of some sort, then to a database, then back to whatever app/screen is on the field. The data doesn't just power on-field displays; there are a lot of websites, etc., that need to pull data from an API.


This is the exact attitude that keeps people away from Linux. The moment someone points out practical problems with Linux, its users get all defensive and elitist about it. Sigh. If just this changed, more people would use Linux.


I feel the same. For now, I've made peace with having to switch to "whatever is the latest maintained fork with privacy defaults" every 6 months. Hopefully Ladybird becomes a usable browser sometime soon.


This is what I have struggled to understand about Zig. In terms of mental model it seems pretty much like C: you are responsible for everything. It's slightly better than C, but C already runs on everything on the planet and can be made secure, even if painfully so. So what niche is Zig aiming to fill?


No, null pointers are enforced safe at the type level in Zig, as are array bounds; this eliminates huge classes of errors, so you are not "responsible for everything". Unlike C, you often (unless highly tuned performance is needed) do not have to resort to opaque void pointers, and the compiler gives you type safety there, closing another major footgun from C.

Also, operators and integer types are unambiguous, and there is no UB in safe compilation modes.

It's arguably much better than C, not "slightly better than C".
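
A minimal sketch of those two guarantees, assuming a recent Zig (0.11+ syntax); the names here are illustrative, not from any real codebase:

    const std = @import("std");

    // ?usize is an optional: the compiler refuses to let you use the
    // value until you unwrap it, so a forgotten null check is a
    // compile error rather than a crash.
    fn find(haystack: []const u8, needle: u8) ?usize {
        for (haystack, 0..) |c, i| {
            if (c == needle) return i;
        }
        return null;
    }

    pub fn main() void {
        const s = "hello";
        if (find(s, 'l')) |i| {
            std.debug.print("found at {d}\n", .{i});
        }
        // Slices carry their length: indexing past s.len panics in
        // Debug/ReleaseSafe builds instead of silently reading out
        // of bounds.
    }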


If you write a modern style of C, you can have bounds-checked code and do not need to use void pointers. I usually find that people overestimate the advantages of newer languages over C by comparing them to old, badly written C.


GP: "you are responsible for everything"

AKA: you are responsible for opting into "modern C". To be unsafe in Zig (in the dimensions I mentioned), you must opt out.


You would need to opt in to Zig first, which is more effort than opting into modern C when you come from C.


I suggest you show this thread to a neutral third party and ask them whether your line of argument makes sense.


I used Zig for my projects for a while but moved back to C for similar reasons. C23, with GCC extensions, and using a MISRA-like coding style where it makes sense, provides a similar experience to Zig but with seamless C interop. You don't have comptime, but my biggest lesson from my time in Zig is that I usually want to pass a vtable instead of monomorphizing over input types.
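
For concreteness, the vtable pattern in question, sketched in Zig since that's where the lesson came from; it's the same shape std.mem.Allocator uses, and it ports to C as a plain struct of function pointers. All names here are illustrative:

    const std = @import("std");

    // One compiled greet() serves every implementation chosen at
    // runtime, instead of one monomorphized copy per writer type.
    const Writer = struct {
        ctx: *anyopaque,
        writeFn: *const fn (ctx: *anyopaque, msg: []const u8) void,

        fn write(self: Writer, msg: []const u8) void {
            self.writeFn(self.ctx, msg);
        }
    };

    // One concrete implementation: counts bytes through the erased
    // ctx pointer while printing.
    fn countingWrite(ctx: *anyopaque, msg: []const u8) void {
        const count: *usize = @ptrCast(@alignCast(ctx));
        count.* += msg.len;
        std.debug.print("{s}", .{msg});
    }

    fn greet(w: Writer) void {
        w.write("hello\n");
    }

    pub fn main() void {
        var written: usize = 0;
        greet(.{ .ctx = &written, .writeFn = countingWrite });
        std.debug.print("{d} bytes written\n", .{written});
    }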


Considering that basically nobody uses Zig, I think most neutral third parties seem to agree.


Zig detects memory leaks pretty well when you build it using -Doptimize=Debug.
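
For anyone curious, a minimal sketch of what that looks like in practice, assuming the std.heap.GeneralPurposeAllocator of Zig 0.11-ish (newer releases rename it DebugAllocator):

    const std = @import("std");

    pub fn main() void {
        // In Debug builds this allocator records every allocation and
        // prints the leaked ones, with stack traces, when deinit() runs.
        var gpa = std.heap.GeneralPurposeAllocator(.{}){};
        defer _ = gpa.deinit(); // the leak report happens here

        const alloc = gpa.allocator();
        const buf = alloc.alloc(u8, 64) catch return;
        _ = buf; // never freed: a Debug build reports this leak at exit
    }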


I've been wanting to do this! The plan was to modify the Bazzite DX variant's build script, but Fedora as the base was ultimately a deal breaker for me. With KDE Linux this might finally be a dream come true.


It seems like KDE Linux uses a different way of providing a system image than ostree on Fedora Silverblue, so I have no idea how easy it is to make changes on top of it.

But for Bazzite (and other Universal Blue distros), you're better off using BlueBuild:

https://blue-build.org/

In the end it's an OCI container image, so you could technically just have a Dockerfile with "FROM bazzite:whatever" at the top, but BlueBuild automates the small stuff that you need to do on top of it and lets you split your config across files.
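
For illustration, the bare-Dockerfile version might look something like this; the image ref and package name are placeholders, not tested values:

    # Containerfile
    FROM ghcr.io/ublue-os/bazzite:stable

    # Layer your changes on top of the base image.
    RUN rpm-ostree install htop && \
        ostree container commit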

You can have a look at my repository to see how easy it is!

https://github.com/LelouBil/Leloublue


I'm on a 3080, and it uses 1 GB of VRAM and 22% utilization. Sure, it's still not lightweight, but certainly not as bad as what you seem to be experiencing.


Perhaps it depends on other factors like screen resolution and scaling.


probably. I’ve got a 4K monitor with a 1050 Ti and the moment I open the site, GPU usage jumps from 1% to 99% and the fans go wild.


Yeah, and in the real world people from different countries with vastly different economic backgrounds compete on the same stage. I think video games are fine.

