
I'm pretty sure quantum mechanics already forgoes conventional causality. Attosecond interactions take place in such narrow slices of time that the uncertainty principle turns everything into a blur where events can't be described linearly. In other words, the math sometimes requires that effect precedes cause. As far as we can tell, causality and conservation of energy are only preserved on a macroscopic scale. (IANAQP, but I'm going off my recollections of books by people who are.)

Einstein laid the theoretical foundations for lasers in 1917, and it took over 40 years of "impractical" scientific work before the first functioning laser was built. It took decades more for them to become a cheap, ubiquitous technological building-block. The research is still continuing, and there's no reason to assume it will eventually stop bearing fruit (for the societies that haven't decimated their scientific workforce, anyways). Look at the insanity required to design and build the EUV lasers in ASML's machines, which were used to fabricate the CPU I'm using right now, over a century after Einstein first scribbled down those obscure equations!

I sincerely wonder how someone that is unaware of any of this finds their way onto HN, but at the same time it is an educational opportunity. 'nothing practical' indeed...

In addition, lasers were long believed to be a scientific novelty without any real world use.

Thanks for sharing. Just curious, is there any way to perform globbing over a list of path-like strings instead of only directly on the filesystem?

In case someone doesn't know, the standard function for that is called fnmatch:

https://man7.org/linux/man-pages/man3/fnmatch.3.html
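
For anyone curious, a minimal sketch of looping fnmatch over a list of path-like strings in C (the paths and pattern here are made up purely for illustration):

    #include <fnmatch.h>
    #include <stdio.h>

    int main(void) {
        /* Hypothetical list of path-like strings to match against. */
        const char *paths[] = { "src/main.c", "src/util.h", "docs/readme.md" };
        const char *pattern = "src/*.c";

        for (size_t i = 0; i < sizeof paths / sizeof paths[0]; i++) {
            /* FNM_PATHNAME makes '*' stop at '/' like shell globbing does. */
            if (fnmatch(pattern, paths[i], FNM_PATHNAME) == 0)
                printf("match: %s\n", paths[i]);
        }
        return 0;
    }

You just call it once per string; there's no filesystem access involved.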


That's one of the reasons I built zlob. It literally has an endpoint to do this, but if you are talking about glibc there are two options:

1. The fnmatch function, which is not ideal because it doesn't take into account all the path-specific optimizations and does not support BRACE.

2. The ALTDIRFUNC flag for globbing, which lets you simulate a file system, which absolutely sucks.

In zlob you can simply call zlob_match_paths(<pattern>, <list>, flags, ptr),

where list would be either a C string or Rust/Zig-like slices.


As I understand it, compiling each source file separately and linking together the result was historically kind of a hack too, or at least a compromise, because early unix machines didn't have enough memory to compile the whole program at once (or even just hold multiple source files in memory at a time). Although later on, doing it this way did allow for faster recompilation because you didn't need to re-ingest source files that hadn't been changed (although this stopped being true for template-heavy C++ code).

> quite explicit furry art, is very common on bluesky

Now there's an understatement. It's bloody impossible to get rid of. People here are sneering at all the political content but they're ignoring the curvaceous elephant in the room. I think maybe bsky has improved things now, but a while back their adult content filters were not up to the task. When I first made an account I almost gave up on it because until I got all the right filter words set up it was nothing but weird porn whac-a-mole (actually that's probably a poor choice of words...)


I absolutely do not understand moving "report spam" under "report misleading". The UX for this is terrible. There are lots of bots posting SEO junk, at a rate and scale that definitely wastes resources, and now bsky has interfered with one of the signals it should be using to combat the problem.

Maybe they're paying bsky. Occam's razor.

Occam's razor takes you to a bribery conspiracy?

Money or incompetence (inclusive or).

yes. bribery is extremely common and simple.

Have you consumed any media in the last few decades?

I designed the electronics for a heavy-duty industrial 3D printer and used a 555 in the failsafe circuit (alongside the manual e-stop). If it didn't get reset by a heartbeat from the embedded computer/software, it would unpower the heaters and actuators.

That's the only use for one that isn't (always) a design smell - it makes a really nice missing pulse detector, better than you can easily do with comparators. But if you have the budget, a purpose-made watchdog chip or a tiny microcontroller really can make a better watchdog.

I would absolutely not want to use a microcontroller or a complicated chip for something like that. Simplicity is the point.

Supervisor chips are not complicated. In some ways simpler than a homebrew analog watchdog, and the good ones will handle failure modes a simple watchdog won't, like those that result in an oscillating output.

Yes, a simple purpose-made chip designed to be used in safety-critical situations, with high tolerances for voltage etc, would probably be better. Although one thing the 555 design has going for it is that a seasoned EE could take one look at the physical circuit and know exactly what it does.

But I would never trust anything that ran software for something like this.


It depends on the system's potential failure modes and what's required by your safety standard, not on one engineer's opinion of what's "best".

Modern 32-bit microcontrollers are cheaper than 555s.

Yes, but they use software, so you have another level of added complexity that may or may not be desirable.

The chips themselves add a bunch of new failure states to consider beyond software bugs, too. Maybe a bad wire or component puts too much load on the microcontroller's wee internal pin drivers and they melt into a permanent "on" state. Or a voltage fluctuation browns out the chip on boot, partially randomizing its RAM or registers. Or the chip manufacturer fixes some errata or discontinues a particular part number and now a pin you've left floating has become a hardware heisenbug. Or the wrong bit flips in your EEPROM after being in a hot machine for a few years. Suddenly a boring 555 looks pretty good. (Keep in mind, we're talking about "turn off heater after pulses stop", not "abort launch sequence if tank 3 pressure low". The latter is way above my pay grade.)

For every task you could also use a 555 timer for (with its attendant analog support complexity), you are talking about tens of lines of user code at most.

Even if you had to do everything directly with registers, the amount of C or Rust here is minuscule.
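
For example, a rough sketch of that kind of heartbeat watchdog in C (bare-metal style; the timeout value, ISR hooks, and GPIO helper are hypothetical placeholders, not from any real part):

    #include <stdbool.h>
    #include <stdint.h>

    #define TIMEOUT_MS 500  /* assumed heartbeat timeout, purely illustrative */

    static volatile uint32_t ms_since_heartbeat;

    /* Hooked to a 1 ms hardware timer interrupt on a real part. */
    void timer_1ms_isr(void) { ms_since_heartbeat++; }

    /* Hooked to an edge interrupt on the heartbeat input pin. */
    void heartbeat_isr(void) { ms_since_heartbeat = 0; }

    /* Stand-in for writing the heater-enable GPIO register. */
    static void set_heater_enable(bool on) { (void)on; }

    int main(void) {
        /* Clock, pin, and interrupt setup omitted; it depends on the MCU. */
        for (;;) {
            set_heater_enable(ms_since_heartbeat < TIMEOUT_MS);
        }
    }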


There's the guy who's never shipped and supported a product.

Not if you go to the cheap "Asian brands" like you're thinking of with micros; plus, with your cheap micro you'll need a reset controller. And budget isn't all BOM cost.

Who provisions dedicated reset monitors on $0.06 MCUs?

My memory that far back is hazy but I seem to recall being able to do full-page zoom in Opera circa 2003.

If I download the image, Fedora KDE shows it properly in Dolphin and Gwenview.


Yes! I don't want a car with an "innovative" way of steering. I don't want a huge amount of creativity to go into how my light switches work. I don't want shoes that "reinvent" walking for me (whatever the marketing tagline might say).

Some stuff has been solved. A massive number of annoyances in my daily life are due to people un-solving problems with more or less standardized solutions due to perverse economic incentives.


> I don't want a car with an "innovative" way of steering.

99.5% agree, because I would love to try SAAB's drive-by-wire concept from 1992: https://www.saabplanet.com/saab-9000-drive-by-wire-1992/


The reason this was only a research project and never went into mass production was regulatory, IIRC? (Most EU countries still require, to this day, a "physical connection between steering wheel and wheels" in their traffic regulations.)


This was a few years before Sweden joined the EU, but yes, I think the lack of a physical connection was one of the main problems.

From what I've read the test drivers also thought the car was too difficult to drive, with the joystick being too reactive. I wonder how much of that could be solved today with modern software and stability control tech.

I can't find it now, but I do remember a similar prototype with mechanical wires (not electrical) that was supposed to solve the regulatory requirements. That joystick looked more like a cyclic control from a helicopter.


Having played enough video games that use joysticks for steering, I don't want to drive a real car with a joystick. Crashing in Mario Kart or Grand Theft Auto because I sneezed is fine, but not in real life.


Exactly. The control needs to have both an intentional and major motor movement from the driver. Modern steering wheels have the same benefit as the original iPod wheel. Easy for small movements, even accidental ones; possible for big movements.

Also funny that they had the ability to swap to the passenger to drive it. So acceleration/brake for one person, steering for another? Really not a good idea.


I think there's a ton of innovation left to be done regarding steering and light switches.

You're right that it's not going to be better designs, but paradigm shifts.

We still don't know what it means to provide input to a mostly self-driving car. It hasn't been solved and people continue to complain about attention fatigue and anxiety. Is the driving position really optimal for that? Are accident fatalities reduced if the driver is sitting somewhere else? Even lane assist still sucks on traditionally designed cars. Is having to fight a motorized wheel to override steering really all that safe?

Light switches may be reliable and never go away, but we have many well-established everyday examples of automatic lights: door switches, motion sensing, proximity sensing, etc. You never think about it and that's the point.


> Yes! I don't want a car with an "innovative" way of steering.

You might, but you'll never really know.

I mean, steering wheels themselves were once novel inventions. Before those there were "tillers" (essentially a rod with a handle)[0], and before those: reins, to pull the front in the direction you want.

[0]: https://en.wikipedia.org/wiki/Benz_Patent-Motorwagen


I highly doubt there's a steering input device so superior to the current wheel shape that it's worth throwing out the existing standard. Yes, at one point how steering should work (or how you should navigate the Web) was uncertain, but eventually everyone settled on something that worked well enough that it was no longer worthwhile to mess with it.

Although, one thought I had is that there's nothing wrong with experimenting with non-standard interfaces as long as you still have the option to just buy, say, a Toyota with a standard steering wheel instead of 3D Moebius Steering or whatever. The problem is when the biggest manufacturers keep forcing changes by top-down worldwide fiat, forcing customers to either grin and bear it or quit driving (or using the Web) entirely.


I sympathise with the frustration, but I think the issue isn't innovation itself: it's that we've lost the ability to distinguish between solving actual problems and just making things different.

Take mobile interfaces. When touchscreens arrived, we genuinely needed new patterns. A mouse pointer paradigm on a 3.5" screen with fat fingers simply doesn't work. Swipe gestures, pull-down menus, bottom navigation—these emerged because the constraints demanded it, not because someone thought "wouldn't it be novel if..."

The problem now is that innovation has become cargo-culted. Companies innovate because they think they should, not because they've identified a genuine problem. Every app wants its own navigation paradigm, its own gesture language, its own idea of where the back button lives. That's not innovation, that's just noise.

However, I'd have to push back on the car analogy: steering wheels were an innovation over tillers, and a crucial one. Tillers gave you poor mechanical advantage and required constant two-handed attention. The steering wheel solved real problems: better control, one-handed operation, more space for passengers. It succeeded because it was genuinely better, and then it standardised because there was no reason to keep experimenting.

The web needs more of that approach: innovate when there's a genuine problem, then standardise when you've found something that works. The issue isn't innovation, it's the perverse incentive to differentiate for its own sake.


Leaving aside the externalities of constantly breaking everyone's workflow and potentially introducing disastrous bugs, there's an opportunity cost to innovating where there isn't a clear need. Google and others are wasting massive resources endlessly tweaking browsers and the Web because that's all they know how to do, their users are locked in and without recourse, and they don't feel threatened by any competitors or upstarts. I would argue the web and smartphones and similar tech are boring now, but because the market is controlled by only a few huge companies, the tech hasn't been allowed to become low-margin, standardized, cookie-cutter commodities. Instead, these attempts to make this old, boring tech seem exciting are getting to the point where it's sad and comical.


Your last paragraph reminded me of HTML5 and the WHATWG which led to official W3C adoption.

Back when that started, the W3C was still deeply embedded in the XML hellhole.


You need to be careful here, because we have a real tendency to get stuck in local maxima with technology. For instance, the QWERTY keyboard layout exists to prevent typewriter keys from jamming, but we're stuck with it because it's the "standardized solution" and you can't really buy a non-QWERTY keyboard without getting into the enthusiast market.

I do agree changing things for the sake of change isn't a good thing, but we should also be wary of being stuck in a rut.


I agree with you, but I'm completely aware that the point you're making is the same point that's causing the problem.

"Stuck in a rut" is a matter of perspective. A good marketer can make even the most established best practice be perceived as a "rut", that's the first step of selling someone something: convince them they have a problem.

It's easy to get a non-QWERTY keyboard. I'm typing on a split ortholinear one now. I'm sure we agree it would not be productive for society if 99% of regular QWERTY keyboards deviated a little in search of that new innovation that will turn their company into the next Xerox or Hoover or Google. People need some stability to learn how to make the most of new features.

Technology evolves in cycles: there's a boom of innovation and mass adoption, which inevitably levels out into stabilisation and maturity. It's probably time for browser vendors to accept it's time to transition into stability and maturity. The cost of not doing that is that things like adblockers, noscript, justthebrowser, etc. will gain popularity and remove any anti-consumer innovations they try. Maybe they'll get to a position where they realise their "innovative" features are being disabled by so many users that it makes sense to shift dev spending to maintenance and improvement of existing features, instead of "innovation".


> For instance, the QWERTY keyboard layout exists to prevent typewriter keys from jamming, but we're stuck with it because it's the "standardized solution" and you can't really buy a non-QWERTY keyboard without getting into the enthusiast market.

So, we are "stuck" with something that apparently works fine for most people, and when it doesn't, there is the option to use something else?

Not sure if that's a great example

Sometimes good enough is just good enough


> the QWERTY keyboard layout exists to prevent typewriter keys from jamming

Even if it is true (is it a myth, by any chance?), it does not mean that alternatives are better at, say, typing speed.


As someone who makes my own keyboard firmware, 100% agree. For most people, typing speed isn't a bottleneck. There is a whole community of people that type faster than 250wpm on custom, chording-enabled keyboards. The tradeoff is that it takes years to relearn how to type. It's the same as being a stenographer at that point. It's not worth it for most people.

Even if there was a new layout that did suddenly allow everyone to type twice as fast, what would we get with that? Maybe twice as many social media posts, but nothing actually useful.


I'd imagine at this point that most social media posts are done by swiping or tapping a phone's virtual keyboard (if one is used at all).


One doesn't need to be a scientist to take a look at one's own hands and fingers to see that they are not crooked to the left. An ortholinear keyboard would be objectively better, even with the same keymap as QWERTY, but we don't produce those for the masses, for a variety of reasons. Same with many other ideas.


> to see that they are not crooked to the left

How does that make ortholinear keyboards better?


If I recall correctly, QWERTY was designed to minimize jamming. The myth is that it was designed to slow people down.

Whether it does slow people down, as a side effect, is not as well established since, as another person pointed out, typing speed isn't the bottleneck for most people. Learning the layout and figuring out what to write is. On top of that, most of the claims for faster layouts come from marketing materials. It doesn't mean they are wrong, but there is a vested interest.

If there was a demonstrably much faster input method for most users, I suspect it would have been adopted long ago.


It's been debunked by both research (no such mention at the time) and practice on extant machines.


These days QWERTY keyboards are optimal because programs, programming languages and text formats are optimized for QWERTY keyboards.


Depends on the language, no? QWERTY isn't great for APL.


I have a QWERTZ keyboard!

Is my digital life at a natural end now?


If you mean the default German keyboard layout then, yes, putting backslashes, braces and brackets behind AltGr makes it sub-optimal in my book. Thankfully what's printed on the keys is not that important, so you too can have a QWERTY keyboard if you want.

