WhyNotHugo's comments | Hacker News

I’ve been using usbmuxd+ifuse to copy the photo files straight from the phone. No need to wait for an upload/download to some remote server, just a direct cable from the phone to my computer. I get the original files, and can even move (instead of copy) to clear up the phone.
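
A rough sketch of that workflow (assuming usbmuxd is running, ifuse is installed and the phone is already paired; the mount point and destination paths are made up):

    use std::fs;
    use std::path::Path;
    use std::process::Command;

    fn main() -> std::io::Result<()> {
        let mount = Path::new("/tmp/iphone");
        let dest = Path::new("/home/me/Pictures/iphone");
        fs::create_dir_all(mount)?;
        fs::create_dir_all(dest)?;

        // ifuse (talking to usbmuxd) exposes the phone's media partition,
        // including DCIM, as a FUSE filesystem at the mount point.
        assert!(Command::new("ifuse").arg(mount).status()?.success());

        // Copy every photo off the phone, then delete the original to free space.
        for sub in fs::read_dir(mount.join("DCIM"))? {
            let sub = sub?.path();
            if !sub.is_dir() {
                continue;
            }
            for entry in fs::read_dir(&sub)? {
                let entry = entry?;
                if entry.file_type()?.is_file() {
                    fs::copy(entry.path(), dest.join(entry.file_name()))?;
                    fs::remove_file(entry.path())?;
                }
            }
        }

        Command::new("fusermount").arg("-u").arg(mount).status()?;
        Ok(())
    }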

This works on any iPhone? It mounts the non-privileged DCIM folder or whatever over USB to somewhere on your filesystem? With write access?


I use a lot of short-lived terminals. I have zsh+foot configured so that ctrl+shift+n opens a new terminal with the same current directory, so when using Vim, that's as fast as putting Vim in background, but I can tile both windows easily.

I never have more than one or two dozen terminals at a time, but I definitely open hundreds of short-lived ones.


Why? You've said that you do, but not why. Why not leave a terminal open?

Why would I leave it open once I'm done with the task for which I opened the terminal?

You'll enjoy "Altered Carbon", which focuses (partially) on this topic: if we get rid of death, then the worst of the aristocracy never dies.

+1000 - Altered Carbon season 1 is amazing. IMHO commit to watching it 3x to catch everything going on - after the first watch, everyone's like "that was amazing, but I'm not sure what I just watched." It's just so rich - The Matrix is 136 mins, this is 570 mins, with that much more depth.

Also Greg Egan's Permutation City covers these topics in a different way.

Being closed-source isn't just an ideological issue; it brings about a lot of practical issues. E.g.: distributions aren't going to package it, so users need to download the tarball and install it manually. They'll also need to update it manually (unless it includes some dedicated update service?).

Then, integration with the OS will be weird. If you're distributing binaries, you can't dynamically link system dependencies (they are either bundled or statically linked). Any distribution-specific patches and fixes will be missing. AFAIK the default path for the CA bundle varies per distribution; I'm not even sure how you'd handle that kind of thing. I'm sure there are hundreds of subtle little details like that one.

The audience ends up being Linux users, who are fine with proprietary software, have time and patience for manually configuring and maintaining a browser installation, and are also fine with an absence of proper OS integration.

I think Steam is the only popular piece of proprietary software on Linux, and it basically ships an entire userspace runtime and barely integrates with the OS at all.


> The sticking point like always will be media playback (read: DRM/widevine). That is the graveyard where Linux browsers go to die.

On Firefox, you can disable DRM in about:config. Forks such as Librewolf and Tor Browser disable DRM by default.


I’ve seen dedicated hardware devices which scan a QR-like code and show this on a little screen of their own. The bank provides them and does not require any app.

I only know of a single bank using this.


> I only know of a single bank using this.

If it's not Crédit Mutuel then you now know of a second bank using this method.


I am interested too. My fallback bank trapped me (or my courage to resist); the fallback of the fallback would be crypto, but I am not sure I want to depend on that either...

Meanwhile, the last hope is that people will use more cash (if the digital world gets too hostile - oh wait, it already is!).


This does sound like a situation where the employer should provide you with the phone.

Indeed. Never spend your own money on work-related expenses. If your job requires a phone, they need to provide one.

When working on pimsync[1] and the underlying WebDAV/CalDAV/CardDAV implementation in libdav, I wrote "live tests" early on. These are integration tests which use real servers (radicale, xandikos, nextcloud, cyrus, etc). They do things like "create an event, update the event, fetch it, validate it was updated". Some tests handle exotic encoding edge cases, or try to modify something with a bogus "If-Match" header. All these tests were extremely useful for validating actual behaviour, in large part because the RFCs are pretty complex and easy to misinterpret. For anyone working in this field, I strongly suggest having extensive and easy-to-execute integration tests with multiple servers (or clients).
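
As a rough sketch of what one of these live tests does, here's the same create/update/fetch flow written against the raw protocol with the reqwest crate (this is not how libdav's tests actually look; the URL, credentials and UID are made up, and some servers legitimately omit the ETag on PUT):

    // Needs the reqwest crate with its "blocking" feature enabled.
    use reqwest::blocking::{Client, RequestBuilder};

    fn ics(uid: &str, summary: &str) -> String {
        format!(
            "BEGIN:VCALENDAR\r\nVERSION:2.0\r\nPRODID:-//live-test//EN\r\n\
             BEGIN:VEVENT\r\nUID:{uid}\r\nDTSTART:20250101T100000Z\r\n\
             SUMMARY:{summary}\r\nEND:VEVENT\r\nEND:VCALENDAR\r\n"
        )
    }

    fn main() -> Result<(), Box<dyn std::error::Error>> {
        let uid = "live-test-1";
        // e.g. a throw-away local radicale instance
        let url = format!("http://localhost:5232/user/calendar/{uid}.ics");
        let client = Client::new();
        let auth = |r: RequestBuilder| r.basic_auth("user", Some("password"));

        // Create: "If-None-Match: *" means "fail rather than overwrite".
        let resp = auth(client.put(url.as_str()))
            .header("Content-Type", "text/calendar")
            .header("If-None-Match", "*")
            .body(ics(uid, "original"))
            .send()?;
        assert!(resp.status().is_success(), "create failed: {}", resp.status());
        let etag = resp
            .headers()
            .get("ETag")
            .and_then(|v| v.to_str().ok())
            .map(str::to_owned);

        // Update, conditional on the ETag from the create (if the server returned one).
        let mut update = auth(client.put(url.as_str()))
            .header("Content-Type", "text/calendar")
            .body(ics(uid, "updated"));
        if let Some(etag) = &etag {
            update = update.header("If-Match", etag.as_str());
        }
        assert!(update.send()?.status().is_success());

        // An update with a bogus If-Match must be rejected with 412 Precondition Failed.
        let resp = auth(client.put(url.as_str()))
            .header("Content-Type", "text/calendar")
            .header("If-Match", "\"bogus\"")
            .body(ics(uid, "must not apply"))
            .send()?;
        assert_eq!(resp.status().as_u16(), 412);

        // Fetch the event back and check that the update actually stuck.
        let body = auth(client.get(url.as_str())).send()?.text()?;
        assert!(body.contains("SUMMARY:updated"));
        Ok(())
    }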

All servers have quirks, so each test is marked as "fails on xandikos" or "fails on nextcloud". There's a single test which fails on all the test servers (related to encoding). Trying to figure out why this test failed drove me absolutely crazy, until I finally understood that all implementations were broken in the same subtle way. Even excluding that particular test, every server fails at least one other test. So each server is broken in some subtle way. Typically edge cases, of course.

By far, however, the worst offender is Apple's implementation. It seems that their CalDAV server has a sort of "eventual consistency" model: you can create a calendar, and then query the list of calendars… and the response indicates that the calendar doesn't exist! It usually takes a few seconds for calendars to show up, but this makes automated testing an absolute nightmare.
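
A hypothetical workaround in a test harness is to poll until the newly created calendar becomes visible, or give up after a deadline (list_calendars here just stands in for whatever query of the calendar list the client performs):

    use std::thread::sleep;
    use std::time::{Duration, Instant};

    /// Poll `list_calendars` until it reports `name`, or give up after `timeout`.
    fn wait_for_calendar(
        list_calendars: impl Fn() -> Vec<String>,
        name: &str,
        timeout: Duration,
    ) -> bool {
        let deadline = Instant::now() + timeout;
        while Instant::now() < deadline {
            if list_calendars().iter().any(|c| c.as_str() == name) {
                return true;
            }
            sleep(Duration::from_secs(1));
        }
        false
    }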

[1]: https://pimsync.whynothugo.nl/


Which server was the most compliant? I have been using Radicale for a while, but would like to know if that is not a good choice.

> Nearly all systems have the ability to call libraries written in C. This is not true of other implementation languages.

This is no longer true. Rust, Zig, and likely others satisfy this requirement.

> Safe languages usually want to abort if they encounter an out-of-memory (OOM) situation. SQLite is designed to recover gracefully from an OOM. It is unclear how this could be accomplished in the current crop of safe languages.

This is a major annoyance in the Rust stdlib. Too many interfaces can panic (and not just in the case of an OOM), and some of them don't even document this.
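
As an illustration of the contrast: Vec::push aborts the whole process if allocation fails, while try_reserve reports the failure as a value the caller can recover from (a minimal sketch):

    use std::collections::TryReserveError;

    // push/extend abort the whole process if allocation fails; try_reserve
    // instead reports the failure as an error the caller can handle.
    fn append_all(dst: &mut Vec<u8>, src: &[u8]) -> Result<(), TryReserveError> {
        dst.try_reserve(src.len())?; // fallible allocation
        dst.extend_from_slice(src);  // guaranteed not to reallocate now
        Ok(())
    }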


> This is no longer true. Rust, Zig, and likely others satisfy this requirement.

Rust and Zig satisfy this by being C ABI-compatible when explicitly requested. I'm pretty sure that solution is not actually what the author meant. When you don't explicitly use `extern "C"` in Rust or `export` in Zig, the ABI is completely undefined and undocumented. I would go as far as arguing that the ABI problem is, with Rust at least, even worse than the ABI problem C++ has. Especially since distros are (if memory serves) starting to ship Rust crates...
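
Concretely, only items explicitly marked like this get a stable, C-compatible ABI; everything else uses Rust's unstable internal ABI (a minimal sketch):

    // C prototype: int32_t add(int32_t a, int32_t b);
    // Only this explicit opt-in makes the symbol callable from C
    // (or from anything else that can call C).
    #[no_mangle]
    pub extern "C" fn add(a: i32, b: i32) -> i32 {
        a + b
    }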


The only reasonable way to solve that problem is to design a standard ABI that can express all the features of modern compiled languages like Rust, Go, Zig, Haskell, Ada, etc. It doesn't make any sense to design a stable ABI for each language separately, because then you'll end up with a system directory full of dynamic libraries using several distinct ABIs.

Sure, a shared model does make sense in many ways. We could share within a family, a neighbourhood cooperative, or at similar scales, with the users co-owning the means of processing.

But the current model is that we all rent from organisations that use their position of power to restrict and dictate what we can do with those machines.

