It's not. Regulation (EC) No 1370/2007[^1] states in its annex, which deals with compensation where a public operator runs both subsidised public services and commercial, for-profit activities, that:
>In order to increase transparency and avoid cross-subsidies, where a public service operator not only operates compensated services subject to public transport service obligations, but also engages in other activities, the accounts of the said public services must be separated so as to meet at least the following conditions: [...]
Another topic is: should France be allowed to keep the TGV monopoly in their own country because they need it to finance the rest of their network, while being allowed to operate abroad (as in Spain), taking business away from Renfe through the very free-market competition they try to block at home?
BT was deep into preparations for a nationwide fiber rollout at the time of privatisation in the early 80s. The project was cancelled, and the fledgling factories' equipment and expertise were instead exported to South Korea, enabling their widespread fiber penetration.
That delayed fiber rollout in the UK by decades.
Was that a success? Or could it be they were too early to justify the cost?
But without someone pushing ahead, who develops the technology?
It was that kind of delusional decision making that justified the privatization in the first place. Rolling out fiber nationwide in a world where the web had only just been invented and everywhere else in the world was connected via modems would have been catastrophically expensive and supplied bandwidth nobody would have been able to use. The idea the internet could have skipped straight from 33kbaud to fiber speeds is idiotic to anyone who remembers the state of the internet at that time. Most servers were not connected to the internet via fiber.
Unwittingly a brilliant demonstration of how short-term capitalistic behaviour hamstrings society. Japan, Korea and Hong Kong are way ahead of the UK and much of Europe precisely because of the lack of insight and vision reflected in your post.
Which is a means to an end, not an end in and of itself... The UAE has higher median home bandwidth than the USA, but nobody is claiming the UAE has somehow uniquely benefited from that.
No they aren't. I don't buy the fastest internet available to me because it costs a lot more and none of my computers would be able to use it without special upgrades. If I did for some reason upgrade, nothing would get faster because bandwidth isn't the bottleneck for anything I do, latency is (and latency is mostly server side). Most people are in that situation in the country where I live. FTTH can deliver more bandwidth than there is demand for.
I mean, come on. This is HN. Aren't we supposed to be engineers here? Increasing bandwidth only matters if some useful activity is constrained by the lack of it. You can't just say more of X is always a good thing, for any X. Ignoring tradeoffs is the mistake that creates leftism, but there are always tradeoffs.
> Rolling out fiber nationwide [...] would have been catastrophically expensive
It seems like they had managed to bring the cost below copper:
> In 1986, I managed to get fibre to the home cheaper than copper
> we had two factories, one in Ipswich and one in Birmingham
But the British government was concerned:
> BT's rapid and extensive rollout of fibre optic broadband was anti-competitive and held a monopoly on a technology and service that no other telecom company could do
> So the decision was made to close down the local loop roll out and in 1991 that roll out was stopped. The two factories that BT had built to build fibre related components were sold to Fujitsu and HP
This might be an argument for privatisation, because the government was still in full control of the company when they prevented the fibre rollout. Would the owner of a private company squander such an advantage over concerns for their competitors?
On the other hand, would a private company have had the capability to plan this forward in the first place? We do see that from Big Tech companies (e.g. Apple silicon) but could BT have done it under private ownership?
Hmm, I didn't realise that's why the rollout was stopped. (Which doesn't really make sense as a reason anyway, because wouldn't the fibre just have become part of Openreach?)
So I think new technology is generated through research funding, either public sector (universities) or R&D depts in private companies (today companies like Apple, previously old school companies like pre-Welch GE). I guess BT was more like the latter, except state owned.
In telecoms there's also a universal service obligation, which does not make economic sense when driven purely by profit motive. The cost of rolling out fiber to a small village will probably never be recouped. That's why FTTH with Virgin Broadband was only available in cities for a long time, and expensive.
In the US where telecoms have regulatory capture, and no public access telecom network, you see stories of rural communities trying to fund their own infra. It's expensive.
Cost of rollout and universal service can be helped by rolling out at scale, building the factories, reducing unit price etc.
So all this together....
I think private companies _can_ have the foresight to do this kind of forward planning...
But a big nationwide rollout of a public good? Where is their financial incentive? They would provide an environment for the acceleration of future commerce and technological development. But if they don't make money from it, why would they?
> wouldn't the fibre just have become part of openreach anyway
The issue was the subsidies. The fiber plan wasn't going to be profitable then or maybe ever, so it was dependent on tax funding that competitors wouldn't have access to. BT had to become an economically rational company which meant tossing not only the fiber stuff but around half their employees too. Building fiber and then giving it to OpenReach wouldn't have helped BT become competitive.
Planning ahead wasn't the issue. The issue was rationality, or economics if you want to call it that. You don't want to build out infrastructure way in advance of demand, it's just bad engineering and would have imposed huge costs on the already crippled British economy for no gain. In the period we're talking about home computers don't speak TCP/IP at all, there is no web, the highest bandwidth users of the internet are email / IRC, hardly anyone is selling internet access to homes and the biggest sites have uplinks of a few megabits/sec at most. What exactly are they going to connect to over all this fibre? Online video wouldn't become practical for decades, machines of the time couldn't even begin to handle it even with infinite bandwidth.
No, Cochrane was an idiot, exactly the type of central planner privatization was good for getting rid of. Look at what he's saying.
> In 1986, I managed to get fibre to the home cheaper than copper
According to a guy who hadn't done it. How do you make building out an entirely new physical network cheaper than using the existing one? What was this magic trick he found that let him snap his fingers and instantly replace all the wires in the ground? That claim just wasn't true, was it.
Lots of things in that interview were very wrong. "In 1974 it was patently obvious that copper wire was unsuitable for digital communication in any form". The first patent for what became ADSL was filed in 1979. Internet access was rolled out across the existing copper network successfully for decades after that. His engineering skills were "obviously" not that great because DSL isn't an unintuitive idea, it just runs data at different frequencies on the same copper lines as voice. There were high bandwidth copper links in the 1970s already. There's lots of details involved to make it work well to consumer residences, but the concept is simple enough. He didn't research that possibility first because being a nationalized monopoly meant there was no downside to just playing with the coolest tech whilst ignoring economic rationality. He had no history of running a profit-making business, he'd spent his entire career in nationalized monopolies.
You can see the problem here:
> "It's like everything else in the electronics world, if you make one laptop, it costs billions; if you make billions of laptops it costs a few quid".
Since when do laptops cost a few quid? And the costs of FTTH aren't dominated by the cost of fiber and switches, WTF. The cost is all in the manual labor of rebuilding the physical network and upgrading the homes themselves. You can't manufacture your way to a cheap nationwide fiber network.
This kind of economic illiteracy is exactly what brought the British economy to the edge of total collapse in the 70s and caused voters to bring Thatcher in three times in a row (and maybe they'd have gone for a fourth if the Tories had let them).
I'm not sure you know the subject very well. By 1986 Cochrane had already overseen fibre installation in the long lines network - leading to a dramatic drop in maintenance and staffing costs - and had installed fibre to his home. He had a clear understanding of the tech and the costs involved. You don't end up on as big a list of boards and directorships as he has without being savvy with both tech and economics.
Your faith in the British establishment is touching but misplaced. Being given a long series of fake (sorry, "visiting") academic roles and board seats is exactly how you'd expect them to protect a former government official.
He wasn't savvy with tech. I just demonstrated that. He claimed it was "patently obvious" you couldn't deliver data over copper just a few years before the tech to do it was invented.
He wasn't savvy with economics either. BT was a basket case when he was running it, which is why they had to lay off over 100,000 employees the moment they were privatized. And again, he claimed you could make FTTH cheap by scaling up cable manufacturing, which is not only economically nonsensical but is obviously nonsense to anyone who thinks about it for a moment.
The moment BT had to actually deliver value matching the prices they charged, they could no longer justify FTTH nor could anyone else, because it was irrational. That's why the project got cancelled. Stop trying to force a left wing narrative where none fits: if FTTH was such a great idea in the 1980s other companies would have done it. None did, because it was the wrong call.
The private market made the exact same choice in the 90s, but instead of just being something that cost maybe more tax money than it should have, it was hyped up and full of lies and marketing and bullshit, burned tons of cash on really stupid projects, and caused a serious recession.
So..... not really a point for privatization here.
The 90s were dominated by modems and around the turn of the millennium DSL connections. Fiber internet (to the home) didn't start until much later. The backbone was getting fiber deployed before then, but there was never some kind of unusual bandwidth crunch in the UK. The private sector managed the rollout just fine.
I don't quite understand this post. Wouldn't rolling out fiber infrastructure early have proved visionary and made the UK a serious technical force?
In Australia, we went through a similar journey where fiber to everyone's home was planned and then politically destroyed. Except this happened in 2010 and has been a significant factor in our inability to retain a technical edge.
No. The idea that more bandwidth to the home=generically futuristic economy and outperformance is exactly the kind of bad central planning that makes socialist countries poor! It's the kind of mistake privatization was intended to fix (and did). The Soviets made the same mistake decades earlier when they overbuilt steel mills.
The USA is the world leader in computing and many parts still have notoriously poor bandwidth to the home today. The link between home fiber and economic performance is very weak.
Bandwidth upgrades need to consider the whole equation, including cost of infra upgrades of different techniques and demand. Remember that fiber was over built during the dot-com bubble and ended up going dark because there wasn't enough demand to consume it, not even on the backbone.
Yes, that's a serious UX issue in modern test results. Instead of just a range there needs to be some kind of variance score. It could be presented simply, like "10% out of range" or "20x lower than normal". Hell, even just a second range that says "critical, seek medical attention". Then you could judge how close to that line is worth going. There just should be more information. It's all computers nowadays anyway; they could print thousands of words, send me a video explanation, they could do anything!
Don't see what an LLM really adds here that wouldn't be served by a sensible algorithm which flags up results needing urgent action.
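For what it's worth, a minimal sketch of what such a rule-based flagger could look like (the field names, thresholds and example values here are all made up for illustration, not taken from any real lab system):

```python
# Hypothetical sketch of a rule-based lab-result flagger: turn "out of range"
# into a graded severity instead of a bare boolean. Thresholds are invented.
from dataclasses import dataclass

@dataclass
class LabResult:
    name: str
    value: float
    ref_low: float
    ref_high: float

def flag(r: LabResult, critical_factor: float = 2.0) -> str:
    """Return 'normal', a percentage-out-of-range note, or an urgent warning."""
    if r.ref_low <= r.value <= r.ref_high:
        return "normal"
    width = r.ref_high - r.ref_low
    # How far outside the reference interval, as a fraction of its width?
    if r.value < r.ref_low:
        deviation = (r.ref_low - r.value) / width
    else:
        deviation = (r.value - r.ref_high) / width
    if deviation >= critical_factor:
        return "critical, seek medical attention"
    return f"out of range ({deviation:.0%} beyond the reference interval)"

# Example: a platelet count of 20 against a 150-400 reference range.
print(flag(LabResult("platelets", 20.0, 150.0, 400.0)))
```

What the LLM arguably adds on top of something like this is the natural-language nudge to actually go to the ER, which is what the rest of the thread credits it for.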
Maybe the end user gets more clearly explained, less bewildering information, which is useful, but that also points to a failure of the health system / internet resources to provide this.
The LLM _HAS_ empowered the patient to act on his own health, which is undeniably a good thing.
It seems to me what the LLM adds is that it is the sensible algorithm which flags up results needing urgent action in a growing set of everyday domains.
The human-like conversation urging the author to go to the ER was particularly useful in this instance. I am not saying I don't fear its mistakes, but clearly its ability to communicate in natural language is something not to take for granted.
Indeed the victory for the AI here is its ability to convince the author to take immediate action. The second win for the AI is the concise one-liner the author tells the triage nurse so that her further diagnosis and treatment may proceed. A regular human being would be likely to start with a long-form story of not feeling well for some time.
To be fair, this is what AI is good at. No matter what problem you're trying to solve, AI will provide a solution.
In this case, the lab result analysis was entirely based around standard care protocols that an LLM is really good at regurgitating back to you.
Yeah sure, anyone could probably prompt-feed this into a homeopathy solution which would have killed the patient, as per their own predisposed will. You could argue a standard programmatic diagnostic evaluation would be better, but it's not entirely clear whether human decision-making improves based on WebMD versus an agitated ChatGPT urging you to seek professional help.
And now that I think about it more, no one except doctors and AI could reliably improve a diagnosis based on a picture and lab results side by side. So if your healthcare system doesn't provide a response, then ChatGPT might actually be the closest neighbour in a time of need.
This may be a nice example of the benefits and drawbacks of GUI vs CLI software.
Sure, OpenSSL has myriad inscrutable options, but the majority of these probably aren't going to be used by 90% of people.
Yet including them in a CLI creates less clutter/overhead than they would in a GUI.
They can just stay out of the way for the people who need them.
I guess the same could be said of an "Advanced Features" second page of a GUI though.
My problem with Joplin is that on my M1 MacBook Pro it really shows how much of an Electron app it is, in the worst ways: extreme memory use and UI lag for an application which displays text. That said, aside from performance it's quite satisfying to use. It's very simple and does its job well.
I recently migrated to Obsidian and although the learning curve is steeper, I'm quite happy with the results.
I went from Notion to Joplin for note taking: I drop 1000s of images in a note, annotate them and publish them. Notion crashed all the time; so badly that support apologised and had to delete things for me. Joplin has zero issues with it; it scrolls super fast and is a pleasure to work with. Also on an M1. I can't say I noticed any bad behaviour compared to others, including Obsidian.
I went from Notion and Joplin to Obsidian mostly because Obsidian stores stuff as regular files on my filesystem and I can easily interact with them with external programs and scripts.
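For example (a hypothetical sketch only; the vault path and search term below are made up):

```python
# Hypothetical example: since Obsidian notes are just .md files on disk,
# any external script can read or grep them directly.
from pathlib import Path

vault = Path.home() / "ObsidianVault"   # assumed vault location
term = "fibre"                          # assumed search term

for note in vault.rglob("*.md"):        # walk every markdown note in the vault
    text = note.read_text(encoding="utf-8", errors="ignore")
    if term in text.lower():
        print(note.relative_to(vault))
```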
Joplin, while being open source itself, has a "proprietary"[1] storage format I can't be arsed to figure out how to interact with.
[1] Meaning non-standard in this case, Joplin is the only software using Joplin's storage method
Plume looks very interesting! One place I personally find many apps in this category fall down is their ability to handle PDFs embedded or attached within notes gracefully. This applies to desktop and mobile apps.
This may be a slightly weird use case, but I accumulate tons of PDFs that are often relevant to my notes and want them easily embedded and viewable in a first-class way. Obsidian is a great example of how not to handle a PDF: locking it to a small portion of the window and not allowing it to be full-screened.
Forcing the user to dump all of their PDFs into something like Google Drive, locked away from the rest of their notes, is a crappy experience and unfortunately keeps me using apps like Evernote purely for this functionality.
If you need pdf viewing and management (indexing, annotations, citations), it might be useful to look into citation managers, which are built to do exactly these things: check out Zotero.
Thanks! Hmm, I've seen many open-source Qt apps integrate PDF support, so I guess I can study them. I'm adding this to the to-do list. How do you usually add a PDF to your notes? Drag and drop? What's the ideal way you'd like to interact with it? You mentioned the lack of full-screen, so what should it look like?
Plume is actually based on my open-source note-taking app, Notes[1]. You can already get it on Flathub, the Snap Store, etc. Notes uses just a simple plain-text editor, while Plume has a completely revamped block editor that I built from scratch. The parts of Notes used in Plume will remain open source (per the MPL licence), but the rest of the code will be closed source, at least for the time being.
Ah too bad, I do need a rich text notes app (and no markdown, I hate markdown, under the hood it's fine but I don't want to deal with it :) ). Also hate kanban and agile methodology by the way ;) Luckily I'm not a dev otherwise I would have to work with all of those lol.
But perhaps you could do the monetisation via the sync service only and make the app open source :) That would be great, at least for me so I could compile it on FreeBSD. Some others do this, like Obsidian, for which there's an actual BSD port. But it's electron, sadly. But I understand... It's a tiny niche. I'll keep looking.
Of course I can't use flatpak and snap. And I can't stand snaps so no way I'd use that. Flatpaks are a bit better but not working on BSD.
I really used to love Tomboy. It was fast, rich text, would automatically hyperlink notes together as you typed; it was so great. But they stopped development on it. There were a few reboots, but they were complete rewrites and lacked all the speed and smoothness I loved.
Is it a deal breaker for you that the app isn't open source? What if I create binaries for FreeBSD/your distro and there's no telemetry/option to disable connecting to the internet (even for updates)?
That would work perfectly yes! It's not the internet connection that bothers me (in fact I'd probably use the sync).
But usually developers don't care enough about the tiny userbase of FreeBSD to even consider that. If you would do that, I would really like it, though I can imagine that from a time/gains perspective there is no point. Which I do totally understand.
One thing I like is that your monthly fee is very reasonable. Obsidian costs double the price of my entire Microsoft 365 subscription :) Besides it being electron that's another issue for me. Especially because it's just not really that great.
I support your app so much I would pay a monthly fee instead of a one-time purchase. My notes are as valuable as my life. I don't mind the app being proprietary if it gets the love it deserves. If you can accomplish the goals you set out, like providing good functionality and performance, then I'm cheering you on. My needs are very basic, just the minimum to accomplish Zettelkasten.
I don't mind using a closed source solution - but only if I can keep my notes separated from the application itself. Makes it easy for me to back up my notes and to use versioning tools like git. It also allows me to use bash to manipulate my notes independently of the application at any given time.
Plume seems very pretty - good job on that, but....
"All notes in Plume are simple plaintext strings under the hood. Right now, all these plaintext strings are stored in a SQLite database locally on your computer. But we have plans to remove the reliance on a custom database and to store all notes as simple .txt files inside a folder."
I've been burned too many times by organizational tools that like to keep your notes internally stored within their systems.
Gotcha, no worries, I'm 100% going to migrate the database to a simple arbitrary folder with .md/.txt files. I also want that for myself. It will take a few months of work after the initial release, tho.
I haven't yet tried to create a mobile version, but I don't see why it wouldn't work, as Qt Quick is very performant. I guess we'll have to wait till I port it and do some testing.
> This degradation of software by web apps shows the lack of optimal resource utilization of even one of the most powerful chips of recent times.
A-fucking-men. Web tools are for building web apps, software tools are for building software. I avoid all these goddamn electron things like the plague if at all fucking possible.
Garbage on phones, garbage on computers, garbage on tablets. Garbage.
It's unfortunately very hard to avoid them. But indeed, on macOS, I try to find only native or native-like apps for my needs. It's the difference between a healthy diet and junk food for my computer.
Yeah, it goes to show that Electron-based apps can be done in a reasonably performant way, but it does require extra care/attention, as Obsidian is an Electron app as well.
I have about 13,000 notes (with embedded media PNGs, MP4s, etc) across 50 folders/subfolders on my Mac M1 and searching across all notes in Obsidian is for all intents and purposes instantaneous (less than a second).
Every time I see a post about a note-taking app, Obsidian is mentioned in the comments. I don't know if the app is really good, or if those are just paid comments. At this point I'm not even surprised anymore, especially when conversations always end with a mention of Obsidian Sync.
It's the biggest & most popular app in the note-taking space. It's closed-source, which I don't love, and I've tried to look for alternatives, but there just isn't anything else that's as good as Obsidian. In a situation like that, you don't need to pay people to talk about your product. People will evangelize it on their own.
It's because there are two ways to sync content across devices, paid sync through Obsidian vs. git. Given sync is a p0 feature, it seems logical that both get mentioned when the question arises.
Also, the app's really good, and I pay for Sync -- git works well, but it's a bit clumsier on iOS. Never posted a paid comment in my life.
Joplin released an ARM build a few months back -- if you were using the pre-ARM build, performance was bad. Just throwing this out there for folks who may have had similar experiences: make sure you are using the ARM build (it should happen automatically now).
Disclaimer: I've contributed to Joplin in the past, and I use it dozens of times a day with no big speed complaints.
I wish I could compile those Electron apps from source à la FreeBSD's ports or whatever. I'd rather have the laptop compiling for a few days than use Electron.
One golden rule that I learned is that interface elements should not move unexpectedly after the interface has been drawn.
Google is particularly bad for buttons which move between you lifting your thumb and pressing the screen, but they're not alone.
I wish this would be addressed at the OS level. If a target popped into existence less than ~0.25 seconds before it was touched, a touch event shouldn't be generated. Human reaction times aren't fast enough to hit a button that quickly anyway.
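Roughly, the idea would be something like this (a sketch only; the names and the 0.25 s grace period are illustrative, not any real OS API):

```python
# Illustrative sketch of a "grace period" filter: drop taps on targets that
# appeared just before the finger landed. All names here are hypothetical.
import time

GRACE_PERIOD = 0.25  # seconds a target must be visible before taps count

class Target:
    def __init__(self, name: str):
        self.name = name
        self.shown_at = time.monotonic()  # recorded when the target is drawn

def should_deliver_tap(target: Target, tap_time: float) -> bool:
    """Deliver the tap only if the target has been on screen long enough."""
    return (tap_time - target.shown_at) >= GRACE_PERIOD

# Example: a button that popped in 0.1 s before the tap swallows the touch.
button = Target("Allow notifications")
time.sleep(0.1)
print(should_deliver_tap(button, time.monotonic()))  # False
```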
One problem with this: People often learn to touch multiple targets quickly in sequence, because one touch event predictably pops up the target for the next event.
Huh.
So Android's push notification service is built on their instant messenger (GTalk),
and Apple's instant messenger is built on their push notification service.
The only appreciable difference I see is the need for a border guard to physically inspect and stamp passports, which I suspect slows things down. I guess ETIAS checks in the future too.
Once EES comes into effect the stamps may not be necessary anyway, but much like fusion, it's perpetually 6 months away.
The main difference is that pre-2021 UK travellers simply had to show travel documents whereas now they have to get them stamped.
On the surface, it isn't clear to me why that makes much of a difference in service time, but the CEO of Eurostar pointed a finger at these changes in January: https://www.bbc.com/news/business-64390979, claiming their trains carry 30% fewer passengers due to station bottlenecks.
And yes, they claim the EES checks should help speed this bottleneck.
Open competition kind of spoils this model. It's not really sustainable.