I worked extensively with Windows 2 back in the day.
Possibly even with Windows 1.0, I can't quite recall.
It was a constant battle to get enough RAM for applications to run. Futzing around with himem.sys and moving around network drivers and stuff in high memory.
Windows applications had to run within the first 640K of RAM. In reality, around that time most corporate machines were 80286 machines, and as I recall a high-specced machine usually had 2MB RAM.
Kinda weird to think that working, useful Windows applications ran on systems with that little memory.
Ugh I'm glad those days are gone.
> It was a constant battle to get enough RAM for applications to run. Futzing around with himem.sys and moving around network drivers and stuff in high memory.
This was a battle even if you weren't running Windows. Many DOS programs wanted enough of that first 640K that you had to jump through hoops to fit them in.
Rebooting 15 times with different driver ordering in the hope you'll find another 3KB of RAM to get the application to run. Really sucked if you were booting from floppy.
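For anyone who never fought this battle: the juggling happened in CONFIG.SYS. A representative MS-DOS 5/6-era file looked something like this (driver names and paths here are illustrative, not any specific machine's setup):

```
REM HIMEM.SYS first: provides XMS and access to the high memory area
DEVICE=C:\DOS\HIMEM.SYS
REM EMM386 maps upper memory blocks; NOEMS gives UMBs without an EMS frame
DEVICE=C:\DOS\EMM386.EXE NOEMS
REM Load DOS itself into the HMA and enable loading drivers into UMBs
DOS=HIGH,UMB
REM Every DEVICEHIGH is a gamble: if the driver doesn't fit in a UMB,
REM it silently lands back in conventional memory
DEVICEHIGH=C:\DOS\ANSI.SYS
DEVICEHIGH=C:\NET\NE2000.SYS
```

Reordering those DEVICEHIGH lines (and the LH lines in AUTOEXEC.BAT) was the whole game, since a big driver loaded early could hog the UMB a later one needed.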
QEMM was way more aggressive about clawing memory blocks out of all the nooks and crannies. It was definitely more configurable, and it also had a way to test different configurations of device drivers (i.e. CONFIG.SYS) fully automatically, so you just started it, went to eat or whatever (praying it wouldn't lock up), and came back to the best result.
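The automatic optimisation described above boils down to a search problem: try driver configurations, measure free conventional memory after each, keep the winner. A toy sketch of the idea (the driver sizes, UMB sizes, and first-fit loading rule here are illustrative stand-ins; the real tool measured by actually rebooting):

```python
from itertools import permutations

# Illustrative sizes in KB; the real tool measured these, it didn't guess.
DRIVERS = {"MOUSE.COM": 14, "ANSI.SYS": 4, "NET.SYS": 22, "CDROM.SYS": 24}
UMB_SIZES = [16, 28]         # free upper-memory blocks to load drivers into
CONVENTIONAL_FREE = 560      # KB free below 640K before loading anything

def conventional_left(order):
    """Free conventional memory after loading drivers high, first-fit."""
    umbs = list(UMB_SIZES)
    free = CONVENTIONAL_FREE
    for name in order:
        size = DRIVERS[name]
        for i, umb in enumerate(umbs):
            if size <= umb:          # fits in an upper-memory block
                umbs[i] -= size
                break
        else:
            free -= size             # no UMB fits: lands in low memory
    return free

# Brute-force every driver ordering and keep the best one.
best = max(permutations(DRIVERS), key=conventional_left)
```

Because loading is first-fit, the order genuinely matters: in this toy setup the best orderings leave 538 KB free while worse ones leave less, which is exactly the "another 3KB" game described upthread.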
I remember I'd save random mouse drivers I found on different machines at school, looking for the one that used the least amount of RAM. God, what a nightmare.
My course of action was to run Memmaker, and let it take care of (almost) everything. It usually worked fine. Sometimes I had to fine-tune config.sys/autoexec.bat to make a bit more room or disable EMS, but those were edge cases.
Now that I think about it, it'd be fun to know how Memmaker works internally, but I can't seem to find info on that. Maybe no one has done such analysis... yet.
> Now that I think about it, it'd be fun to know how Memmaker works internally, but I can't seem to find info on that. Maybe no one has done such analysis... yet.
It has been a long time...doesn't it just go through and LOADHIGH all the obvious possibilities? I feel like it maybe had some library of candidates it would check.
I never had this program, I think I did it all manually after analyzing config.sys and autoexec.bat on systems with more free ram than I had. It makes me realize how much knowledge I just mined from school computers.
When I was a kid I really wanted to play Ultima VII, and getting that game to work right with a sound card and mouse was a game in itself for a newbie. I think it used its own proprietary memory manager just to make it harder. Eventually I was able to figure it out by making the right boot disk and config.sys, but man, what a pain.
That game was amazing and I remember doing the exact same thing. Had to disable EMS memory and boot the computer so that the game engine's memory manager had enough to work with.
Honestly that game probably set in motion my career path in IT.
> Kinda weird to think that working, useful Windows applications ran on systems with that little memory.
> Ugh I'm glad those days are gone.
I'm quite the opposite... I miss the times when applications could run in a few megs of RAM... nowadays just a browser showing a simple weather report wastes 2+ gigs of RAM.
I was there. They were perhaps not shit applications, but they were extremely basic.
It’s a nostalgic fantasy to look back at that stuff imagining it’s in any way comparable to what users expect today.
I don’t care about applications wasting RAM. Modern machines have plenty of cores and RAM and the software is unbelievably powerful today.
I have some nostalgia for those days (I have a huge collection of vintage computers).
But it wasn’t “the good old days”, in which software was done right and now it’s all sloppy and bloated and lazy. Not at all.
I’m glad of modern software, operating systems and tools. Back then there was always something you wanted to do but couldn’t because the software to do it didn’t exist yet - very frustrating. Now almost anything you want to do, you can because someone has developed the stuff you need to do it.
I’m super glad I was there for the early days but glad they’ve passed.
But you paid a shitton of money for those modern machines with modern amounts of ram and cpu cores, and threw away a bunch of still working hardware for that.
And in a few years, your writing software (*office, word, whatever) won't gain any memorable new features, but the current 32gigs of ram suddenly won't be enough anymore, and you'll buy a new machine, say how you don't care if Word uses 40gigs of ram, because you have 256gigs installed and it doesn't matter...
My Firefox is currently using 7 gigs of RAM, and I have 23 tabs open... all the information in all those 23 tabs is maybe 20 megabytes, and that includes the pictures in them.
Look at mobile phones... you use your browser, chat programs, take and send photos and sometimes even call a person using one... and after a few years you're still doing the same, but the apps have grown so much in both size and ram usage, that you have to replace your phone, even though you still need and use the same functionality you used when you bought it. Yes, games got better and look nicer, but for most "old people", the only reason to waste money and create e-waste is bloat, because developers can't optimize anymore.
The idea that RAM requirements are going up so much all the time doesn’t seem to match reality if you ask me.
The fact that I’m still getting by totally fine with an old 8GB on my old laptop is frankly speaking insane. It’s from 2015. Back in the 90s using an 8 year old machine for anything modern would have been unthinkable. OK, it can’t do the latest games or 8k video editing. There are some modern tasks it can’t do. But with everything else there’s no big difference with my newer machines.
I can still get by with RAM that’s within an order of magnitude of my 2008 gaming computer. Even for games. Isn’t that crazy? With everything else, the requirements (disk size and GPU transistors) have gone up by like two orders of magnitude.
There have been some jumps in RAM requirements over the years, but often it has been for important things like browser sandboxing or HiDPI/retina displays.
There is some validity to the frustration. The switch to MS Teams at work was associated with a big jump in the RAM needed for a usable system. But I also get why... the pace of development of MS Teams has been extremely fast, which would’ve been hard without Electron. I don’t miss Skype for Business.
And now they’ve worked on a version that’s more optimised? If Teams is optimised, I can get by with 8-16GB for a long time.
So, a 128x increase in RAM requirements (32 -> 4096) in ~21 years. Do you actually get 128x more of anything? Does it do 128x more? I mean sure, the icons are high-res, so maybe a few 100kB of RAM for that, but 128x more? Even the 2016 version required 2 gigs less... and the 'what's new'? https://support.microsoft.com/en-us/office/what-s-new-in-off...
For some things, I agree... a jump from notepad.exe to Word brings many new features and an increase in memory usage... but the new Word versions keep growing and growing in size, and the new feature set doesn't bring anything that would excuse such a large increase in usage.
Yes, Teams was broken, and it used less memory in a browser than as a native app... so there's no excuse for that, especially since the native app was mostly a browser in disguise.
I mostly blame laziness and overspecced developer computers for that... why optimize when you can be lazy with 128 gigs of RAM? If we made developers test on 5-year-old, mid-tier laptops from supermarkets (so something your grandma bought in Walmart on a sale 5 years ago), the world of memory usage would be a much nicer place. Same for mobile phones: test the app on a 3-year-old mid-range phone, and if it's too slow there, it's going to be too slow for many users.
A quad-core Celeron with 4 gigs of RAM - this is what my grandma would have bought. Barely enough for Word today, probably not enough in a few years when a new version comes out.
> I don’t care about applications wasting RAM. Modern machines have plenty of cores and RAM and the software is unbelievably powerful today.
I do care when applications waste a lot of RAM and cores but application performance is still terrible. Looking at you, Microsoft Teams. I don't remember mIRC ever being a slouch, even on a Windows 95 PC with 8MB of RAM.
Yep, Skype is horrible too (hmm... could there be any connection, I wonder who bought Skype a few years ago...), even if you use the same functionality as with IRC clients (just text chat). Fucking slow to respond, replaces code snippets with emoticons, and every time you send anyone a URL, it has Microsoft servers open that URL... "just in case"... even if it's a http://192.168.1.100/ URL, which is clearly on a private IP address and there's no way Microsoft's servers will be able to connect to it, but you'll still have to wait for a timeout before the message gets through.
Yet it's still not able to draw a cursor or a scrollbar, so I need to wait 2 seconds in Word for the cursor to appear, or 1 second after I move the mouse to the side for the scrollbar to appear.
I feel like I'm going crazy in these threads. UIs have really awful responsiveness these days and it is a source of constant annoyance to me. And then they hide all the elements like that, gating commonly used functions behind unpredictable delays so that users can't even find them unless they know to rub the magic spot until it appears.
I clicked the start menu on a PC today and was greeted with spinning circles on an otherwise black box for at least 5-10 whole seconds before it showed anything.
Imagine what it would be like by now if all software had kept up the early ratio of what could be accomplished per kilobyte of memory and per megahertz of processor performance. Especially per kilobyte of code itself.
Now it takes GB and GHz to accomplish the same office work as 30 years ago, and the UI responsiveness is comparatively slower. This should have only improved as more advanced hardware was deployed, so the root cause has got to be defective software approaches over the long run.
Not the promising brave new world that seemed like it would make interactions with PCs orders of magnitude better by the time the 21st century came along.
> There was a demon in memory. They said whoever challenged him would lose. Their programs would lock up, their machines would crash, and all their data would disintegrate.
> The demon lived at the hexadecimal memory address A0000, 655,360 in decimal, beyond which no more memory could be allocated. He lived behind a barrier beyond which they said no program could ever pass. They called it the 640 K barrier.
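The quote's numbers check out, for what it's worth, and the reason the barrier sits exactly there is real-mode segment arithmetic: segment A000 shifted left four bits is precisely 640 K, the base of EGA/VGA video memory. A quick sketch:

```python
def linear(segment, offset):
    # Real-mode address translation: segments are 16-byte paragraphs.
    return (segment << 4) + offset

barrier = 640 * 1024            # the 640K barrier: 655,360 bytes
video_base = linear(0xA000, 0)  # where EGA/VGA video memory begins
```

Conventional memory ends exactly where the video hardware's address window starts, which is why no amount of clever loading could push past it.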
> Windows applications had to run within the first 640K of RAM. In reality, around that time most corporate machines were 80286 machines and as I recall a high specced machine usually had 2MB RAM.
There was a protected mode version of Windows 2.0 that would run on 386s and allow larger memory access. I believe it also ran on 286s, but protected mode was crippled quite a bit on those chips.
I can't remember the specifics now - it was 35 years ago or something... but as I recall the memory situation wasn't sorted out until Windows 3.0. That's what made Windows 3.0 so exciting - the memory model.
That was Windows "enhanced mode", when running on a 386. The 286 could not return to real mode once it entered protected mode (by design; a way was found later by resetting the CPU), and it had no VM86 mode. The 386 had those features, and by the time of Windows 3.0's release it was popular enough that they could be used.
No, even without those 386 features, Windows 3.0 did introduce a new memory system that allowed Windows applications to more easily use up to 16MB of RAM IIRC.
Windows 1 was hard to work with because it used a strict tiling model for window management. All apps had to fit into a screen grid of non-overlapping tiled windows. You very quickly got to tiles that were too small to be useful, which hurt the usefulness of Windows as a product.
By Windows 2, they had given up on that tiling model and gone with overlapping windows.
In that era you had to load the network drivers before Windows; it was all DOS-based. It wasn't until Windows for Workgroups that Windows-based networking was a thing.
NetWare, and I also worked with Microsoft's SMB and a few other lesser-known things. 3Com something or other. There were some interesting network types around then too - unfortunately a lot of coax networks (a disconnected cable would bring the whole network down), Banyan Vines, some IBM Token Ring, which had some weird cables and connectors. Cat 5 networking came in not long after.
Back then there were all sorts of protocols flying over the network. IPX/SPX, XNS mainly. The one protocol NOT present on our systems was TCP/IP.
Nice, must have been interesting to use such a variety of tech that people were still figuring out. Yeah, I got directly into TCP/IP with Trumpet Winsock. All the stuff prior was before my time (the stuff that I had access to, anyway).
No it wasn’t. The aspect ratio was always 4:3. The resolution changed from 640×350 (with non-square pixels) to 640×480 (with square pixels). But the aspect ratio did not change. (The first image in the article is therefore displayed incorrectly compared to what it would have looked like.)
Is there something else going on here? With rectangular pixels, the screen above would have no square icons or UI elements. Did everyone just adjust their CRT so the image would fill the 4:3 even if it distorted the image?
Note that the "icon" here is the minimised state of the window, and the clock application draws itself in that state with the same code as in the normal state, so it uses the screen metrics, while other minimised applications just call the default window procedure, which puts a bitmap on the screen.
> Did everyone just adjust their CRT so the image would fill the 4:3 even if it distorted the image?
Yes? I never saw a "letterboxed" CRT. I think I'm with the people who are suspicious of the Windows 1.0 screenshot. Is there an emulator somewhere we can check with?
The icons were not square. The image was not “distorted” when displayed as 4:3; it was designed to look that way. Non-square pixels are, in fact, a thing.
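The arithmetic behind this is easy to check. A quick sketch (plain Python, just for illustration) of why 640x350 filling a 4:3 tube implies pixels noticeably taller than wide, and why a raw screenshot needs its height scaled to 480 to look right on modern square pixels:

```python
display_aspect = 4 / 3        # shape of the CRT's visible area
storage_aspect = 640 / 350    # shape of the EGA pixel grid

# Pixel aspect ratio: width/height of a single displayed pixel.
par = display_aspect / storage_aspect     # 35/48, about 0.73 (tall pixels)

# Height a 640x350 screenshot needs on square pixels to match the CRT:
corrected_height = round(350 / par)       # lands exactly on 640x480
```

So a "square-looking" icon in Windows 1.0 was actually drawn wider than tall in the framebuffer, and displaying an uncorrected 640x350 screenshot today squashes everything vertically.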
It was already a feature in Windows 1.0 to let you customise the colours of the UI elements, and it lasted (with gradual improvements) until Windows 8 where it was removed completely.
I still miss the Windows Classic theme. It was simplistic enough while being streamlined enough for most workflows, now I have to hover icons to find specific windows in an application group or deal with the internet-connected Search/Run box.
It wasn't perfect, but it worked better for me, at the very least.
https://github.com/dremin/RetroBar gives at least the classic taskbar back. I have been using it on Windows 10 for a while now. It is a much better experience.
Not since LiteStep have I found myself so appreciative of some random Windows engineers' work. But this is actually awesome. The only thing I would change is to remove the reliance on .NET Desktop.
Just for comparison the Acorn Archimedes launched in 1987 with Arthur OS (later to become Risc OS) with a full windowed desktop with overlapping windows etc. They were able to pull this off because they had designed a new processor that they called the "Acorn RISC Machine"*
Yeah, that drag-and-drop save was funny. It was the era of 'let's do this using drag and drop just because it looks cool and metaphorical - you are literally PUTTING that file IN that folder - geddit?'
I don’t agree that it was eccentric, I used these as my first machine and they felt pretty natural. I would also suggest that these interactions weren’t completely standardised at that point.
Yeah, I’ve never heard of that either. AFAIK, Apple offered Xerox to buy some stock pre-IPO, but I have never seen anyone mentioning royalties. The whole paragraph is dubious without sources.
I do get a sense from this article, due to the presence of stories I never heard before, that this is how history gets slowly rewritten. But an important aside for me is how exciting and awesome these developments felt to the users. Modern tech progressions feel relatively dull, probably because they have relatively little impact compared to the leaps being made in the early days.
A bit late to the party but I still remember vividly the day I was back from school and my dad installed Windows 2000 on the family PC. I remember looking at the shiny animations thinking this is the peak of technology. It was exciting!
That was how I felt on my second day of high school, when all the Windows 98 PCs from the day before were stacked up in the corner and replaced with sleek, new, black Windows XP boxes. I wanted XP at home so bad, but it would be a couple more years.
Usability-wise it’s better. It actually had more affordances than Metro did. In Metro you had to freaking poke around to see if there were active elements. Also, Metro was too reductivist and icon-heavy.
IMO it might have been better than both of them! But probably too restrictive. It worked wonderfully on Windows Phone but it didn’t feel right at all blown up to desktop size in Windows 8.
"Restrictive" is probably a perfect word to describe it. I maintain that Zune Desktop was the most beautiful app ever produced, but it definitely had a lot of custom design elements on top of the Metro base.
Metro was peak design for me. I was doing UI work for MS around 2004 on Windows Media Center, and I think the design on that UI predated it officially becoming "Metro".
The flat design work I did for the UI on there I borrowed from Sky Digital's 1998 UI shown at the start of this video:
I couldn’t disagree more on Metro. For me that was the peak of bad design. With Metro I couldn’t tell the difference between interactive and non-interactive objects because they all looked exactly the same.
This might be an unpopular opinion, but I still hate desktop UIs that use hyperlinks as navigation elements. I get why they’re useful in documents where you’re cross-referencing terms in an almost relational kind of way, but in applications I want widgets to be explicitly interactive. And in a desktop application you cannot just middle-click a link, so you have no idea whether a new window will open or you’ll be transported away from the current view with no easy way of returning to the current screen.
KDE, in my opinion, is these days the best example of desktop UI out there. I’m not saying they get everything right, but it is far more consistent, discoverable and self explanatory than modern Windows (again, in my opinion).
> Overlapping windows, a proportional system font, and more general UI improvements were all goals [of the 2.0 release of Windows]
As an additional strange twist, Windows 1.0 was capable of overlapping child windows[1], title bars and all, even if the official MDI support APIs came later. It just didn’t use that capability for top-level windows for some reason.
I have a faint memory that this was because overlapping top-level windows seemed like too much of an infringement of an existing windowing system (maybe the Xerox one mentioned in the article) but later they decided, maybe it wasn't.
I always thought garbage websites from garbage companies were the only ones to pop up a subscription window. Bradford Morgan White, why did you turn your website into one of those websites with that "subscribe" popup?
Interesting: from the screenshots it seems Windows 1.0 had what looks like a burger menu on the top left, and a task bar (or status bar) on the bottom. Went full circle 30+ years later.
That menu is the system menu [0][1], still present in current Windows (Alt+Space). You can try out the Windows 1 version in your browser here: http://copy.sh/v86/?profile=windows1
I still use it to close applications by keyboard (Alt+Space C) because I find that easier to press than Alt+F4.
Double Clicking System menu also closes the app, even in Win 11. My brother still uses that to close programs. System menu is now missing in some apps.
In Edge, there's no visible menu button, but clicking in that area brings up the menu, and double-clicking closes the window. Explorer is even weirder: there is no button or area, but double-click to close still works while holding Alt+Space.
At the time, I actually used and liked DESQview better. I wonder what things would have been like had Quarterdeck sold itself to MS? I remember MS trying to buy the company, but they refused to be sold. I think they folded when Windows 3 (or 95) came out; I forget which version finally killed them.
At home, I had moved on to Coherent somewhat before Windows 3, then eventually Linux.
I was a big fan of OS/2. The late 80's and early 90's were a golden time for personal computers.
It's a big bummer that the world coalesced around Windows and macOS and now seems to be concentrating more and more around Web technologies. I would love to see an alternate timeline where OS/2, AmigaOS, TOS, BeOS, and other interesting operating systems coexisted and competed for another 20 years.
Re: that last quote, I'm not sure they could have done a better job of copying the Macintosh at the time, except in a very crude way.
Microsoft was targeting 16-bit Intel CPUs which required a coprocessor for floating point support. That's a huge constraint on what you can practically do, compared to the Mac's 68k.
Hardware is probably the biggest reason Windows was garbage until Windows 95, where they jumped ahead of Mac OS with preemptive multitasking etc.
Implementing 2D graphics operations with fixed point arithmetic isn't hard, and has the benefit of being absolutely exactly reproducible on 1980s hardware (hence why TeX uses it).
You can do very sophisticated work entirely in fixed point (the first PlayStation did), and I suspect we're going to see more noise from the ML people about using it for silicon space-efficiency reasons.
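For anyone who hasn't used it: fixed point just means storing numbers pre-multiplied by a power of two and tracking the shift yourself. A minimal 16.16 sketch (Python for readability; the period hardware did this in plain integer registers):

```python
FRAC_BITS = 16
ONE = 1 << FRAC_BITS            # 1.0 in 16.16 fixed point

def to_fixed(x: float) -> int:
    return int(round(x * ONE))

def fixed_mul(a: int, b: int) -> int:
    # Product of two 16.16 values carries 32 fractional bits; shift back.
    return (a * b) >> FRAC_BITS

def fixed_div(a: int, b: int) -> int:
    # Pre-shift the dividend so the quotient keeps 16 fractional bits.
    return (a << FRAC_BITS) // b

def to_float(a: int) -> float:
    return a / ONE
```

Every operation is plain integer arithmetic, which is exactly why results are bit-for-bit reproducible across machines: no FPU, no rounding modes, no platform quirks.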
The Atari ST lacked a blitter, but also had a GUI environment less miserable than Windows 2.0. Industry pundits called it the "Jackintosh", as it was Jack Tramiel's budget answer to the Macintosh.
Digital Research's GEM was able to do a better job than early Windows of getting the Lisa/Mac UI style/paradigm on PC hardware. So much so that Apple sued them. And they got to market sooner than Microsoft.
Though their original target wasn't a Mac clone, it borrowed Mac UI elements. Lee Jay Lorenzen left Xerox and joined DR because he couldn't sell Xerox on the idea of getting Xerox Star functionality onto commodity 8080/8086 class machines.
Another early one was VisiOn, from the VisiCalc folks, which had a very nice (for the time) minimalistic look and feel https://en.wikipedia.org/wiki/Visi_On
As a Mac person that also used Windows NT in my day to day work, I never liked Windows 95.
From a purely subjective perspective, the design wasn't to my taste.
But the worst thing was the Viruses!
Windows 95 was a hellscape of security issues.
It felt like I never encountered a Windows 95 box that actually worked like it was supposed to. And it seemed like everyone was always reinstalling Windows 95 to get it working again.
In the late 90s I had an NT4 box that I never had to reboot. My Mac was getting rebooted multiple times a day.
I worked as a lowly PC technician assembling machines from components for a little shop in a strip mall when Windows 95 was what everyone wanted.
The OS was indeed a hot mess, but one thing was abundantly clear after assembling hundreds of new machines and installing it on them: driver quality varied wildly across hardware vendors, and installing drivers was often what transformed a perfectly stable fresh installation into a crash-prone, unstable disaster.
Linux may not have had much in the way of hardware support, but what hardware it did support tended to work pretty well. This was a huge difference vs. Win95, where all hardware ostensibly came with drivers for the OS, sort of a classic quality vs. quantity situation.
That would explain why my experience with Win95 was so good. I never really understood all the hate for it (especially given 98/98SE/ME/XP, which came after 95, all of which I never really liked due to the all the extra "user friendly" features I never really used). I must have been lucky to have had the 'right' hardware with the better drivers at the time. IIRC I stayed on Windows 95 right up until Firefox (or whatever it was called at the time) finally dropped support for Windows 95.
Also, Firefox isn't a rebranded Navigator, it was a new, alternative, light browser made to get rid of all the bloat that made it into the main browser.
Even in something as late as 2009, I installed Ubuntu, was told by Ubuntu to upgrade the packages, did so, and on next reboot saw some sort of arcane panic message and was dropped into a bare shell straight out of POST.
Linux stability pre-2015 (?) is not something to be lauded.
I've used Linux exclusively since the 90s, after leaving OS/2 Warp, and have rarely encountered a driver-related instability issue with any hardware actually supported by mainline. The few that come to mind had to do with NIC drivers requiring disabling offload features, or link-rate auto-negotiation failures.
Your problem sounds like a non-specific distro update bug. Don't even get me started on my mother's current Win11 machine that breaks its USB printer on every.single.update.
> Even in something as late as 2009, I installed Ubuntu, was told by Ubuntu to upgrade the packages, did so, and on next reboot saw some sort of arcane panic message and was dropped into a bare shell
Yet after 10 years people still think automatic updates are a good idea.
> driver quality varied wildly across hardware vendors, and installing them was often the change transforming a perfectly stable fresh installation into a crash-prone unstable disaster
YES.
99% of BSODs are driver- or hardware-related. There aren't that many things in the OS itself that can break so badly that you need to panic.
But of course it's always Gates himself who wrote the shitty OS.
GeoWorks Ensemble ran much better on the same hardware. If you wanted closer to a Mac experience it was a much better choice … except that anti-competitive licensing deals made it more expensive because most systems came with Microsoft licenses whether or not you intended to use them.
+1 for GeoWorks which ran a lovely suite of bundled programs amazingly well on a 286, and was preinstalled alongside MS-DOS on some systems in the early 90s. Despite being a quality product, apparently development was a major pain (expensive documentation and bad workflow), while Windows rode a wave of third party software to market domination. An interesting footnote to the Windows story of the era, and a cautionary tale perhaps. But FWIW I still run Ensemble in a VM :).
And Workbench was there with pre-emptive multi-tasking, resizable overlapping windows with adjustable colours in the same year that Windows 1 was released.
Off-topic: When I open that page, my email address shows up in the text input for subscribing. How do they know my email? I don't remember ever going on that website.
Apple did release a LocalTalk card for PCs circa 1987, although the article I found about it shows MS-DOS drivers[1]. AFAIK until Windows for Workgroups MS-DOS provided network file access, so perhaps this is referring to support for AppleTalk printers???
Alternatively, 2.11 was 1989, and I’m sure Ethernet cards would be available for NuBus Macs by that time. Or perhaps even those funky SCSI to Ethernet adapters for those without.