
Can’t imagine how these Windows engineers feel about the enshittification of their baby. So much time invested in it - must be hard to see it on its current trajectory.


The kernel is better than ever. As for the shell, it's always had junk on it for commercial reasons - remember MSN and baking IE into the OS, or how the system requirements were kept ridiculously low for marketing reasons and to placate OEMs? And how OEMs were allowed to add loads of junk software to computers wearing their "Designed for Windows" stickers?

I'm pretty sure XP came with a digital app store right at the top of the redesigned Start menu, Windows Media Player had ten music stores integrated all selling DRM'd WMA files...

The system requirements especially must have created a lot of work, right down to the kernel team.


NT 3.x, 4.0 and 5.0 (2000), which the presentation is about, were at least quite free from commercial junk. It was only when consumer and professional Windows merged into one with XP that the enshittification started -- but that being said, it's bad on an entirely different level now. It's easy to look back longingly at the days of XP and 7 now that Windows is so overwhelmingly user-hostile.

Like, rhetorical dude from 2002, you're mad that Windows XP will not let you remove Internet Explorer easily and that it requires online or phone activation to work? Let me tell you about Windows 11...


> Remember MSN and baking IE into the OS

Yet another thing where Microsoft was ahead of the curve; nowadays we get Electron (aka Chrome) all over the place.

People even buy laptops where the browser turned into the OS!


Some of us wipe that and install Linux because the hardware is cheap.


And just like with Windows, contribute to the sales number.


> Windows Media Player had ten music stores integrated all selling DRM'd WMA files...

Apologies. This was me. Pretty much all 10. Most of them were just white-labels of the same code. Believe me, I hated doing it. MS didn't want to do the right thing and vertically-integrate everything like Apple was doing, which was the better solution as then you owned the entire user experience from end-to-end.

We know how that story ended.


Yes I could tell they were white label. One supermarket chain in the UK was offering everything from pet insurance and funeral insurance to phone plans and digital music. The way it was listed in the interface among so many other random brands was hilarious to me.


Well, they got slapped by the EU for forcibly including IE, and there was never an app store like that over XP's lifetime, to my knowledge. Also, you could quite easily replace the whole shell. Sadly not anymore: if you do that now, stuff you need access to will just stop working, because everything "metro" needs the Explorer shell actively running in the background.



Windows Catalog (which I totally don't remember at all... was it in all the builds or only some regions??) was apparently "the showcase for products that are designed to make optimum use of Windows XP. Go digital, and discover a great new computing experience!" [1][2] More of a catalog listing software and hardware that got the Designed for XP logo, rather than a place to buy those things, let alone an app store experience. At that time, it was much more common to purchase applications inside an actual store.

[1] https://web.archive.org/web/20011113052730/http://www.micros...

[2] https://web.archive.org/web/20020409123842/http://www.micros...


> Can’t imagine how these Windows engineers feel about the enshittification of their baby.

I think people are forgetting how unreliable Windows was in its early days. If you were doing anything complex (programming, editing pictures, ...) Windows couldn't run for 2 hours without crashing every so often.

If anything, the core of the Windows operating system has only gotten better with time. Yes, they keep adding fluff to the desktop environment, but that doesn't take away from the progress they have made in stabilizing their core operating system.


> Windows couldn't run for 2 hours without crashing every so often.

I'm really curious: which version of Windows do you mean?

Because I don't remember this on Win 3.11, Win XP, Win 95, etc. Of course there were sometimes hardware/driver issues, and sometimes programs corrupted system files. But crashing every so often... that's strange.


Windows 95 and 98 loved to crash because a fly at the other corner of the room moved an atom which hurt the OS' feelings momentarily.

Ran out of memory? BAM. An official driver from Intel or nVidia or ATI did something slightly off-time because silicon decided to wait a clock for something, BAM. You had a professional capture card with high bandwidth for that time, and you wanted to capture a video, BAM.

A blue screen because of a spinlock access violation, a Windows bundled driver, or any high-end software was common back in these days.


Oh yeah it did. The number of times you'd be writing something in Word, realize it had been 15 minutes and you should save, only to move the mouse to the save icon and have the entire system just stop. No error, just a total lock.

This is why you can tell if people grew up in that era: you have the muscle memory of CTRL + S every few minutes burned into your soul.


> you have the muscle memory of CTRL + S every few minutes burned into your soul.

I'm honestly afraid that my child will be born with it and do that pinky-middle-finger combo in the air, like playing air guitar, on day 1.


I didn't suffer from anything quite this severe, but Windows PCs definitely needed a restart once a day for sure. The weird one for me was installing new software requiring a restart. Some applications would insist on it, presenting you with a modal saying something like "Your computer will now be restarted, save your work and click the Ok button".


Windows NT 3.51 was a big milestone in terms of stability. Windows NT 4 got even better from a stability point of view. I can't remember the last time I got a Windows blue screen of death when it used to be common to see one but on a decreasing basis as new versions of Windows came out.


> I think people are forgetting how unreliable Windows was in its early days.

Not to mention being as easy to attack as a house made of butter.


And some people still use WinXP, Win 7, Win 8 etc...


At least you had a decent host firewall by then. Pre-XP SP2, you'd get malware just by hooking up to the internet.


Later Windowses are full-blown spyware with ads. If I had to use Windows for some reason, it would be 7.


It's not difficult to de-shittify 10/11. There's a tool that automatically does it called ShutUp10.

It's arguably a bit shit from a business perspective, but has no real impact on power users' day-to-day.


You need to be careful with ShutUp10/11. You can easily break automated security rules or system APIs if you carelessly enable all of those settings. You can't just apply these patches and forget about them, sometimes you need to undo your work to get updates installed or to resolve problems (for example, the "disable internet check API" privacy setting can cause some applications to display "you're not connected to the WiFi" popups).

It's also an uphill battle against the ever encroaching Microsoft Edge bullshit; every time you remove part of the bullshit, Microsoft comes out with an update that adds more.

If you're stuck with Windows I'd consider the safe defaults for ShutUp1x as essential but you do need to read the notes for every setting you enable, which may require some Googling so you understand what you're doing.


Does it work without an Enterprise install?


Yes, I'd even go as far as to say it's designed for use with Home and Pro; it sets the toggles enterprise/LTSC users will most likely already be managing through their centralized group management software.


Was Windows NT ever that unstable? I know 95, 98, and ME were all notorious for stability issues, but was under the impression that NT was better.


Windows 98 was unstable because its drivers and usermode software components still came from a time where they controlled every aspect of the computer.

NT solved that problem by not allowing a lot of that nonsense, breaking code in the process. This incompatibility is the reason new Windows 95/98 PCs are produced to this day (https://nixsys.com/legacy-computers/windows-95-computers, https://nixsys.com/legacy-computers/windows-98-computers): back in the Win9x days, programming your computer like you would program a microcontroller today was quite a reasonable thing to do for certain applications, like controlling production lines.

There is the uptime overflow bug to deal with, but a monthly reboot is easier than reverse engineering and porting control software.


> Windows couldn't run for 2 hours without crashing every so often.

That sounds like Windows 3.1, where applications could easily take down the operating system. Windows 9x wasn't quite as bad. If I recall correctly, properly written applications could not take down the operating system, though drivers certainly could. That said, there were certainly ways for developers to break the rules, since there was little (if any) enforcement, so some applications did take down the operating system. With the Windows NT series, there was sufficient isolation, and enforcement of that isolation, that it was very reliable. Drivers could be an issue, as could bugs in Microsoft's code, but that was nothing in comparison to contemporary versions of 3.1 and 9x.

On the whole, I don't think it is reasonable to blame Microsoft for the unreliability of their operating system. There were certainly design issues that resulted in it being unreliable, especially when running third-party code. On the other hand, the operating system was basically an evolution of a product line that started on the 8088 with very limited memory (I'm speaking of PC-DOS here), and a great degree of compatibility had to be maintained. Keep in mind, the computer industry did not move at today's pace: features had to wait until processors incorporated them, processor adoption had to wait for manufacturers to build them into their systems, and then consumers had to buy those systems in sufficient numbers. For example: the 286 was introduced in early 1982, but the IBM PC AT did not come out for another 2.5 years. Microsoft was also limited by the hardware their customers owned, even when it supported particular features. Life is much harder when you cannot throw memory at the problem because people had 2 or 4 or 8 MB of RAM.

On the other hand, Windows NT was a completely different product. There was much less concern over compatibility. There was much more intent to throw away baggage to create a modern (for the time) operating system. It did not crash every two hours.


I'm honestly not sure if that was Windows' fault. In that time period we also had:

1. budget devices from OEMs that cut corners at every turn

2. the capacitor plague, with merchants unable to guarantee good capacitors from any source


A core piece of enshittification, though, is that a product becomes Less Useful over time - Reddit and Twitter lose third-party apps, Apple is making its desktop OS more "secure" (read: convoluted and does less stuff) every release. The things you Liked about it go away.

Windows, despite its legitimately annoying monetization strategy, has absolutely done the opposite - it does More Stuff every release, and the stuff it did before largely still works.


> Apple is making its desktop OS more "secure" (read: convoluted and does less stuff) every release.

Do you have some examples of how macOS is doing less / capable of less today, than say 1 or 2 or 3 releases ago?


Adding ads is clearly doing "More Stuff" yet becoming "Less Useful". That is the most obvious counter example imo.

Another would be fragmenting the settings between the control panel and the new settings menu. It does more stuff (you have twice as many settings apps!) but it is less useful, because you are less likely to find the setting you are looking for.

Another example of doing more and becoming less useful is requiring a TPM for Windows 11. My security should be my decision. Not letting one install Windows obviously makes Windows less useful than if it could be installed.

In general (ie, not a Windows specific issue) ever growing hardware requirements makes the software less useful over time, as it can only run on a smaller and smaller subset of hardware. As software gets better, it should run on more hardware than it did before. Not less. Windows will simply not run on hardware from 15-20 years ago that is otherwise fully functional. That means it is less useful than it was before.


> is that a product becomes Less Useful over time

> it does More Stuff every release

I wouldn't say "doing more" is better. I'd be happy if it did a lot less. I don't care about most of the big new features in windows. I'd be a lot more happy they'd rework their old antiquated stuff that keeps making problems (drivers, registry, focus handling, etc. etc.).

> Apple is making its desktop OS more "secure" (read: convoluted and does less stuff)

What is Apple really making less useful with time? For me, I really like many of the new features. The only reason I stick to Windows is that gaming is still horrible on macOS.


> I wouldn't say "doing more" is better. I'd be happy if it did a lot less. I don't care about most of the big new features in windows.

There are two levels of features here (maybe three) that we should consider:

- There are consumer facing features, the stuff pushed by marketing departments since it will grab the attention of customers and (perhaps) make it more desirable for customers. A lot of this is targeted towards specific groups of users, while being less useful to others, and goes out of fashion very quickly (assuming it ever went into fashion).

- There is the infrastructure. This stuff is harder to sell users on because relatively few people care about the details. It includes everything from exposing functionality to developers to improving performance and security. Sometimes it turns out this functionality is only of interest to a limited subset of developers. Sometimes it is retrospectively seen as a problem that needs to be addressed. Either way, it is very difficult to alter or remove because other software depends upon it. (Heck, even internal software depends upon it. While they may have the means to update internal software, that doesn't mean they have the resources to.)

I'm tempted to split the second category into two, but the net effect is the same so we may as well keep it simple.

As for the Apple thing, well, Apple has a more focused market. Choosing Apple also tends to be a conscious decision, while choosing Windows tends to be more a default position. For those reasons, I have no doubt that macOS is a better OS in the eyes of its users than Windows is in the eyes of its users.


Here's a few examples - software requires signing and App Store accounts (literally called "Gatekeeper") which causes problems with OSS, you can no longer write kernel drivers on Arm64, many apps now require an avalanche of Vista-style "Do you want to allow this Thing Y/n" prompts, many other apps have to walk users through clicking into security settings to e.g. enable screensharing or productivity tools that use a11y hooks, the list goes on and on. Software on the Apple platform is becoming Less Useful over time and the list of things you can Do keeps getting smaller.


That's a really interesting take. Thanks!


A few years ago when I was there, there were still remnants of that same Dave Cutler NT culture, especially around the folks who worked on minkernel/.

I agree there are definitely shitty chunks of Windows, but there are still some very solid foundations there to this day.


For example, the Windows kernel's write-watch feature is useful for writing a GC. Linux lacked (and as far as I know, still lacks) this feature, so Microsoft had to rewrite parts of the .NET runtime.

https://devblogs.microsoft.com/dotnet/working-through-things...


> Linux lacked (and as far as I know, still lacks) this feature

AFAIK it is possible to do this on Linux (either through mprotect + SIGSEGV or userfaultfd) but it's slow. But there's a work-in-progress patch that the Collabora folks (probably on a contract from Valve if I'd had to guess, as some games do use this) are working on which will add a new fast way of doing this.


userfaultfd was ~2015 (https://lwn.net/Articles/636226/), so Microsoft couldn't use it at the time. It could be better, but yes, Linux is making progress.

Even in its current form, userfaultfd is useful for GC, so Linux's lack of the feature before 2015 was unfortunate. Android 13 added a new GC taking advantage of userfaultfd: https://android-developers.googleblog.com/2022/08/android-13....

> A new garbage collector based on the Linux kernel feature userfaultfd is coming to ART on Android 13... The new garbage collector... leading to as much as ~10% reduction in compiled code size.


A lot of the original designs of Windows were elegant in theory but never simplified and unified. COM objects are a good example, they were just a pain to deal with from languages at the time (and arguably still are).


Delphi and VB made COM relatively painless.


That’s indicative of poor bindings for those other languages.

The nice thing about COM is that it provides a well-defined, C-based ABI for calling object-oriented interfaces; if your language has a FFI that supports C, then you can call COM objects.

I’m a big believer that COM bindings for any language with automatic memory management should not expose refcounts directly to the programmer (at least in 90% of cases). It’s not far-fetched: the original, pre-.NET Visual Basic did a very good job of this.


COM was still a pain even when using MS languages. The WinRT era COM is a bit better but WinNT era was just needlessly elaborate. Everything was done with giant 128-bit GUIDs that were impossible to recognize or memorize, so they added a naming layer on top but it wasn't used consistently. COM servers had to be registered before they could be used, and that was the _only_ way to use them, so you couldn't just export some objects from a DLL and load them directly from that DLL (or at least it wasn't well documented, my memory is getting fuzzy about this stuff). Then you had the obscure thread safety approach in which instantiating some kinds of COM objects would create an invisible window that you were just expected to know about and then you had to write the boilerplate to pump the message loop, if you weren't doing the COM call on the UI thread. Etc.

The goals were good, and other platforms haven't really tried to achieve them (KParts and Bonobo were the closest equivalents but both were abandoned a long time ago, DBUS isn't quite the same thing). But COM was fiddly.


I switched from 98 to ME for a week before 2k, and it (2k) was rock solid for years.




