It really has gotten to the point where Linux offers the best option for a sane desktop experience. Watching Windows and macOS implode while KDE and Gnome slowly get better and better has really been something. Not quite at the point I'd recommend them for grandma and grandpa, but not that far off, either.
I've been using a Mac basically full time for years now, due to work. It's easily the worst UX and it's sort of shocking, after decades of hearing "it just works" or whatever. Hidden windows, hidden desktops, obscure keyboard shortcuts, etc.
I actually don't even know how to use the mac for the most part, I've learned to live in the terminal. I contrast this with Linux where I can just... idk, browse files? Where windows don't suddenly "escape" into some other, hidden environment, where I can just use a computer in a very sane way, and if I want keyboard shortcuts they largely align with expectations.
I was extremely frustrated while on a call using a mac. I made the video call full screen, which then placed it onto essentially a "virtual monitor" (ie: completely hidden). I had no way to alt tab back to it, for whatever reason, and I had no way to actually recover the window by any of the usual "window switching" means. I knew there was a totally undiscoverable gesture to see those things but I was docked so didn't have access to the trackpad.
I figured out if you go to the hidden dock at the bottom and select Chrome, as I recall, you can then get swapped back over to that virtual desktop, "un full screen" the window, and it returns to sanity.
Mac UX seems to go against literally every single guideline I can imagine. Invisible corners, heavily reliant on gestures, asymmetric user experiences (ie: I can press a button to trigger something, but there isn't a way to 'un trigger' it using the same sequence/ reverse sequence/ 'shift' sequence), ridiculous failure modes, etc.
I can't believe that people live like this. I think they don't know how bad they've got it, I routinely see mac users avoiding the use of 'full screen', something that I myself have had to learn to avoid on a mac, despite decades of having never given it a second thought.
MacOS definitely has its issues but this just makes it sound like you have different expectations of how an OS should work. Different isn’t always bad. Hiding applications is a pretty key concept in MacOS. Shortcuts are pretty straightforward? Cmd+H to hide, Cmd+Q to quit. Spaces aren’t hidden- there’s lots of ways to access them, but it seems you haven’t bothered to learn them. In your example pressing ctrl+right would have switched the first full screen space. You could also have right clicked the Chrome icon in the dock for a list of windows.
BTW the dock doesn’t have to be hidden, and idk if it was a typo but alt+tab isn’t a default shortcut. Command is the key used for system shortcuts, so maybe you should have tried that? Like yeah it’s different but that doesn’t make it bad. If you’ve been using it for 10 years without figuring that out…
---
I’m with you on the 1st party apps though, and the stupid corners on Tahoe.
I call it "alt tab" because that's how my brain maps the keyboard. The reality is simple - I struggled going from Windows to Ubuntu about 20 years ago but ultimately made it to the other side knowing how to use both well. With macs, I didn't. 10 years later and all of my adaptations are to avoid the operating system. In 10 years the main thing I've learned is how to get myself out of a jam and stick to the parts of the OS that don't feel like shit. I mean, it's not like I haven't learned these things, I know how to gesture, I know how to exit full screen, etc, it's not like I didn't ever learn, I'm explaining that the experience was dog shit.
Anyone is free to claim that I just didn't try, or didn't give it a fair shake, or perhaps I'm just some idiot who doesn't know computers or whatever.
Maybe I just think an OS should work differently, but okay? I've never said that I have some sort of access to a platonic ideal of objective operating systems and that macs don't meet it. I'm saying that I think it's bad and I gave examples of why. And I think I can easily appeal to my experiences seeing others use the OS - I don't think they find anything you're talking about appealing either.
> Hiding applications is a pretty key concept in MacOS. Shortcuts are pretty straightforward? Cmd+H to hide, Cmd+Q to quit. Spaces aren’t hidden- there’s lots of ways to access them, but it seems you haven’t bothered to learn them.
They're not talking about Cmd+H hiding or virtual desktops - those exist on Windows too. The issue is how macOS handles window placement with zero visual feedback.
For example, when you open a new window from a fullscreen app, it just silently appears on another space. No indicator, no notification. You're left guessing whether it even opened and where it went. The placement depends on arcane rules about space layout, fullscreen ordering, and external displays - and it's basically random half the time. You either memorize the exact behavior or manually search through all your spaces.
Years ago, they changed the behavior of the green button to be "fullscreen into a separate space." As someone who never uses spaces, this is never what I want.
You can escape it by moving your cursor to the top edge of the screen and clicking the green button on the titlebar that appears to exit fullscreen.
> Years ago, they changed the behavior of the green button to be "fullscreen into a separate space."
Not quite. It has the old behavior (grow to as large a window as supported) if the app does not support full-screen. For instance, the Settings app cannot grow wider, so it grows to full screen height.
The icon that appears when you hover over the green button reflects whether it is full screen or zoom behavior. If you hold option, you will always get zoom behavior IIRC. However, due to the green button being overridden to be a menu in Tahoe, the button icon may or may not reflect zoom/full screen behavior if you press/release option and may instead show the optional modifier on the options in the pop-up menu.
I do not believe there is a way to disable full screen behavior completely, nor spaces. However, I don't think I'd be able to survive working on a Mac without both so I haven't done a lot of investigation there.
In this case, because I had docked my laptop, the entire window moved to a virtual desktop that didn't actually map to a real desktop. Meaning that the video call continued in a virtual desktop that I literally could not see, that I could not mouse over. I don't know if that's just a multiple-monitor bug or whatever but the behavior is stupid even without that failure mode.
Apple presumes you have a multitouch pointing device. You can three-finger-swipe between spaces. I know there's a keyboard equivalent, but you'd have to look it up.
It used to be that Macs shipped with single-button mice so the user wouldn't need to know which one to click, but now we have to know how many fingers to use and in which direction to swipe. So much for discoverability.
It’s certainly “bad design” if we’re designing specifically with the OS convert who has a grudge against trackpads as the target user. But multitouch and its functionalities has been a fundamental part of macOS for nearly two decades now. For better or worse, a traditional mouse makes about as much sense for a macOS environment as it does for an iPad at this point. It’s workable, and it has certain advantages, but it’s really not recommended as your only pointer. At best, it’s used in tandem with a trackpad.
Most of the input devices that Apple sells on their website don't have multitouch, including 0 keyboards and only one of the mice. Many of the photos on the site for each of their non-iMac desktops include full setups that don't have a magic mouse or separate touch pad. The Mac mini and Mac Studio don't come with any input devices, and don't say anywhere that multitouch is recommended (closest is some language clearly marketing it as an up-sell on the Studio, "Take your creativity to the next level [with extra purchase]").
You're making multiple desktops sound very confusing when it's really not. Every desktop OS has them and macOS' implementation is quite good. You want bad virtual desktops, try Windows.
It sounds like you don't actually want the app in fullscreen. Fullscreen is "I only want to be in this one app window with no distractions." I pretty much only use it for watching videos.
If you want the window taking up the entire screen while staying on the desktop, double click the window chrome and it'll expand to fill the screen. And if you want the dock not taking up space, there's a setting to auto hide the dock (which I always enable)
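For what it's worth, that auto-hide setting can also be flipped from the terminal. A sketch using the well-known `com.apple.dock` preference keys (exact behavior of the delay/animation keys may vary across macOS versions):

```shell
# Auto-hide the Dock so it only appears when you mouse to the screen edge
defaults write com.apple.dock autohide -bool true
# Optional: remove the hover delay and shorten the slide-in animation
defaults write com.apple.dock autohide-delay -float 0
defaults write com.apple.dock autohide-time-modifier -float 0.4
# Restart the Dock process so the new preferences take effect
killall Dock
```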
> It sounds like you don't actually want the app in fullscreen. Fullscreen is "I only want to be in this one app window with no distractions." I pretty much only use it for watching videos.
I do want that. Every other OS has no issue with what I'm describing. Who said I don't want distractions? I want the video content to be expanded as widely as possible, that is what "full screen" means. Who said "full screen" means a separate desktop?
Ridiculous tbh
> And if you want the dock not taking up space, there's a setting to auto hide the dock (which I always enable)
> The fact that a full screen window creates a whole new virtual desktop is hilarious and I dare you to justify it.
I can kind of see the idea here. The alternative is that all the other windows in the working desktop get hidden behind the fullscreen window. That's pretty bad UX. I personally avoid it on Linux by always moving a window to its own desktop before fullscreening it.
That said, the implementation is awful, and exposes the rotten foundations of Mac's window management paradigm.
IMO floating windows always fall apart and should be reserved for modals and transient dialog boxes only. Everything gets a lot easier to understand when applications can't occlude one another or occupy the same space.
> The alternative is that all the other windows in the working desktop get hidden behind the fullscreen window. That's pretty bad UX.
How? It means I could have a full screen video and then overlay something smaller over it, or maintain my alt-tab behavior as it plays in the background, etc. I'd maintain the same UX. Why would full screen have such a weird behavior?
You're right that it's more consistent to have windows behave as you describe, and Windows and Linux both treat fullscreen windows this way. I posit that Apple cares more about not hiding windows behind others than it does consistency. This also shows with their new window placement algorithm that results in an absolute mess of windows all partially occluded but with some corner or edge peeking out of the stack for a user to visually identify and click to focus/being-to-top. Compare to Windows that (at least when I last used it) opens new windows at a slight diagonal offset from the last focused window, almost like building a neat deck of cards. Apple's ethos is also on display in the design of Stage Manager, which groups windows into these messy clumps and creates a visible shelf to swap between window bundles. Everything is optimized for hunt-and-peck visual users. If you're the type to organize your windows and workflows then you're fighting the system.
Love Linux, been using Manjaro with Gnome for the last 10 years, but need to use Mac on my current job, so I tried to approach this constructively and work around the rough edges:
* Rectangle Pro for window management
* BetterDisplay for better picture on non-4k displays
+ a couple of more similar tools
+ retraining muscle memory from Ctrl to Cmd and Emacs-y instead of Windows-y shortcuts
Feels okay now. Plus native ms365 apps, smooth sleep mode, great hardware and great battery time -- mac has its sweet spots as well.
Or you could maybe learn how to use the OS, in linux lingo RTFM. I don't want to be rude, but the critique was very flippant, the arguments vague, all about expectations based on years using a different OS, doesn't seem you want to give it a fair chance.
I gave both generalized and highly specific cases where I felt the UX failed. I referenced principles of UX as well as literal "here is what my experience was in a concrete story".
> , all about expectations based on years using a different OS
No? I mean, again, funny. I explained how I've been using MacOS for years. Actually a decade, now that I count it out.
Plenty of people use an OS for years without learning. And you admitted to spending time in the terminal, which indicates lack of will to try and learn macos shortcuts, gestures, windowing model, spaces, and so on. And the comment used sweeping generalizations, without referring to any specific principles broken which aren't just personal dislikes or unfamiliarity with a different way of doing things.
> I gave both generalized and highly specific cases where I felt the UX failed.
No guidelines named, no principles defined. No comparison standard is established.
The earlier fullscreen story is a specific case, maybe a discoverability argument, but not proof that the UX violates every principle. MacOS spaces and fullscreen apps follow a workspace concept; it's not a window resize mode.
> Asymmetric user experiences
What’s asymmetric is not the command — it’s the spatial context. The claim that it’s violated is arguable.
> Heavily reliant on gestures
Not sure which guidelines this breaks, but every gesture has a keyboard shortcut alternative, there is mission control key, menu bar, dock.
> And you admitted to spending time in the terminal, which indicates lack of will to try and learn macos shortcuts, gestures, windowing model, spaces, and so on.
It indicates no such thing, other than that my preferred UX on a mac has landed on the terminal. It doesn't indicate whatsoever that I never tried to learn, or that I haven't learned, unless you presuppose that learning would necessitate using the computer a specific way.
Indeed, I have learned quite a lot of the various gestures, spaces, etc, unsurprisingly. I avoid them because they suck, and the learning experience was shit.
> And the comment used sweeping generalizations, without referring to any specific principles broken which aren't just personal dislikes or unfamiliarity with a different way of doing things.
All design principles are going to boil down to personal dislikes lol but no, nothing was "unfamiliarity" you can stop saying that thanks.
> No guidelines named, no principles defined. No comparison standard is established.
I could cite guidelines if you think it would help. Microsoft released a UX guideline years ago justifying why magic corners etc are a bad idea. Of course, they obviously don't follow that guide these days. What would you like?
I'm not interested in debating this. I'm perfectly fine with how I've expressed myself, I'm just not motivated enough this late in a Friday to get more detailed, so you'll have to just try to decipher what I've said and find if there's value to you or reject it, which I think is your prerogative.
And if you bring up these points to an Apple fanboy, they'll tell you that "you just don't get it" or "forget all the 'bad Windows habits' and just learn the Apple way of things. It's soooo intuitive!!".
> "forget all the 'bad Windows habits' and just learn the Apple way of things
I mean I'd be willing to say I don't get it, because I sure as fuck do not get it. But I think I'd absolutely reject the "forget all the other stuff, learn this". It's been literally years on a Mac. I remember the frustration of going from Windows to Linux, I look back at that adjustment and laugh, it's hilarious to me that that felt frustrating when I contrast to my Mac adjustment. At least the Linux adjustment was tractable, the Mac adjustment is a total joke.
I actually suspect that people don't "adjust" in the sense of learning how to do things with a mac but instead adjust to not doing things with a mac, like how many mac users I know of outright say they just don't use full screen mode because it's confusing.
And yes, the fullscreen mode is the perfect example. It is so shockingly poorly implemented that I almost never use it. Even if someone thought it was 'good enough', that doesn't change the fact that there is a forced transition animation when going to/from fullscreen that is unreasonably slow and awkward.
I actually like the concept of an app in full screen creating a new virtual desktop.
I feel like it’s really intuitive when you switch desktops with the trackpad.
It’s just incredibly poorly implemented, like all the window management on macOS.
Disclaimer: I've owned MacBooks since 2010 and I have seen macOS rotting update after update. To me they achieved a really mature and pretty well-thought-out OS with Snow Leopard and it's been slowly rolling downhill since then.
I can totally say that KDE AND Gnome AND Cinnamon AND Sway AND even the immature Niri are all better experiences than macOS.
Agreed. On MacOS, I use a variety of smaller apps and scripts to make it less awkward, e.g. Karabiner, BetterTouchTool, Hammerspoon, and, of course, "Alt-Tab" (https://alt-tab-macos.netlify.app/). I am even contemplating starting to use a dedicated window manager, such as Aerospace (https://github.com/nikitabobko/AeroSpace/). But all of this is a massive time investment.
Personally replaced Windows 10 with Linux Mint on my very computer illiterate mother in law's laptop a few months back. Haven't heard any complaints so far.
Linux is ready for prime time for anyone not bound to Windows/MacOS software.
Personally, I'm still on MacOS for work, but all my personal devices run some form of Linux. It's been liberating to say the least.
I set up windows 11 on a laptop for my dad so he can read emails and browse the web. Came back 3 months later when he told me he couldn't see the PDF files anymore. Turns out he installed THREE different PDF viewers that he randomly found on google, they installed tons of bloatware/spyware, replaced browser toolbars and searches etc. to a point where I decided to just restore from a recovery point. Told him not to download weird stuff (again) and ask me when he needs help.
At that point I questioned myself: I really should have installed linux for him.
This is still a thing? Browsers still have toolbars???
My go to for family is giving them no install rights, and adding a remote desktop app for me to connect to them when they need something to install.
I don't get called very often anymore, and when I do, it's for their work computer or something, to which I say, talk to your IT department, I can't fix that.
Browsers today view and can do limited editing for PDFs. No need for a dedicated reader. One does need a dedicated authoring tool if you need to create PDFs from scratch. Most OSes support print to PDF as well if you only need conversion.
I've never seen a website break because of ublock, at least not in the default config. If it's that much of a problem you can just remote in on grandmas computer and disable it for whatever website.
I think that beats remoting in when granny inevitably gets scammed by an ad.
There really is no excuse in my mind for not running an ad blocker. It's as vital to personal computing security as firewalls and anti malware.
Blocking ads helps grandma not accidentally leak private information that could have disastrous consequences, for example, getting scammed out of their money.
Not blocking ads helps grandma visit a few more websites that don't work well with adblock.
> Linux is ready for prime time for anyone not bound to Windows/MacOS software.
I suspect in order for this to be true we'd need a PR campaign that can shift culture on the scale of civil rights.
I'm not trying to be hyperbolic or deride Linux or anything—I agree that technologically it's probably ready. Overall UX I'm slightly skeptical. But the far bigger problem is culture.
There's already been a shift away from "PCs" among younger people. The majority of my kids friends have never touched a "regular computer." I've heard an unsettling number of reports of new hires who have never heard of a spreadsheet.
I'm bringing this up because if kids aren't using PCs as much in the first place and quite literally don't know what an operating system is (and please challenge this assumption; I'm going off of anecdata) it's going to be even harder to try to create cultural awareness and acceptance of linux.
But even disregarding that there would need to be a massive, massive coordinated campaign to create a real culture shift. I'm talking superbowl ads.
Again, not trying to be pessimistic, I'm trying to say that "ready for prime time" at this point has little to do with engineering or even design and far more to do with PR. Once I started launching my own products I quickly discovered (as everyone does) that making the thing is like 5% of the job and the remaining 95% is marketing.
The frustrating thing is that developers are some of the most reluctant to change. I'm sick of fighting docker on my Mac among the many other problems. But if we can't break away nobody else is going to either.
I mean yeah, Chrome and Firefox both run on Linux. And that covers 99% of what most "normies" need.
It's funny when people say Linux is difficult for their grandparents or siblings, when that's the place it covers best. And it keeps them from calling you about random adware/spyware/viruses they accidentally installed.
It's prosumers and professionals that have more issues with Linux, because they tend to rely on proprietary software that's problematic to install/use.
Before she passed, I had one of my grandmothers on Ubuntu for about a decade... I had to set it up for her, and I ran updates every few months for her, but she really didn't have an issue... Her Windows 9x-era games even ran under Wine when they wouldn't load correctly on Windows (7, I think).
Email, browser and a few games... she was pretty happy with it.
I was so close to getting my parents to switch to Ubuntu in the late 2000s. It stuck until my dad needed some piece of software on the home PC for work that only worked with Windows. Today, they have iPhones and they think it will be more convenient to have a Mac to "sync things". Oh well...
Gnome Shell in particular offers a ridiculously coherent, sane window management. Nobody agrees with all the choices the Gnome Team took to get here, but it sure is nice there being one way of doing everything that makes sense contextually.
I don't even know if Gnome and Gnome Shell are the same thing. One thing I do know is the default install of Gnome on Debian 13 leaves you without a dock, without a system tray, and without minimize/maximize buttons. They purposely remove the three most important tools the average user relies on for navigation.
It's like trying to make a car without any round edges because "square edges are better". Good luck with the wheels!
I can fix that somewhat with extensions, but every normal person I know will take one look at the defaults and abandon it. That's a reasonable choice in my opinion. Why use something where the first interaction gives you a clear indication you're going to be fighting against developer ideology?
If you want to customize your DE a lot - Gnome isn't for you.
If you just want a clean and productive environment by default... Gnome is great.
Once you stop fighting it, sigh, and go with the flow... modern Gnome is genuinely pleasant in that I spend almost zero time thinking about it, and shit just works.
I still run other DEs for some specific purposes where "general use" isn't the goal, but I can reliably hand non-technical family members a machine with Gnome and they don't have to come ask me a bunch of questions.
My problem with GNOME (after having used it as my main desktop on my Linux systems for many years) is that it removes some really useful features and they are not just expert features, but also features that non-technical users are used to, such as system tray icons and menu bars. You can bring them back with GNOME Extensions, but for instance, the system tray icon extensions are very buggy.
KDE on the other hand just has these and is also great out-of-the-box (I pretty much run stock KDE).
I've been using KDE for a decade and I completely agree. It used to be only better than GNOME because I could remove features from it and now I run completely stock KDE and it's solid compared to anything else.
I bought an SBC that booted into Gnome on the official disk image, and it didn't recognize my mouse. It was entirely unusable. In applications that were part of Gnome itself, like the settings menu, it was impossible to navigate using tab and arrow keys.
>settings menu, it was impossible to navigate using tab and arrow keys.
Huh? All you need is tab and the arrow keys to navigate the GNOME Settings app. I'm literally doing that right now. Maybe it was a later addition but it works perfectly fine in GNOME 49.
Anyone who lived in a browser was fine a decade ago.
At this point... it's basically anyone who doesn't want to play competitive mp games with poorly implemented anti-cheat, or who doesn't have niche legacy hardware (ex - inverters, CNCs, oscopes, etc).
Steam tackling the gaming side of things has basically unlocked the entire Windows consumer software ecosystem for linux. It's incredibly easy to spin up windows only applications with nothing but GUI only software on most distros at this point.
Crazy how much better a system with a modern linux kernel and Gnome or KDE is than Windows 11. I'm at the point where I also prefer it to macOS... which is funny since I think Gnome was basically playing "copy apple" for a bit there 5 years ago, but now has really just become the simpler, easier to use DE.
In the past few years, I’ve started to develop a form of “upgrade dread” when it comes to OS upgrades. What are they going to enshittify now? What are they going to drop support for now?
This somehow excluded Linux and its DEs, and I eagerly read any news, changelogs, and announcements in this space. They’re still not perfect in every aspect, but at least I see things improving instead of public turf wars between departments trying to improve their KPIs.
Why is there an extra URL handler for MS Edge that bypasses the default browser config? Why is the search bar this wide in the default taskbar config instead of showing a simple button? Why are local searches always sent to Bing with no easy way to switch it off or change the search provider?
> I’ve started to develop a form of “upgrade dread” when it comes to OS upgrades.
I've been going the other way on Linux.
I used to think it might be wise to postpone updates if you were traveling, especially using a rolling distro. Today, I would be quite confident running the updates 10 minutes before leaving.
Granted, this is also because I'm more confident than ever that I could fix most breakages, and worst case the smartphone is there, but I've also not seen big breakages for years.
I have a somewhat opposite experience. I also use a rolling distro, and in the past six months, I've seen wine break, and I've also seen Citrix Workspace break due to a dependency problem (perhaps Mesa?). Granted, these two cases are somewhat unusual because Citrix Workspace is closed source and the software I'm running with wine is also closed source. I rarely experience breakages of open source software other than GNOME extensions.
Yep. I run NixOS unstable-small on my ThinkPad and there is rarely breakage in daily updates. If it ever happens while on the go, I can just boot into a previous generation. The immutable OSTree/bootc distros are similar, as well as openSUSE, which uses btrfs snapshots on updates.
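For anyone unfamiliar with NixOS generations, a rough sketch of the rollback mechanics (commands from memory; double-check against the NixOS manual for your version):

```shell
# List the system generations that also show up as boot menu entries
sudo nix-env --list-generations --profile /nix/var/nix/profiles/system
# Switch back to the previous generation without rebuilding anything
sudo nixos-rebuild switch --rollback
```

The same recovery is available non-interactively at boot: every generation is a separate bootloader entry, so a broken update never costs you a working system.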
Given that a lot of things happen in the browser, I think it wouldn't be too crazy. There are even distros that look like Windows if you're after that. What part of it do you think isn't ready for this scenario? (honestly curious)
There are no good options for grandma these days. I've been helping my 85-yr-old mother with her computer stuff (she has an iMac) and there's so much user-hostile, broken stuff--not just on the Mac itself, but many of the internet-based services she has to use--it makes you want to take a baseball bat to the whole affair.
Similar experience here: I set up Debian stable for my 76 yo mother, and for a 79 yo friend. Works like a charm, and the 2-year release schedule is perfect for people who don't care about bleeding edge and would rather have stability.
Unattended security upgrades keep it secure, and in my experience a bit of initial “locking things down and simplifying” is valuable, but after this it’s smooth sailing compared to other older folks I help with Windows systems where MS is constantly throwing at them insane bugs, complete UX changes, ads, or Copilot everywhere.
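The unattended upgrades piece is a one-time setup on Debian; a minimal sketch (the shipped defaults already track the security suite, so the policy file usually needs no edits):

```shell
# Install the unattended-upgrades package
sudo apt install unattended-upgrades
# Enable the periodic apt timers via the low-priority debconf prompt
sudo dpkg-reconfigure -plow unattended-upgrades
# Fine-tuning, if ever needed, lives in:
#   /etc/apt/apt.conf.d/50unattended-upgrades
```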
If you want to compare on the basis of microissues like this one, then note that KDE Plasma has exactly the same issue with the resizing area of rounded-corner windows as the one pointed out by TFA.
Learnability and usability are related but independent concepts. This feature is difficult to find out about unless told, relatively easy to remember once you first learn about it, and very intuitive once you try it out. Dragging from the corner is slightly easier to stumble upon or infer, but most people will learn about it because they were told about it. The alt drag version (both for resizing and moving windows around without having to go to the title bar) is in addition, becomes second nature quickly and is significantly nicer once you learn it.
The main problem is that Apple wants to be opinionated. Linux is the polar opposite of that. People used to say the latter is bad, but it turns out the former is way worse (many hackers of course already knew this).
> Not quite at the point I'd recommend them for grandma and grandpa, but not that far off, either.
But at this point grandma and grandpa are the only ones I'd recommend to use Apple devices.
Opinionated design was great back when Apple's Human Interface Guidelines were based on concrete user testing and accessibility principles. The farther we get from the Steve Jobs era, the more their UI design is based on whatever they think looks pretty, with usability concerns taking a back seat.
It was good because it was both Opinionated (in other words, the path to write software that follows the design was easy, and the paths to write software that violated the design were hard), and also well-researched by human interface experts.
Now what we appear to have is "someone's opinion" design. A bunch of artists decided their portfolios were a little light and they needed to get their paintbrushes out to do something. I don't work at Apple, but my guess is that their HI area slowly morphed from actual HCI experts into an art department, yet retained their power as experts in machine interaction.
So here we are, we still have Opinionated design, but it might just be based on some VP's vibes rather than research.
I don't like to paint Apple as being completely incompetent (but damn have they been screwing stuff up), but I do think trying to solidify the experiences around a common codebase has become untenable. The idea is great in theory - write one app that works on macOS, iPadOS, iPhoneOS, visionOS, etc. What a time saver that is for developers - but the problem is that screen sizes and interactions vary across those platforms. Resizing a window with your clunky finger needs a bit more wiggle room, while a pixel-precise mouse or touchpad is a lot different.
Opinionation (heh, opinionatedness?)'s value is entirely different depending on the user category.
Hackers by and large don't want opinionated, because they're willing to spend the time configuring & customizing AND have the knowledge to do so.
Just about everyone else (as far as I can tell) very specifically does not want this, and for those who do, the amount of customizability e.g. macOS offers is enough. Having an immediately usable computer (recent problems notwithstanding) is of much greater value.
So when you say "The main problem is that Apple wants to be opinionated" I can only conclude that you're coming at this from the 'hacker' POV. But I may be misunderstanding your comment.
> Thus if you create a system tailored at the average user, then none of your users will be happy.
This is objectively false. Some users will find some frustration or unhappiness with certain things, but remarkably, there are plenty of software UIs that get the job done for just about everyone.
This has to be sarcasm. Either that or you have never used KDE or Gnome even once in your life. No Linux DE is anywhere near as polished as the one in macOS. You have to spend hours customizing KDE or XFCE to get them to function even halfway near what an average user would expect. Gnome is okay but so bloated, and even more opinionated than macOS or Windows.
This is definitely not the case, and I invite anyone reading this comment to install a Linux distribution themselves in a VM or something to find out via direct experience. Fedora is a good place to start in my opinion.
Window management mostly works fine, but app design is years behind.
KDE Dolphin has a static toolbar like Finder, with its config menu being two lists like some Microsoft toolbars, and the available items list is sorted alphabetically.
The view switcher isn't one control but multiple separate flat items, each named directly after its corresponding view type: one called List, another called Icons, and so on.
So if you want a Finder style view switcher, you first need to know it exists beforehand because the naming is confusing, then you need to know how many views are available beforehand because they're separate items, and finally you need to hunt them down because the list is alphabetical.
This is pretty much the quality you can expect when using KDE software.
Another example is breadcrumbs: the current folder doesn't get an arrow, so you can't drill down into its subfolders from the breadcrumb bar (short of expanding folders some other way), unlike on Windows 7. Sidebar favourites also replace the top folder, so if you browse the home folder with one, you'll often find yourself suddenly unable to use the breadcrumb bar.
Of course, I've been using Cosmic for most of the past year now... It's getting better, but there are still some rough edges: the launch bar still doesn't feel quite right, and there are still times when keyboard navigation doesn't work quite right or smoothly.
> "...while KDE and Gnome slowly get better and better"
These projects have been around for literally decades and really haven't changed much during that time. I think what you're noticing is that Linux desktops are as good as they always have been, but since Apple and Microsoft keep messing with theirs for marketing reasons, in comparison it seems that Linux GUIs are improving.
This is just not true at all. Yes Gnome and KDE are old, but they've changed SIGNIFICANTLY.
Gnome 2 => 3 was a bigger and more ambitious transition than anything Microsoft has done. Except maybe DOS => NT. Same thing with KDE 3 => 4.
KDE gets new features on a very regular basis, and they're not just, like, little checkboxes added here or there. No. They're huge changes. New system resource monitor, new notification center, new widget editor, new panel editor, window tiling... the list goes on. And that's just, like, the past two-ish years.
Linux GUIs are improving, and rapidly. Before, they were close, but the gap keeps widening. At this point, KDE is so unbelievably far ahead of Windows in terms of UI, UX, usability, performance, and feature set that it doesn't seem fair. I don't know if Microsoft can catch up, and if they could, it would take multiple versions of Windows.
I've actually bought a Mac Mini, which I use for media consumption, and run it beside my Linux (CachyOS) gaming PC. I have a Jellyfin server, but the media client for Linux is totally broken.
And when you use an Nvidia card, you really have to do a deep dive on which settings and which render backend you want to run. I now have a stable setup running KDE Plasma on Wayland that allows games to run smoothly, but it took me a while to figure out.
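For what it's worth, much of that deep dive tends to come down to making sure the NVIDIA driver exposes kernel mode setting before Plasma's Wayland session behaves well. A minimal sketch of the checks involved (assuming the proprietary driver and a GRUB-based boot setup; file paths may differ on CachyOS):

```shell
# Check whether nvidia-drm kernel modesetting is enabled (prints Y or N):
cat /sys/module/nvidia_drm/parameters/modeset

# If it prints N, enable it on the kernel command line, e.g. in /etc/default/grub:
#   GRUB_CMDLINE_LINUX_DEFAULT="... nvidia-drm.modeset=1"
# then regenerate the GRUB config and reboot:
sudo grub-mkconfig -o /boot/grub/grub.cfg

# After logging back in, confirm the session type (should print "wayland"):
echo "$XDG_SESSION_TYPE"
```

This is a configuration sketch, not a complete recipe: newer driver versions enable modesetting by default, and distros that don't use GRUB set kernel parameters elsewhere.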
The Linux community also, quite frankly, sucks. When you need to figure something out, you really need to make it a study, and only if you know the correct jargon are you deemed worthy of help. Otherwise you're bombarded with RTFM comments.
My mother and younger sister both prefer it over the default Windows 10/11 design. Mum says it "feels similar to my phone [pure Android 12] yet I can do so much more".
Given that my sister only really needs Steam Big Picture, and everything my mother uses is already in Flathub or defined in a Nix flake, they didn't experience any ecosystem issues.
I don't love all the new Tahoe stuff, and I do wish I could roll back, but this hand-wringing around Apple is way overblown IMHO. What he is reporting is real, but in my actual usage I haven't noticed this at all. In other words, if this wasn't called out, I'm not sure I would have ever realized it.
Tbh I have always found window management on Macs annoying and something to be avoided. Rectangle or something similar is one of the first things I install, and I use its shortcuts to just put windows in a quarter or half of the screen.
That said, I use Macbooks for the hardware, if for whatever reason I had to switch to Linux I would just shrug and not care one bit. It took me a few years to realize, but MSFT just disappeared from my life one day and I didn't even notice.
Also, for some reason KDE renders everything super fast and smoothly on my 120Hz 4K display, whereas macOS on Apple Silicon often stutters (no, it's not the Electron bug). The tables have really turned: when I first switched to macOS on the desktop in 2007, its GPU-based rendering was insanely good compared to... pretty much everyone else's.
Rather than evolutionary improvements we get Liquid Glass and ads in iWork applications. The enshittification has started I guess.
Sorry but you clearly haven't used macOS. Linux on the desktop is still about 15 years behind, and I tried it recently. It's such an inconsistent experience it's almost hilarious.
Speaking as a Tahoe user by the way who is not experiencing any issues to speak of (on 26.0.1 - and I can't reproduce the resizing inconsistency either). I've been using macOS since 2003 (back when it was called Mac OS X) and before that I was a Linux desktop user since 1996.
I used macOS as my daily driver from Tiger to last year, actually. I don’t know what the inconsistencies you’re referring to are, but I certainly prefer them to cloud account nagging and constant attempts to monetize user behavior, which is the modern macOS experience.