> Windows users would not switch simply because they didn’t need Linux.
Not really. The author identified the real reason earlier: most users never even think about the choice of OS, which is what switching would require. They only ever use whatever comes pre-installed on the computer, and that's it. They don't much care what it even is.
Therefore the main blocker wasn't bad advocacy or anything of the sort, but the old and crooked stranglehold of MS on manufacturers. The only way to seriously increase Linux desktop adoption is for the availability of computers with Linux pre-installed to rise significantly. And MS uses its heavy leverage over Windows pricing to prevent that.
> and the desktop itself became irrelevant
I hear this often, but it remains as wrong as ever. Desktop usage has its place, and it's not going anywhere in the foreseeable future.
> They still just want something that plays games and checks the Internet.
And Linux is good for that already. So, as above, that's not what prevents its adoption.
> Therefore the main blocker wasn't the bad advocacy or anything of the sort, but the old and crooked stranglehold of MS on manufacturers. The only way to seriously increase Linux desktop adoption is for availability of Linux pre-installed computers to significantly rise.
Plenty of companies have tried offering Linux pre-installed and almost all of them have failed for various reasons. The main reason is that people don't buy them. Either they are perceived as being too hard to use, or they don't easily run the business software or games that users typically want, or buyers are confused by Linux having 157 variants, or they are happy with Windows, or whatever.
The other reason is that manufacturers can make money selling Windows PCs but can't make money selling Linux machines, partly because of the extra costs of hardware qualification, drivers, increased stock control, accounting, book-keeping and advertising costs, and massively increased support costs... unless support is either denied or sold separately as an ongoing cost.
Of course, Linux is getting a free ride on the back of Microsoft's compatibility work. If you are using Linux then you are benefiting hugely from the cost of parts being driven down by Windows PC sales. (Apple switched to Intel partly to gain the same benefits.)
> MS use their heavy leverage of Windows pricing to prevent that.
One of the settlement terms in the anti-trust case was that Microsoft was not allowed to do deals on Windows pricing, so that all the leading OEMs were on a level playing field.
This settlement ran for roughly a decade. Did you see a huge rise in Linux PC shipments? No. The only real difference was that it pushed up the average price of Windows and made Microsoft more profitable.
The anti-trust case revealed that MS threatened to raise the Windows price for manufacturers who sold computers without an OS. That's another indicator of such blackmail. I don't think the court did anything about that.
If everything had been clean deal-wise, there wouldn't have been the mess of the Windows tax and the never-ending tying problem.
Yes, it did. The DoJ fixed the price of Windows for all the major OEMs, and all sales had to be made via the same OEM-only website.
I don't appear to have a record of how many OEMs that applied to (or else I can't find it), but the number I remember is 40.
This had the very amusing effect of increasing the price that IBM paid for Windows. IBM went to court to complain that Microsoft had tripled its price, but as far as I recall, that price increase was from $9 to $27.
Either way, nothing stops anybody from setting up a company to sell Linux PCs -- you could do it yourself. Is there really nobody in America who can sell Linux PCs at a profit, when the OS is free?
Since that's evidently the case, blaming Microsoft just looks like a fig-leaf for some more fundamental failure(s).
> Is there really nobody in America who can sell Linux PCs at a profit, when the OS is free?
Dell sells the XPS 13 Ultrabook with Linux pre-installed pretty successfully. Tuxedo is another Linux HW vendor. A lot more devices are listed at http://linuxgizmos.com
Note though that Dell's Linux offerings are commercially supported distros--so Linux is not "free (as in beer)." And I can't really imagine that shipping a laptop with upstream bits and saying "we won't take your support calls" would go over well.
> Note though that Dell's Linux offerings are commercially supported distros
What's the problem with that? Why should I use a different distro if the installed one works perfectly with all HW components? If I really needed another distro I could use Docker or the like on that same device.
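As a rough sketch of that approach, assuming Docker is already installed on the pre-loaded system, another distro's userland can be tried in a throwaway container without touching the host install:

```shell
# Run a disposable Debian userland on a machine that shipped with,
# say, Ubuntu. The container shares the host kernel, so hardware
# support is unaffected; only the userland (libc, packages) differs.
docker run -it --rm debian:stable bash

# Inside the container you get Debian's own package manager, e.g.:
#   apt-get update && apt-get install -y htop
```

This only covers userland differences, of course; it won't help if what you actually want from the other distro is a different kernel or desktop environment.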
No problem at all. But the question was about Linux being free. And it's a good bet that a hardware manufacturer probably isn't going to ship with a community-supported distro with no support.
In the UK, RM Ltd launched the original 7in Asus netbook into the education market with two options. (1) Windows XP with normal support. (2) Linux with no support except "reset to factory condition".
Dell UK tried offering Linux with no support, but with the option to buy third-party support from Canonical. The cost of support more or less excluded home and education buyers.
There's frequently speculation that Dell will discontinue or stop updating it, which doesn't suggest huge success. It's also just one PC model from only one major OEM, and I believe it's mail-order only.
I wish it were different, but right now there's not much evidence there's a significant, existing unserved market for (GNU, not Chromebooks) Linux PCs.
I do agree that it should be noted that people specifically want GNU to be big. Chromium OS and Chrome OS are fantastic operating systems that are very capable. And it's Linux. No, you cannot "apt-get install GTKapplication", but it's still very Gentoo underneath the GUI. Of course you still want a more general purpose distribution for general purpose needs.
> It's also just one PC model from only one major OEM, and I believe it's mail-order only.
I don't consider that a problem, since Linux has always been focused mainly on developers. Linux is a top choice if you want a reliable workhorse that doesn't annoy you with unwanted features.
Usual users are mostly gamers and surfers. They get along with the limitations of Windows and iOS, so they don't need Linux, and that's the actual reason for the small percentage of desktop Linux.
A big problem is that the Linux world loves to fragment itself.
There is no "Linux" to bundle. You have to pick a distro.
On the surface all major distros look like a good choice, but for unclear reasons they're all incompatible with each other despite doing the same things in the same ways.
For techies it doesn't matter because they can get any distro to work if one works. But it matters a lot for non-techies because things like tribal knowledge and even packages are all incompatible.
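To illustrate that package incompatibility concretely: even installing the same development library means different commands and different package names on each major distro family (the package names below are the common ones, but exact names vary by release):

```shell
# Debian/Ubuntu (dpkg/apt family):
sudo apt install libssl-dev

# Fedora/RHEL (rpm/dnf family) -- same library, different name:
sudo dnf install openssl-devel

# Arch (pacman) -- headers ship in the main package:
sudo pacman -S openssl
```

For a techie this is a minor annoyance; for a non-techie following a tutorial written for a different distro, it's a dead end.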
> A big problem is that the Linux world loves to fragment itself.
I have been using Linux for over 25 years. The "big" problem actually turned out to be a big advantage, since I was never forced to go along with any immature mainstream.
For instance, when the KDE 4.0 and Gnome 3.0 disasters arrived, replacing the excellent KDE 3.x and Gnome 2.x desktops, I could switch to another Linux desktop without any hassle.
This would have been impossible with Win 8. Windows users had to endure the pain (someone called it an "epic fail") until MS recognized they had made a mistake. Now they have to get used to Win10, which loves to phone home. They have no other option, except switching to OSX or Linux.
Too bad 99.9% of the time Windows gets the UI right enough for most users. From what, Win 95? the first time complaints about the UI became mainstream was Windows 8 ("It looks like it's for touch screens!"), and from 8.1 onward those complaints faded back to being mostly from power users. Through all that I can still run a UI application written for Win 2000 and have it work fine.
I'd be fine if I had to deal with one Linux desktop that took a wrong turn for a year or two if it meant that desktop would provide backwards compatibility to the 1990s instead of breaking API compatibility whenever it seems cool.
> Too bad 99.9% of the time Windows gets the UI right enough for most users.
True for Win7, wrong for everything after it. I would have agreed if you had mentioned iOS. Anyway, today Android has the greatest market share, and that's running Linux.
> I'd be fine if I had to deal with one Linux desktop that took a wrong turn for a year or two if it meant that desktop would provide backwards compatibility to the 1990s instead of breaking API compatibility whenever it seems cool.
Breaking API compatibility has always been a problem in the Windows world (first DOS, then Win16, then Win32, then .NET, now Metro).
The Linux/Unix world is much more sustainable. You can likely still use applications which were written twenty years ago.
True since Win9x to my knowledge and only wrong once during Win 8 for most users (as I said in my comment)
>Anyway, today Android has the greatest market share, and that's running Linux.
And it also has horrid fragmentation, with basic UI APIs so broken across individual versions of the OS that they need to ship a support library to work around device-specific bugs and bring in new features. Also, as a full-time Android dev who works with embedded installations of Android: sure, Android is Android/Linux, but it's not really productive to treat it as Linux beyond a few basic party tricks like chroot and such.
>Breaking API compatibility has always been a problem in the Windows world (first DOS, then Win16, then Win32, then .NET, now Metro).
What? If anything Windows is known for never being willing to break old versions of things. There's the story of Win 9x containing flags for SimCity to emulate Win 3.x behavior just to keep compatibility. And way more stories like that one: https://blogs.msdn.microsoft.com/oldnewthing/
Not to mention your list doesn't really make sense... what does Metro have to do with .NET, and what does .NET have to do with Win32, Win16, or DOS? I mean, for the record Windows 8 32-bit runs Win16 apps, has Metro, runs .NET applications and DOS applications, but that doesn't really mean anything?
>The Linux/Unix world is much more sustainable. You can likely still use applications which were written twenty years ago.
A UI application using MFC written against Win 2k will still work out of the box on a Windows 10 PC. What about an application written against Gnome 1.2 on Ubuntu 16.x?
"Breaking" in that sense isn't the same as in the sense my posts are talking about.
It's "breaking" in the sense of compatibility. If an app runs on Win 7 and Win 10, Win 10 didn't break compatibility between Win7 and Win 10 because an upgrade doesn't go through properly.
And while I have heard plenty of issues with upgrades, I've also personally had 0 issues with them since 7, on any of my personal machines, even some in the insider track.
Even in the thread you link to almost every other response is in line with what I mean: "it's usually very very hard to accidentally make a program run on Windows version N but not on Windows Version N+1"
So does Linux. Before I had a MacBook I tried Ubuntu on my PC twice (back in 2007 and again in 2011), and both times an upgrade made Ubuntu unbootable(!). These days it's nicer, but the 'Linux is more stable' myth is only true if you stick to servers/CLI. On desktop it's Mac > Windows > Linux, and on laptops it's Mac >> Windows >>>>>>>>>>> Linux.
In my experience, users were rather confused and lost with the UI overhaul in Windows Vista.
With Windows 8 users could not even figure out how to log in to the system.
Windows 9x/NT/2000/XP was more consistent.
>The Linux/Unix world is much more sustainable. You can likely still use applications which were written twenty years ago.
It's true that the kernel API on Linux is quite stable, but that doesn't get you very far with most applications, since they are usually linked against some old library, which depends on another old library etc.
I'd go further and say that most users don't understand the difference between OS, hardware and form factor/case, and see a computer as an all-in-one appliance no different from a smartphone or a console.
You can see this when people talk "Mac vs. PC", they'll talk about the features of the hardware and the OS interchangeably (Macs are better because they get less viruses), even though bootcamp has been an officially supported path on Mac hardware for a long time.
There is a marketing problem with mainstream users in that most people don't know what Linux is, but I'd guess that many people don't know what Windows is, probably don't know what Android is and don't know what iOS or OSX are, so I'm not sure if understanding what Linux or Ubuntu is actually matters for usage.
I completely agree in that sense that Linux would work for average desktop users if OEM's installed it. The only major barriers I can think of are lack of MS Office and potential support overhead. Companies would have to hire a lot more dedicated Linux desktop support staff.
The problem you run into when it does come preinstalled is users are not familiar with it and don't know how to use it.
Take this example[0] from years ago, when Dell had Ubuntu installed on some laptops. This is an example of an average user. For lack of a better phrase, dumb as dirt. If it doesn't work right away and they can't solve it in under 5 minutes, they get mad.
The article also brings up a relevant, often overlooked point: you need certain software that only runs on Windows or macOS. LibreOffice will export to Word format, but the default is set to odt, and expecting average users to know how to change the format is pushing it. I've gotten in the habit of just exporting all my documents as PDFs, because there's no question the receiver will be able to read them, but I imagine that isn't something that goes through most people's minds.
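For what it's worth, that conversion habit can also be scripted; a sketch using LibreOffice's standard headless mode (the `--convert-to` and `--outdir` flags are real, the file names are just examples):

```shell
# Convert a single document to PDF without opening the GUI:
libreoffice --headless --convert-to pdf report.odt

# Or to Word format, for recipients who insist on .docx:
libreoffice --headless --convert-to docx report.odt

# Batch-convert everything in a folder into a pdfs/ directory:
libreoffice --headless --convert-to pdf --outdir pdfs/ *.odt
```

That said, the point stands: an average user won't know this exists, which is exactly the problem.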
Very well said. When I see people in here say they installed Linux on their parent's computer, I have to hope Mom and Pop are old Unix hackers, or their kid has a lot of free time.
We don't realize just how much non-technical people struggle to do the most basic tasks on their computer. They struggle mightily to so much as connect to a new wireless network or download and run a graphical installer on Windows or Mac. Adding a printer, changing settings or connecting to a Bluetooth device are probably beyond their abilities.
My parents are both extremely intelligent, and early adopters of technology who've had PCs since the 80's; every time I see them they've filled their Macs with adware ("No, it's good; it does a cleanup! It's from Amazon!"). To send them an email is to risk it being lost in the flood of hundreds of newsletters they inadvertently subscribed to. I know fellow developers with Macs who run all their apps off of the still-mounted disk image instead of dragging them to the Applications folder.
You expect non-technical people to use a package manager? To find the non-free repository, or have the slightest clue what "package manager" or "non-free repository" even means? To hack on the command line? To edit configuration files? I often need to run to the docs and StackExchange for those kind of things, and I'm the guy my (intelligent, competent) team members come to about that stuff. You expect them to find and install drivers? To deal with stuff breaking when they update their distro? To update in the first place?
Like the woman in the story, they're likely to give up when they see MS Office isn't available. People in my graduate CS courses claim they can't open .docx files because they don't have MS Word. It's a more-competent-than-average person who even knows about and recognizes file extensions.
Think about how painful it is to get your coworkers to so much as try out a new language, library or development tool. How much hand-holding they need. How they whine and complain about the slightest difference, even when the new way is clearly superior. And this is talking about highly-intelligent people above the 99th percentile for knowledge and interest in computers.
So yeah, I don't think Linux is really in a state where it's usable (without handholding) by non-developers without lots of time, ability and interest to learn Unix. Even Windows and MacOS are something of a losing struggle. People do better with the more idiot-proof iOS and Android, but I'm still troubleshooting my folks' phones on a regular basis.
Anecdotally, completely wrong. I set my tech phobic father up on Ubuntu 6 or 7 years ago. He doesn't need to care what a package manager is, because he doesn't install anything, except for security updates when prompted. He uses the browser, abiword, and gnumeric, and is totally fine with that setup. A couple of years back he bought a new printer, plugged it in, and the system walked him through configuring it automatically.
The same thing can be said about Windows. It's not any more ready for out-of-the-box use than Linux. I.e. it needs learning, and the learning effort can take time, that's true. But the argument that Linux is harder to learn doesn't stand.
The situation with drivers, hardware compatibility, internet configuration, software discovery, installation and availability, OS updates, layman-friendly Google troubleshooting and so forth is far more idiot-proof on Windows than Linux. Most of those are ecosystem and not OS-related, but that doesn't help a person who's not totally clear on what an operating system even is.
What's easy for you or I is worlds beyond what's easy for the average Joe or Sally who wants internet, email, MS Office and maybe gaming.
If we are talking about people who buy pre-assembled computers, hardware compatibility questions are addressed by the manufacturer. I doubt they would be selling hardware that runs Linux, and at the same time has drivers problems. And if we are talking about people who are ready to assemble PCs from parts themselves, they are as well ready to do their research and figure out what works and how well.
At the same time, the driver situation on Windows can actually be worse than on Linux, especially when manufacturers drop support for their closed drivers. It often happens that with Windows the only solution is to upgrade your hardware, while Linux merrily continues working on it for years. I encountered this multiple times, especially with laptops.
Not sure what you mean about network configuration woes, Network Manager GUI for example is pretty straightforward and easy to use.
Well there's the rub, virtually nobody is selling out-of-the-box Linux PCs. So, if Linux is being installed by the user, you're rolling the dice on how nicely it plays with your wireless, touchpad, graphics, sound, webcam/microphone, HiDPI screen and so on. "Probably" isn't good enough when the competition is guaranteed to work out of the box. And again, it needs to be guaranteed totally automatic; the population we're discussing won't be able to make the correct selection from a dropdown or install drivers manually (or have any clue what that means).
The problem with the GUIs I've seen for Network Manager is the same problem affecting nearly all Linux software, which is they're geared towards expert users. They expose too many configuration options to the user instead of hiding them in menus or preference panels. Experts like you or I see past all the noise and recognize the relevant bits. But a layman doesn't know what anything means or what's relevant, so they're likely to become stuck and confused.
Worse, many of the visible but probably-irrelevant options will prevent them from connecting to the internet if they're set to the wrong thing. The biggest part of idiot-proof UI/UX is looking at a "connect to a wireless network" modal and counting the number of ways it's possible to exit the modal without vs. with successfully connecting. You want all roads to lead to the happy path, even with Grandma rand() at the wheel.
Most Mac and Windows software could use improvement here too, although to a lesser extent, and same with iOS and Android but again to a lesser extent than with the desktop OSes.
Think of a person who leases a Camry. They like it because it's familiar and easy to drive, it meets all their needs more than adequately, and it has a low cost of ownership. If something goes wrong they bring it to the dealership and it's either covered under warranty, or, if not, they just pay whatever the cost is out-of-pocket. When their 2 years are up they lease another Camry.
One of the car blog Jalopnik's writers, Tavarish (https://kinja.com/apidaonline), doesn't know why anyone would want a Camry when they could have a 10-year-old European sports or luxury car at the bottom of its depreciation curve. The idea they're maintenance nightmares is a myth; he does all the maintenance himself, and finds like-new parts in junkyards instead of buying from the OEM, so they hardly cost him anything to own.
Obviously, Tavarish is an expert mechanic. He can replace a clutch or a turbocharger in his sleep, whereas so much as a fuse or a headlight is beyond the capabilities of our Camry lessee. We are experts. If our printer stops working we can recognize the issue, find and compile the community-supported open-source Epson 6000 SUX driver, and be on our merry way. Other people, who think a driver is a golf club or Morgan Freeman, just buy a new printer.
> Well there's the rub, virtually nobody is selling out-of-the-box Linux PCs.
That's exactly the root of the problem that was discussed above, isn't it? So that's the real issue, not an inability of Linux itself to address the common use cases of non-technical users. Because Linux can address them today.
> So, if Linux is being installed by the user, you're rolling the dice on how nicely it plays with your wireless, touchpad, graphics, sound, webcam/microphone, HiDPI screen and so on.
Such users don't roll dice to decide what hardware to install on (if they are smart to install it on their own). They do research, get parts that work well with Linux, and then install it. Without doing research - well, anything can happen.
> The problem with the GUIs I've seen for Network Manager is the same problem affecting nearly all Linux software, which is they're geared towards expert users. They expose too many configuration options to the user instead of hiding them in menus or preference panels.
OK, for a test (KDE Plasma 5.8.4), I just enabled my WiFi (I'm currently on a wired connection), selected a network from the list, and clicked a single button [Connect]. It opened a field for the password right there. Type in the password - and voilà, it's connected. Nothing else is needed for the simplest scenario. Of course you can always go into connection settings and start changing custom DNS and whatnot if you know what you are doing. But none of that is exposed in the simplest case. So I doubt configuring the network will be an issue for Linux newcomers.
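The same simple path exists outside the GUI too; a sketch using NetworkManager's standard `nmcli` tool (the SSID and password below are placeholders):

```shell
# List visible wireless networks:
nmcli device wifi list

# Connect with just an SSID and password -- NetworkManager detects
# the security method (WPA2 etc.) from the network itself:
nmcli device wifi connect "HomeNetwork" password "hunter2"
```

The GUI front-ends are essentially wrappers over this same NetworkManager happy path, which is why the one-click connect works.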
> So that's the real issue, and not inability of Linux itself to address common use cases of non technical users. Because Linux can address them today.
I think that's a lot truer than it was 15 years ago (although there hasn't been much improvement since 10 years ago), but it's still less true than is the case for Windows or Mac, which themselves are tricky for the inexpert user.
There's a reason Dell markets their Linux laptop as the "developer edition" and not the "Linux edition" or "Ubuntu edition" or whatever. As discussed elsewhere here, OEMs who've sold Linux PCs get many returns, and find their customer support lines (very expensively) jammed with buyers completely clueless about how to accomplish any simple tasks with Linux.
And, even if people who've never used a computer before would find Linux as easy to learn as Windows or MacOS (IMO it's in the same ballpark but still behind), most people have used computers, and have already learned how to accomplish basic tasks on one of the Big 2 desktop OSes. With Linux they need to relearn all that from scratch (and for that specific dialect). People have only so much free time, and that's a big time commitment for them, with a big opportunity cost.
When MS had their Windows 8 debacle (making their users have to relearn how to accomplish basic tasks), IIRC, Macs increased in market share, but the Linux market share didn't move (or at least its relative improvement was worse than Mac's). And then Windows 10 gained back some of the market share 8 lost, at the expense of both Mac and Linux. My interpretation is that these data suggest Linux is harder for new mass-market users to learn than MacOS or traditional Windows.
> Such users don't roll dice to decide what hardware to install on (if they are smart to install it on their own). They do research, get parts that work well with Linux, and then install it. Without doing research - well, anything can happen.
> most people have used computers, and have already learned how to accomplish basic tasks on one of the Big 2 desktop OSes.
This comes down to MS and Apple trying to dominate the education system. IMHO schools could teach Linux from the start, and a major part of this issue would be gone already. Closed incumbents fear that, and provide "incentives" for schools, to make sure their lock-in remains unchallenged.
> There's a reason Dell markets their Linux laptop as the "developer edition" and not the "Linux edition" or "Ubuntu edition" or whatever.
There is too much inertia there. Since Dell had troubles in the past, they project them onto the present. They should just start anew and offer Linux on all their models.
> very unreasonable (for a layman) modal if it can't.
If something can't deduce security method, what do you expect it to ask? Particular details of the UI adjust to selected security method (at least in KDE manual configuration). I.e. I find it completely reasonable, and I doubt Windows or any other OS can fare better if automatic configuration fails.
> There is too much inertia there. Since Dell had troubles in the past, they translate it into present. They should just start anew and offer Linux on all their models.
If I were a PC OEM I'd be in wait-and-see mode for Google's new OS, which may or may not be Android-for-the-desktop.
> If something can't deduce security method, what do you expect it to ask? Particular details of the UI adjust to selected security method (at least in KDE manual configuration). I.e. I find it completely reasonable, and I doubt Windows or any other OS can fare better if automatic configuration fails.
It should guess the possible security method(s) based on the entered password, and if that fails display the more complex modal, but with the lesser-used fields behind an "advanced" button.
Personally I'd be pretty upset about Android for the desktop. Ironically, while Android uses the Linux kernel, it also uses a userland completely incompatible with the rest of the Linux world, starting with its own libc. This creates a major mess and a lack of drivers on mobile for normal Linux distros that use glibc, Wayland, etc., and the need for hacks such as libhybris to work around it. I really don't wish the same sick situation to spill over to desktop Linux.
> It should guess the possible security method(s) based on the entered password
I'm not sure it's a good idea security-wise. I suspect it might inadvertently expose your password.
> If I were a PC OEM I'd be in wait-and-see mode for Google's new OS, which may or may not be Android-for-the-desktop.
I'd be investing in Linux in case Google and MS go the Nexus/Surface/Mac route and cut third parties out of the equation. Kind of like what Samsung is doing with Tizen.
As much as I'm not a fan of some of Dell's dodgy multinational practices, I have respect for them pushing mainstream adoption of pre-installed Ubuntu.
I tried to persuade a UK company Novatech that offers computers without an OS to go one step further and offer Linux. They liked the idea, but the support overhead wasn't worth it for them.
The only company I recall having trouble with returns was MSI, but then they managed to ship their Wind series with missing drivers for things like the webcam...
>The only way to seriously increase Linux desktop adoption is for availability of Linux pre-installed computers to significantly rise
I don't know if there's a term for this, but ironically, it's something of a self-defeating goal. Most people who use Linux today would wipe the hard drive as the first thing if they bought a laptop that came pre-installed with an OS. So if everyone became an enlightened Linux user, everyone would start wiping their drives right after buying the machine, invalidating the need for preloading in the first place.
No it's not. Let's say I buy a Dell with Ubuntu on it, that alone is a good indication that many other distros would run just fine. If I buy a laptop with Windows there are absolutely no guarantees for being able to run Linux without issues.
Some users. Most users aren't going to install any OS on their own, no matter what OS. They are the majority. And to gain adoption among them, Linux should be preinstalled.
Admittedly, I don't own a lot of AAA titles, but only one of the ones I own doesn't support Linux (Skyrim). Even most of the indie games I own support Linux. Honestly, it feels like games that don't support Linux have been the minority in the past few years.
There aren't many stats on actual Linux usage for cross platform games. The only public data is from Humble Bundles, and it was for indicated usage (i.e. you could select what platform you intend to use the game on while buying).
Even if it's 10% - it's a lot of users.
There was an article arguing that availability should grow gradually. I.e. if too many games come out for Linux at once, the market will be flooded with them, which will in turn hurt both developers and users.
The Steam survey doesn't show percentage of usage per OS for cross-platform games. At least I've never seen such data published. It's tossed around often, but it's not really useful.
That's true. However, I'd wager that the data would still correlate with the hwsurvey results. Would be interesting to see the actual numbers, especially for CS:GO and Dota 2.
I don't think it would. I saw some reports from actual developers that differ quite a lot from that survey. So I wouldn't use it as any sort of definitive metric.
Of my Steam library of 324 games - yes, mostly indie games, not AAA, which generally don't interest me - 230 are available on Linux. Now I have to admit I would not buy a Humble Bundle if there were not at least two Linux-compatible games in it, but then why would I, as I have 3 home machines, none of which ran Windows. That changed when I bought a Vive; the same machine has not been switched on since October last year... Oh well, another 2k invested in the "future of humanity".
I was commenting on the availability of Linux games. There are very many and they generally don't suck. Avoiding Linux because of a lack of games is not a strong argument, if it ever was.
Not sure how indicative I am, but I run a windows VM just for gaming and have set the same up for friends.
If more games were available on linux I wouldn't have to run the VM and be a lot happier.
I still refuse to buy or play any blizzard games due to their policy of ignoring linux users.
The Steam survey is not useful for estimating Linux usage of cross-platform games. It's not even clear how accurately it represents all Steam users. But given that the majority of Steam games are Windows-only, such a survey is pointless when you need to know the percentage of Linux usage for a given title.
Their number is gradually rising. Higher availability of games requires breaking the catch-22; it's not an instant process (i.e. a bigger market means more games). Linux gaming has grown tremendously in the last 5-6 years already, and that in turn increased Linux desktop usage. So the catch-22 is eroding and things are progressing in the right direction.
The recent bigger focus on improving the graphics situation (Vulkan, etc.) is helping as well.
Using Ubuntu 16* again after maybe a decade of not using Linux has not left me impressed with the desktop progress.
By its nature, the open-source community doesn't have the man-hours or leadership structure in place to make a coordinated desktop system.
I would argue that it is at least the rare combination of technical know-how, leadership structure, man-hours and perseverance that makes a system successful. An example would be Linus Torvalds with Linux but there are many others.
Further to this, I believe a commercial structure in conjunction with an open-source component/ecosystem has, by its nature, a much greater chance of success.
So in this respect I think RemixOS/Android on Linux is a better path to a successful Linux desktop as Android today allows 95%+ of users to do everything they need while also allowing the rest of us the use of all the great facets of the Linux/GNU components.
Linus uses KDE, IIRC ;-) I don't really know anybody who liked Unity (what's shipped by default on Ubuntu) out of the box. I know people who have learned to like it, so I suppose it's an acquired taste.
I actually like Gnome 3 because it is highly configurable for a programmer (plugins are really, really easy to write). But, in the end, I don't like all the Gnome infrastructure -- especially they really broke internationalised input for a long time and pretty much forced me to change.
My wife uses KDE and absolutely loves it. I had previously introduced her to Gnome 3, which she didn't like and Cinnamon, which she was pretty neutral about. She was originally a non-computer user -- she had only ever used a cell phone for her whole life if you can believe it.
I eventually migrated to XMonad and I don't think I will ever change to a more integrated desktop environment.
Free software is about freedom. It's freedom to use your computer how you see fit. It's freedom to write whatever software you want, or to tweak it, or to fix bugs. It's freedom to help other people by writing software, or tweaking it or fixing bugs.
The idea of having some coordinated effort that makes decisions for everybody and provides a lowest common denominator is appealing for a lot of reasons. Probably that's what it takes to compete with the vendors who see the computer as an integrated consumer good -- a black box that is highly targeted to a particular market.
As a user of free software who enjoys his freedom, penetrating the mass market at the cost of removing choice is not something I look forward to. I think some people have this idea that Linux needs mass-market appeal to be successful. I am not one of those. I'm happy if more people want to enjoy software freedom, but I couldn't care less about Linux market penetration on the desktop.
But anyway, if you are interested, I would give KDE a shot. Personally, I find it over complicated, but it has some really compelling features if you're into that kind of thing.
I think it is true, and there have always been desktops that are a small taste variation away from Windows, desktops that are a small taste variation away from Mac OS X, and desktops that are a small taste variation away from CDE/Motif.
I have always felt that this was the single largest impediment to Linux as a desktop: not that there were several different tastes, but that the APIs used to provide the UX were different across all of them as well. GTK2, GTK3, Qt4, Qt5, Motif, etc. Some FOSS apps try to 'adapt' depending on which desktop you are using; many just punt and look like whatever their 'birth' desktop looked like, and many suck in that other desktop as a function of being installed. My Ubuntu 16 desktop has XFCE on it (Xubuntu), but other apps have sucked in both versions of GTK, and all of Qt and the KDE libs.
While people will complain if you force a single desktop metaphor down their virtual throats, at least that way the apps all have a good shot at working, in a general way, the same way. And that (in my opinion) is what separates more general adoption of the Linux desktop from the specialized adoption we have now.
> I don't really know anybody who liked Unity (what's shipped by default on Ubuntu) out of the box. I know people who have learned to like it, so I suppose it's an acquired taste.
Everything's acquired taste. FTR, I quite like Unity.
> By its nature, the open-source community doesn't have the man-hours or leadership structure in place to make a coordinated desktop system.
That's wrong. KDE and Gnome are coordinated desktop systems. Why do people cry out for a single desktop? GNU/Linux (Unix too) and the desktop have always been separate, and I consider this a very good design choice. Right now the Linux community is slowly moving towards Wayland. Thanks to the design of Linux it's no problem to add a new series of desktops on Wayland until the best ones become obvious, likely two or three again (as with KDE and Gnome).
Ubuntu - Canonical in general - is not a good representative of "Open Source".
They don't contribute back to Debian, don't cooperate with the Gnome project (Unity?), and there are other options out there.
They are good at marketing, though. Ubuntu feels like a wonky OSX clone..
What really annoys me is that they have made people in general think of Linux as "Ubuntu".
There are other options. :)
I am a happy Debian KDE/lxQt user not using *untu. :)
> By its nature, the open-source community doesn't have the man-hours or leadership structure in place to make a coordinated desktop system.
More precisely, man-hours that can be directed by leadership at a specific goal. FOSS is great at creating things that programmers want, like text editors, terminals, and web frameworks, because programmers will do those things for themselves. It's pretty bad at creating things that non-programmers want, like WYSIWYG text editors, spreadsheets, desktops, and web browsers, because programmers will only do those things for money, and there isn't enough money in FOSS.
> Further to this, I believe a commercial structure in conjunction with an open-source component/ecosystem has, by its nature, a much greater chance of success.
For all its other faults, Apple got this right with Mac OS X: Apple pays people to write a coherent desktop system, but it's still Unix under the hood for people who use and develop FOSS. Android and Google Play have sort of done this for phones. I think there is a market for someone to make a paid, non-FOSS desktop based on Linux, since there are plenty of coders who don't want to hack on hairy desktop stuff.
"WYSIWYG text editors, spreadsheets, desktops, and web browsers" are all more than serviceable in Linux. I seldom find Firefox less useful on Linux than any other platform. For the rest the problem is that they are moving targets whose spec is dictated by others.
You want Excel to act like Excel and you also want Libre Office Calc to act like Excel, but Libre Office can never really be Excel, so it's Excel 97.
We don't want a coordinated anything. The Linux community practices "fork it for fun", at the drop of a Red Hat. That's one of its core values and it is antithetical to the desires of the vast majority of users.
Canonical's Unity desktop is OK. I mean, I don't really care about it. It's not interesting. Its idea of a sandboxed, polished, branded experience gets in the way of interesting. I don't use it. I don't even install it.
That's my definition of Windows and OS X too.
Anyone with "leadership skills" and a business know-how realises that the Linux community is like a herd of cats, and there will be no marshalling us into contributing towards a cohesive and coordinated anything.
If we wanted to be corporate citizens, we would be happy with Windows and OS X. But we had that and threw it away. A proper corporate effort will be distrusted, cracked, forked, and rewritten if need be. Look at Android: we fight for our right to root it.
Gentlemen such as yourself may be happy with a corporate Linux presentation, but it ranges from uninteresting to offensive to many long term Linux users.
Who is this 'we' you so authoritatively speak for? I also used Linux for the same decade you did, and I do not recognize myself in any of what you are saying.
We wanted technical superiority and good software. Torvalds himself is quite clear on it: open source is simply a way to build great software... That's what we signed up for.
This idea that anything coordinated is antithetical to the Linux community is nonsense. Open source is almost by definition a way to draw more contributions in, to allow more people to actively engage, collaborate, and improve a given piece of software. The fact that you can fork a codebase is something that we tolerate...
Unfortunately, what most people experience is the important thing. I constantly hear people praising how much better a $1000 iPhone is than a $150 Android phone whose manufacturer has barely finished the firmware.
> Unfortunately, what most people experience is the important thing. I constantly hear people praising how much better a $1000 iPhone is than a $150 Android phone whose manufacturer has barely finished the firmware.
Now that you mention it, I wonder if this has the same root cause too, manufacturers/distros trying to differentiate themselves. In both cases they spend a lot of extra money to make "their OS" worse.
I am a minority when I say that I don't want the Linux desktop to flourish?
I've been using Linux since 1996. Though I also use Windows and OSX, I haven't been without a Linux machine for 20 years, and I've seen a lot.
It seems the more Linux distributions try to target the "average user" they just dumb it down and/or make it more proprietary. I can barely stand using Ubuntu these days for that reason.
I know, it's configurable, I can change anything I want but I don't really want to see that trend continue. I would like to see innovation focused on making Linux better for work, not focusing on the average Joe who doesn't want/need it.
I'm with you. I think the best user experience for a home or work desktop is Windows; the best user experience for technical work is Linux. I want to use Windows and have a clean way to do my work in Linux from there. The new Linux subsystem in Windows seems like it might actually make that possible.
Until then I use OSX, which gets me an OK user experience and a pretty good work experience. It's a decent compromise.
In what way does Windows offer a superior user experience to macOS? Certainly more games are available for Windows, and that's a key feature for many. Are there any other advantages?
I've used both and I appreciate a few things macOS gets right - installing and uninstalling applications is copy and delete, which is simple. Most applications have a consistent look and feel.
> installing and uninstalling applications is copy and delete, which is simple
It's still black magic to most people.
As for user experience, I've never liked OSX, it never clicked with me. I find the menu bar at the top especially confusing, I run apps side by side a lot.
> installing and uninstalling applications is copy and delete
Half the time. The other half you get an installer.
The Mac OS just got the ability to resize windows by every edge or corner within the past decade. However, you still cannot disable an external monitor without physically turning it off or installing some third-party software to manage it for you. My point is that Windows has tons of similar features that the Mac OS simply does not have.
Beyond the features themselves, Windows has a superior UI. The global menu bar in the Mac OS is terrible. The dock is terrible. Window management is terrible. What's left? Oh, the finder. That's terrible too. These are things that everybody complains about over and over again so I offer you no reasoning here for my opinion. The only thing that Macs have going for them is that they are UNIX and you can build iOS apps with them. That's it. And that's why Macs have a 5% marketshare despite killer marketing.
I installed Unity once and found it so unintuitive. I would've given it a chance, I think, but the Amazon adware thing was the final push for me as well.
Plenty of good Linux desktop environments to check out. Cinnamon, XFCE, Mate, or a tiling window manager solution are all really nice options and I have a hard time believing that detractors of the Linux desktop have tried all of those.
My daughter started to use an Xfce desktop when she was five. She knows her way around by now.
But the desktop, as an approach, is waning. The web browser is the new desktop, the new command line, and partly a new window manager. (Also, the new VM / OS to target.)
I do agree. I use Linux every day via ssh as headless servers and find it indispensable in that role. But I really have no interest in using a Linux distro in an end-user graphical desktop environment.
They don't always hit the mark, but at least OS X and Windows feel like they were at least looked over by someone with an interest in UI/UX. The various desktop environments for Linux always feel like they were designed by an engineer, and not in a good way.
Excellent article. Last year both Mac sales and Windows sales declined YoY. Only one "desktop" grew: ChromeOS, which is Linux-based, grew over 35% YoY. The numbers are still small, but it is now #2 in the US and growing quickly.
"Chromebooks surpass Macs in U.S. sales for the first time"
I disagree. The article is stupid. There's never been a war. A war is destructive. Microsoft is doing well for itself.
Win10 is arguably the best Windows ever. The world is going cross-platform. Ubuntu is a great distro, Linux server systems are awesome. Android is awesome.
Everyone did their best and many people and organisations are doing well for themselves. If that was a "war", everyone seems to have won it.
That is complete speculation. There are tons of reasons why Google wouldn't switch Android "over", including established development environments, drivers, and licensing.
It's fun to speculate on the purpose of this but it's just that: speculation.
Don't get me wrong: there are massive improvements to the existing linux ecosystem just ripe to be made, but many of them are in the environment and not just the kernel.
And by the same token there are a lot of reasons they would.
Drivers can be forklifted over by implementing KPI compatibility, and that can be done in a shortest-path fashion or in a way that realizes new protections from the new OS. Development environments can be preserved indefinitely with image activation and personalities. The first never mattered to Apple, and the second was effectively realized on OS X.
There is no reason to believe that Linux, and possibly other OSs, won't all grow during this time. As long as no one is monopolistic and anti-competitive, there is a ton of room for competition in this space.
The network effect, the winner takes all effect, is becoming less pronounced as there are more ways to build cross platform software.
>There is no reason to believe that Linux, and possibly other OSs, won't all grow during this time. As long as no one is monopolistic and anti-competitive, there is a ton of room for competition in this space.
Well, grow enough to counter-balance the sudden loss of billions of Android devices? I don't think there's reason to believe that. It had stalled for ages on the desktop, and mostly got to devices outside the embedded space through Android.
(Of course Android-Linux devices will continue to exist -- but eventually those with the new Google kernel/OS underneath will overtake them, especially if Google insists on OEMs using it to get access to the various Google apps).
>The network effect, the winner takes all effect, is becoming less pronounced as there are more ways to build cross platform software.
It's not about having some cross platform software though, it's about the whole ecosystem. Which is e.g. one of the reasons why Firefox OS, et al never got anywhere.
The ecosystem is important. What do you think about systems that make it easy to ship software for multiple platforms?
Consider Unity, the game making tools. When shipping there is pretty much a series of checkboxes for what kinds of packages you want to make. Unless you went out of your way to use some OS specific stuff it will work everywhere. There are other tools like this for other non-gaming domains, do they contribute in your view of OS ecosystems?
There is reason to believe that. Linux hasn't fundamentally changed in .. well has it ever? The driver model still sucks (I guess this is probably the main reason Google is moving away from it), the security is still embarrassing, the graphics story is very much stuck in the 90s (does display hot-plugging work yet? what about secure password entry?). It started as a Unix clone and it's still a Unix clone. What makes you think that will change?
I think the time is ripe for a new OS. Especially given that Google's OSes don't expose much of the OS to apps - it should make it quite easy to port their userspaces over.
It's funny to see a downvote brigade on anything calling into question Linux's supremacy. The overwhelmingly devops HN crowd is probably the worst group to poll on systems software advancement.
Why do downvote brigades even EXIST on this website? I feel like this upvote downvote garbage is an echo chamber in the making. Just sort comments by replies or randomly, or not at all. Or just have a "garbage post" flag and call it a day. Anything is better than this pseudo-democracy thing that Reddit has shown to be useless for real discussion time and again.
Agreed (and upvoted). I think HN has a generally high quality of discussions (except on certain political or fanboyism-related topics where only one side of the debate is tolerated). But, the best discussions on sites I frequent are on blogs that simply sort comments by time and thread, and where the host encourages open and intelligent discussion. Scott Alexander's (https://slatestarcodex.com) is a good example.
The flipside is that "report" buttons can become surrogate downvote buttons, and the lopsided volume of reports leads to inconsistent enforcement of rules, so commenters unpopular with the crowd or mods are more and more likely to be banned, while discussion devolves to groupthink, cliquishness and sycophancy towards the locally-powerful. IME BBSes are the most prone to fall victim to this; the communities are small enough that you can't be "anonymous," and with threads that can continue and be bumped to the top indefinitely, there's no mechanism besides moderator intervention to stop a flame war. Blogs and HN/Reddit-style aggregators do better because old threads are quickly buried.
The success rate for new ground-up modern OS projects achieving broad adoption isn't great - Microsoft's Midori, GNU Hurd, Apple's "Pink", etc.
AFAIK there basically hasn't been a broadly successful one since Windows NT in 1993. Which isn't to say people should stop trying and it'll be fantastic if they do succeed with this, I just wouldn't count on it.
It may not be so bad, because most Android apps are built on Java, which isolates a lot of what the OS does. For other, native apps, if Google builds a POSIX-compliant OS, they should be fairly easy to port/run.
That's because those weren't completed in time due to feature creep and second-system effects (Apple's Pink/Copland), lack of interest and slow progress (Hurd), etc. Midori is just a research project, so no one pushed for it to be adopted anyway.
With a completed OS (which nowadays isn't so difficult; large parts of the stack are either ready or there for the taking) and a big vendor behind it, it's another story. Microsoft pulled it off with NT, and Apple's OS X was a whole new OS as far as Mac customers were concerned (the fact that it already existed as NeXT didn't matter much -- it still required all new software (it didn't run old NeXT stuff as is), new drivers, had a new SDK that Mac developers had to cater to, etc.).
Besides Android is not like a full desktop OS Linux -- it abstracts most things for apps through its SDK/"Java", and Linux is just the kernel.
I recall reading a book about the development of NT. One of the key figures joined in because it was seen as the last great chance to create an OS from scratch. Seemed very short sighted to me at the time.
And then Fuchsia will replace Desktops. If the vast majority of devices runs Fuchsia and it comes embedded on Laptops everywhere it will be the end of Windows on the desktop.
There's really nothing special about Linux, but timing was right and political blunders beset all other contenders. In the end the price tag of a free UNIX was too good for cheap web hosts and eventually hyperscalers and Linux had insatiable hype and sidestepped most blunders. Less importantly for numbers but critically important for actual engineering talent (hyperscalers are pretty low on anything other than ops talent), high end IT applications like RDBMS and three letter acronym software from IBM, Oracle and HPC also needed to land somewhere as the commercial UNIX market shit the bed. That brought in things like RCU and high core count scalability relatively early on that turned out to be well suited for mainstream servers a short few years later.
One interesting thought experiment is that the cost model is the primary thing holding the WinNT kernel back, as, again, hyperscalers tend to have very low expertise in and contribution to the systems software space. A free WinNT heavily focused on the Linux personality (I want to say POSIX, but that would be unnecessary friction at this point) that did containers and orchestration well would actually be interesting and a pretty easy path for MS to navigate. The space is terrible enough that MS could deliver a superior solution due to vertical integration. Think Joyent's SDC but from MS. In effect, this was what happened to Linux on the desktop -- Apple executed in a way that showed vertical integration can always out-engineer the bazaar model on coherence, and OS X took over desktop UNIX.
Linux in the phone and embedded space I actually expect to erode as IoT disasters come home to roost. Things like Google's Magenta kernel, capability kernels, and seL4 will probably displace it. As in the server space, the Linux personality may live on in image activation here due to the entrenched developer mindset, but it's really never been a good platform to build these things on.
> Meanwhile Linux advocates have to be happy with the hollow victory of The Year Of Linux (Android) On The Smartphone, because how do you hack on a smartphone? Type in C code on the onscreen keyboard?
If that's victory, I don't even want to imagine what defeat looks like...
"Meanwhile Linux advocates have to be happy with the hollow victory of The Year Of Linux (Android) On The Smartphone, because how do you hack on a smartphone? Type in C code on the onscreen keyboard?"
So what Linux did not have was a proper pitch to the OEMs, so that you could buy a pre-configured, pre-installed Linux computer from Walmart that would be compatible with existing standards.
This was pretty much impossible because of many things, but mainly because of the proprietary standards in both hardware and software.
Even if you'd managed it technically, you'd then have to face the legal trouble.
I'd love to know the number of people who bought a computer with Linux installed on it and decided to put Windows on it. I think this must be some alternate dictionary definition of zero.
Linux is so impressive that people will spend days and months learning and figuring out how to switch. It's a really painful process (because of all the problems you mention with proprietary hardware).
I tried to persuade Novatech in the UK [0] who will sell machines without operating systems to actually sell machines with Linux on them. It just wasn't worth the support overhead.
As recently as two weeks ago their support were replying on their forum about their reasons for not being Linux compatible [0].
> To add to this, to be able to say a system is "Ubuntu Certified" for example, requires sending one of every unit to Canonical for them to run tests and ensure everything works 100%. Every time we have looked into it the admin & costs involved is onerously high unfortunately :(
Best I recall, there was one company that claimed this, and they made one with a half-assed SuSE install that was lacking drivers for various parts (webcam included).
Sadly they were also the company many retail chains turned to for rebranded netbooks.
It does seem that Apple are keen on migrating their users to Linux of late.
I'm actively considering going back to Linux myself, and have already jettisoned a bunch of apple products.
Is it conceivable that the conventional desktop market will shrink to a core of Linux hackers developing software that runs on the tablets and phones of everyone else? With IDEs provided by MS/Google/whoever but not OSs? Finally, the year of the Linux desktop arrives, when the desktop comes to an end.
> It does seem that Apple are keen on migrating their users to Linux of late.
Indeed, especially MacOS gamers. Wine on Apple won't be able to play DX11 games (because Apple abandoned OpenGL there), and it's some incentive for them to switch to Linux.
This history misses netbooks. 2007 was the year of Linux on a desktop: when Microsoft actually had to start giving Windows to netbook manufacturers at $5 down to $0 because they were selling computers with a fully-functioning Linux desktop on them.
Because they were pretty much a non-story? Less than half the impact of something like the iPad, and lots of people who could switched them to Windows anyway. In their best years (around 2008-2011) they sold like 16-20 million units per year (the iPad has sold ~350 million units thus far).
Because Intel freaked out and pushed everyone over to Atoms, whereas the early Eee PCs etc. had used discount Celerons.
Basically Intel feared that the netbooks plus citrix would strangle the lucrative ultraportable market segment.
At the same time MS gave Windows XP another stay of execution just to have a Windows option to offer OEMs (Vista was just too darn bulky).
Thing is that XP, or more correctly the version of Windows Update it used, had a flaw where it would grow slower and slower over time as it tried to enumerate all the installed patches on each boot.
10 years ago, maybe not, but today things like Unity, Gnome, Mate etc. give you a modern, stable and reasonably seamless desktop experience. Even games work now, if you look at the recent benchmarks done by Phoronix.
But the whole idea of installing an OS is something very few people do or plan to ever do. That itself limits Linux desktops to enthusiasts or managed deployments. People are also used to applications like Office and will always be reluctant to try something else because it's not worth their time. There are applications like Adobe Creative Suite, and games, that tie users to Windows or OSX and force the OS choice.
The only way Linux becomes widespread is if an entity pushes it aggressively like Google does for Android. Without that for the general user what possible reason could there be to turn away from their preinstalled perfectly working windows desktops to reach for Linux?
The privacy concerns. Also, older people like me don't enjoy things constantly changing, so having more control over updates is appealing.
I despise Windows 10, enough that it prevented me from buying a laptop, until I eventually found a vendor willing to install Windows 8.1, which I merely dislike.
I wanted Windows 7 or was willing to try Mint. However they are not available, "no drivers" and so Linux is dead to me.
I am a linux die-hard, and for my use case (C++ dev targeting other UNIX platforms), I think it's simply the best OS around by far. For anybody even remotely interested in computer science and software engineering, linux is a great place to be.
But I am having a really hard time understanding the reasoning of people who blame the failure of linux on the desktop on anything other than the fact that the linux desktop experience sucks... it always has, and it will probably always be lagging. The recent uptake of the linux desktop has much more to do with browsers overtaking everything than with anything else...
I don't really agree with the claims in the article. I myself and a dozen of my friends and colleagues tried to adopt Linux as a desktop OS all the way back when the Tango/Gnome aesthetics emerged as a counter to the glossy 3D aesthetics of XP and OSX.
The thing is... It just didn't work. There wasn't any available software for it. The open-source alternatives such as OpenOffice, Inkscape and Gimp were lacking to put it mildly. Games were out of the question. But that isn't the worst part.
The desktop itself, and by extension the OS as a desktop OS didn't work.
Getting hardware video acceleration to work was almost impossible. Time to edit xorg.conf again and run the glsl demo. Maybe it won't crash the xserver this time.
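For anyone who never lived through that era: "editing xorg.conf" meant hand-maintaining config sections like the following (a minimal sketch only; the actual driver names and options depended entirely on your hardware, and one wrong line could leave X unable to start):

```
Section "Device"
    Identifier "Card0"
    Driver     "nvidia"        # or "radeon", "intel", or "vesa" as a last resort
EndSection

Section "Screen"
    Identifier   "Screen0"
    Device       "Card0"
    DefaultDepth 24
EndSection
```

You'd then check `glxinfo | grep "direct rendering"` and run `glxgears` to see whether acceleration actually worked, or whether the X server would crash again.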
You want your microphone to work? Time to download and compile some obscure ALSA forks.
UTF-8? Here, have some EN-US whether you like it or not. God help you if you have a USB device which isn't a mouse or a keyboard.
And then there were the crashes. I still remember typing `startx` regularly as the GUI would crash all the time.
I really want linux to succeed as a desktop OS, but I don't see that happening ever.
I think it's a matter of expectations and needs. I've been using Linux as my primary desktop OS off and on since 2005ish (first Gentoo, now Arch); it's been fine for me, but I suspect my experience is largely different for a variety of reasons--mostly due to needs.
However, therein lies the rub: If you use Linux as a desktop environment with the expectation that everything is going to magically Just Work™, there's a greater chance you're going to be unpleasantly surprised. Then again, you might not (again, depending on needs), but if you're using predominantly Windows software and expect to be able to carry that over, it might not happen. I play some Windows games under Wine, and they generally work well, but there's a few that don't (like Guild Wars 2, but it's probably because the game is CPU bound). It happens. Use whatever works for you.
As to your specific complaints, it seems to me that most of them are largely outdated by now. OpenOffice/LibreOffice are good enough for the most part (certainly not as polished as MS Office), and the same applies to Inkscape and Gimp (both of which I use under Windows, so there's that). My GeForce GTX 1060 works fine with the NVIDIA kernel drivers; I've not had to edit my xorg.conf file in years (with dual monitors!); audio works fine, yes even using PulseAudio; and UTF-8 is a non-issue (installing the correct fonts does wonders, but make sure to change them in your browser profile!). My webcam works great as well, but I selected it specifically for its Linux compatibility. Same for an ancient HP LaserJet printer.
So, I think things have certainly improved--they have since I started using Gentoo. That's not to say you should expect smooth sailing, of course. For my part, I like tinkering with my OS and generally don't mind it when I break something. If that's out of the question and you need something that's going to work out of the box, Linux probably isn't something you should be using in the first place, especially if you have esoteric needs. I won't wax philosophical about "oh, you just need to do X" when certain software is going to work better on Windows, for instance, but having used Linux for my desktop OS as long as I have, I can't complain. I actually grumble more when I have to boot to Windows. ;)
However, I have a friend who has a particular penchant for breaking literally everything he touches (coincidentally, he works in QC now). Every time he speculates out loud about trying some random Linux distro, I politely change the subject!
And your point is? Having an average user without any esoteric needs jump through these hoops isn't "probably" unfeasible, it's definitely unfeasible. Complaints such as the one about Office aren't outdated and probably never will be: all the diehard Excel users I know rejected LibreOffice Calc due to incompatibilities and missing features.
I know enough C to mess with the kernel, but even I must choose my battles. Sometime around kernel 4.0, ACPI support for my otherwise perfectly functional and performant 8-year-old mainboard broke, freezing the machine on boot. I had to disable ACPI, which means shutting the machine down manually with the power switch. It reminds me every time not to lie to myself that "I can't complain" :) Fortunately it's a desktop, not a laptop.
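For anyone hitting the same freeze: disabling ACPI is done with a kernel boot parameter. A minimal sketch, assuming a Debian/Ubuntu-style GRUB layout; it edits a local sample copy rather than the real `/etc/default/grub`, so it's safe to try anywhere:

```shell
# Sketch: what adding acpi=off to the kernel command line looks like.
# We work on a sample copy, never the real config file.
printf 'GRUB_CMDLINE_LINUX_DEFAULT="quiet splash"\n' > grub.sample

# Append acpi=off inside the existing quoted command line.
sed -i 's/"\(.*\)"/"\1 acpi=off"/' grub.sample

cat grub.sample
# -> GRUB_CMDLINE_LINUX_DEFAULT="quiet splash acpi=off"

# On a real system: edit /etc/default/grub the same way, then run
# `sudo update-grub` and reboot.
```

The milder `acpi=noirq` or `pci=noacpi` variants are sometimes enough and cost less functionality than turning ACPI off entirely.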
>And your point is? Having an average user without any esoteric needs jump through these hoops isn't "probably" unfeasible, it's definitely unfeasible.
Did you read my post or the OP?
1) The OP made claims about specific features not working in Linux. Generally, those complaints aren't true anymore and haven't been for a long time (audio, UTF-8, among other things). I can't remember the last time I had xorg crash or the last time I had to use startx and not my login manager.
2) I acknowledge on a number of points that it's most certainly not something the average user should use. I honestly have no idea how else to express this in a way that is suitable enough to satisfy everyone. I think you're misunderstanding the tone of my original post?
>my otherwise perfectly functional and performant 8-year-old mainboard broke, freezing on boot. I had to disable ACPI, which means shutting it down manually with the power switch. It reminds me every time not to lie to myself that "I can't complain" :)
If it's a hardware problem due to a fault, I'm not sure what you expect...
I guess you have your anecdotal evidence that "generally, those complaints aren't true anymore"; the parent poster and I have our own, and it's different. I can agree audio is now OK, but everything else is as sketchy as it ever was.
It was _the kernel's ACPI support that broke,_ not the hardware. With old kernels it all works. That, at least, I'd expect not to break.
> I can agree audio is now OK, but everything else is as sketchy as it ever was.
I politely disagree. It has its warts, but most of the OP's complaints were true--10-12 years ago. Now, not so much, and it's not anecdotal either. Let's go over it point by point:
- Hardware video acceleration? Works on most hardware ("most" in this instance refers primarily to vendor). NVIDIA cards are almost always guaranteed to work well, provided there's driver support. This is usually the case except for brand new cards.
- xorg.conf edits required? Nope. Maybe for exceedingly special cases. It's rare now unless you're using a card that's not well supported (see above).
- xserver crashes? Almost unheard of outside hardware problems or driver-related issues. Usually a PEBKAC-instigated fault.
- UTF-8? No point talking about this...
- Random USB devices? Depends on the device. Same for Windows, really.
- Manual invocation of `startx` required? Uncommon. Most distributions ship with login managers that work quite well. If you're on a rolling release, you may occasionally have to do this when an update breaks something (but then, you should expect that with a rolling release anyway; if not, you shouldn't be using one).
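To illustrate the login-manager point, on a systemd-based distro you can check whether a display manager is wired up at all. This is a sketch under stated assumptions: the `display-manager.service` alias symlink is the systemd convention, and unit names (`lightdm`, `gdm`, `sddm`) vary by distro.

```shell
#!/bin/sh
# Sketch (assumes systemd): report which login manager, if any, will
# start the graphical session at boot.
dm_link=/etc/systemd/system/display-manager.service
if [ -L "$dm_link" ]; then
  dm_status="login manager: $(basename "$(readlink "$dm_link")")"
else
  dm_status="no login manager enabled; you would be back to startx"
  # To enable one, e.g.: sudo systemctl enable --now lightdm
fi
echo "$dm_status"
```

If the symlink is missing after an update, re-enabling the unit is usually all it takes; manually running `startx` is the fallback, not the norm.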
> It was _the software ACPI support that broke,_ not hardware
I apologize. I misread your original comment due to misleading verb ordering, and the fact that I'm a horrible, horrible person who has an awful habit of skimming comments.
Generally this shouldn't be the case and is certainly the fault of the kernel. However, 4.0 was released on April 12, 2015, and there have been a number of improvements since, including to ACPI. I would encourage you to try a newer kernel, preferably something around the 4.7-4.8 vintage (or lack thereof).
It's also plausible your issue was caused by module renaming/removal. My memory is hazy on this part, but I recall support for some hardware being rolled up into a single module, requiring some manual intervention. Unfortunately, the actual working solution can be hit or miss with some hardware (what motherboard are you using?). I've had ACPI work fine on ancient boards from circa 2000 or earlier (tested recently for my own amusement) but fail completely on recent laptops (2014-2015) out of the box. I'd suggest starting here [1], although this guide may or may not be useful as it's for Arch. It'll at least give you a good start.
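Before swapping kernels, it's worth confirming how the running kernel was actually booted, since a leftover `acpi=off` will mask any fix. A minimal sketch (assumes a Linux `/proc`):

```shell
#!/bin/sh
# Sketch: check the parameters the running kernel actually received.
cmdline=$(cat /proc/cmdline 2>/dev/null)
case "$cmdline" in
  *acpi=off*) acpi_state="disabled via kernel parameter" ;;
  *)          acpi_state="enabled (kernel default)" ;;
esac
echo "kernel $(uname -r): ACPI $acpi_state"
```

From there, `dmesg | grep -i acpi` (root may be required on hardened systems) shows what the firmware tables looked like to the kernel at boot.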
ACPI is one of Linux's primary warts, but it depends on your hardware selection. Is it a problem? Absolutely. But I'd also point out that Microsoft used to publish hardware compatibility guides for NT and its various successors. :)
Common hardware goes a long way toward improving your experience with Linux, but it's no replacement for researching one's selection. Linux may be condemned to being a hobbyist's operating system (on the desktop, at least), but some of us do use it every day with no ill effects!
Agreed. I started ~2005 with Gentoo as my desktop (currently using Arch). Most of these complaints are no longer valid. Others (OpenOffice, Gimp) are a matter of taste/needs.
I use the non-preemptive (server) scheduler on my Linux desktop machines, with fvwm as my long-time favorite WM. At work: RHEL with dwm, which I'm starting to love; I'm also forced to use Windows 7. Boy, this MS sh*t drives me mad. The focus stealing alone is enough to make me go berserk. The slowness, the dead keys, the placement of the windows, the lack of virtual desktops...
Well, running heavy multiprocess pipes (netpbm, imagemagick, tesseract, TeX) in the background does not slow down the graphical apps, mainly emacs, firefox, gimp, feh, etc. Even watching a video with mplayer is mostly free of freezing. The distro is Gentoo, kept as clean and lean as possible. And this on 10-year-old Opteron PCs.
The year of Linux on the desktop was somewhere around 2009 (whenever we got multiple monitors working for the most part), but nobody noticed, because we had all switched to laptops. The year of Linux on the laptop will probably be a few years after we all stop using laptops.
I have a laptop, running Linux Mint, which uses multiple monitors through a docking station. Since when does using a laptop preclude having multiple monitors?
And how would people possibly stop using laptops? What are they going to switch to? Tablets? Don't make me laugh. Maybe they'll switch to using "laptops" with VR goggles.
I think you're misinterpreting my comment. Pretty much the last reasonably common piece of desktop hardware that Linux had trouble supporting was multiple monitors, which was largely a solved problem by 2009. As of then, you could use Linux on a desktop and get all the functionality you had on other OSs.
By contrast, suspend and Wi-Fi on laptops have been a perpetual headache, and suboptimal GPU utilization is a common cause of battery life issues on Linux laptop setups. Furthermore, the peculiarities of laptop keyboards, webcams and touchpads are often incompletely supported; generally only the most common laptop models get support for the top-row keys before they're obsolete.

These problems all have a common root: installing an OS on a laptop is fundamentally harder than on a desktop and often requires some intervention/software from the OEM. That, in turn, stems from the more idiosyncratic nature of laptop hardware, which is necessary to exploit the small form factor as efficiently as possible. However, because OEMs are forever tied to the commercial enterprise market, the patent encumbrance of some popular Linux software creates big legal problems for any manufacturer who wants to ship a laptop with the kind of hardware support usually reserved for Windows laptops. The result is that only huge companies (Dell, IBM) and a few dedicated "ideological businesses" like System76 ship Linux on the laptop, and the preinstalled OS often requires significant intervention from the user to do tasks that are mundane on Windows, like playing mp4s or Flash, or watching Netflix. These factors combine to stymie the deployment of quality Linux laptops at a reasonable price point, despite the apparent technical and economic feasibility of the project.
>By contrast, suspend and Wi-Fi on laptops has been a perpetual headache
I completely disagree about WiFi. I haven't had problems with that on Linux in many years. Maybe you're buying some really crappy WiFi chips, but I've done fine with both Broadcom and Intel. I'd stick to Intel though.
>and suboptimal GPU utilization is a common cause of battery life issues on Linux laptop setups
I'd say that GPUs are the perpetual headache on Linux with regard to hardware support, more than anything else. All the problems with getting Linux to run on desktop/laptop machines have usually come down to poor GPU support for many years now.
>Furthermore, the peculiarities in laptop keyboards, webcams and touchpads are often incompletely supported; generally only the most common laptop models will get support for the topbar keys before they're obsolete.
I haven't had any trouble with this stuff for ages. It all "just works" on Latitudes and Thinkpads. I don't do anything special; I just install the latest Linux Mint and it just works.
>and the preinstalled OS often requires significant intervention from the user in order to do tasks which are mundane on Windows, like playing mp4s or Flash, or watching Netflix.
Watching Netflix hasn't been a problem for a while now. All you have to do is install Google Chrome and use that.
The big problem, outside of questionable conduct from MS, is the lack of stability surrounding the userspace plumbing. The DEs and their friends lower in the stack can't stop CADTing the APIs etc, and that scares away third parties.
Sales are tiny -- probably around 8 million a year -- and the majority are in the education market. (This is why sales are hard to track.) This isn't a lot for a product launched in 2011.
I'll be delighted if you can find an authoritative figure.
I think the recent years of TLA over-reach has not helped much, because the major sales pitch of the Chromebooks is that everything is stored (backed up?) by Google.
> You set up Windows for grandma, installed AOL, and then came back once a month to scrape the inevitable viruses and malware out of her computer until the box was dead, and then you bought a new desktop with more Windows on it
Arguably that's a place where Linux has won on the desktop; all my family uses is the browser, sometimes LibreOffice Writer and Skype. So I'm guessing that, like me, many other developers installed Ubuntu or another user-friendly distro for the family, so the maintenance described here is down to 0.
I've yet to find a reason to switch to Linux as a user of computers when I could instead use a Mac. I get most of the Unixy goodness along with a sensible structure for my personal documents and applications.
Oh, and the dock doesn't suck / require reboots / break constantly / get banished from the bottom of the screen by fiat of a crazy distro owner, and installing other apps doesn't mean I get the same ugly lack of aesthetics as beta-era Swing.
Even when installed on a desktop, Linux is not a desktop system but remains a server OS at heart. This makes it extraordinarily powerful for any user who wants to keep control of their software and system.
For folks like me, there is no big point in distinguishing between server and desktop. But yes, this view does not hold for the mainstream/non-technical IT user.
This is pretty much why I switched from Mac. The main apps I use are a terminal emulator and web browser. Yes it's not going to be as pretty, but in terms of work it's a lot more productive.
Not really. The author identified the real reason before that. Users wouldn't switch because most users don't even think about the choice of OS, which switching would require. I.e. they only ever use what comes pre-installed on the computer, and that's it. They don't much care what it even is.
Therefore the main blocker wasn't bad advocacy or anything of the sort, but the old and crooked stranglehold of MS on manufacturers. The only way to seriously increase Linux desktop adoption is for the availability of computers with Linux pre-installed to rise significantly. And MS uses its heavy leverage over Windows pricing to prevent that.
> and the desktop itself became irrelevant
I hear this often, but it remains as wrong as ever. Desktop usage has its place, and it's not going anywhere in the foreseeable future.
> They still just want something that plays games and checks the Internet.
And Linux is already good for that. So, as above, that's not what prevents its adoption.