I assumed (perhaps incorrectly) that they meant non-macOS Apple hardware servers.
Also, this was never intended for people using an old laptop 'as a server' anyway. Of course that works, and people doing that don't need the enterprise features of a server-specific version of the OS.
This. Hell, most corporate users are still using Windows 7 (I try to avoid working somewhere I can't have a Mac, but despite Windows 8 having been released in 2012 and 10 in 2015, I have never seen either in a corporate environment).
It is best to make your own decisions there based on some telemetry tbh. It will likely depend on your demographics but also upgrades may depend on your business model. For instance, you may be motivated to upgrade if you realize users who are on older iOS versions don't generate revenue.
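For example, a rough sketch of what that telemetry check could look like (the record shape and field names here are just assumptions for illustration, not a real analytics API):

```typescript
// Sketch only: record shape and field names are made up for illustration.
interface SessionRecord {
  osMajorVersion: number; // e.g. 12 for iOS 12.x
  revenue: number;        // revenue attributed to the session, in cents
}

// Share of total revenue coming from sessions below a candidate minimum OS version.
function revenueShareBelow(sessions: SessionRecord[], minVersion: number): number {
  const total = sessions.reduce((sum, s) => sum + s.revenue, 0);
  if (total === 0) return 0;
  const below = sessions
    .filter((s) => s.osMajorVersion < minVersion)
    .reduce((sum, s) => sum + s.revenue, 0);
  return below / total;
}

// If users below, say, iOS 12 account for well under 1% of revenue,
// dropping support for older versions becomes an easier call.
```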
I mean, there's a time and a place for speed reading. Part of being good at speed reading is identifying the areas you need to slow down for and pay attention to.
Another aspect of writing code is to think about future speed readers. Can your code be skimmed and understood on a cursory level?
Of course it will have to be read; what makes you think you can guess what people will have to read? This seems like such an anti-pattern: instead of writing readable code that lives inside the domain where it's used, you make an unnecessary abstraction. This is exactly how redundant complexity gets made, and you wind up with 16 abstraction layers, each used just once, and a completely unreadable mess.
Making things into units and making things into generalities are two very different things. Introducing a new generality never lessens the mental load by itself; it is both a source of bugs and an obstacle to reading, though of course sometimes the tradeoffs are worth it. It's just that your solution to reading seems to imply that generalization as such is helpful, and I think it's the main cause of redundant complexity.
I think you're too stuck remembering the bad implementations of generality to understand what the original poster is saying. Something like a cache is general and most likely will need to be used in multiple places. If I have a user cache and an application cache, it wouldn't hurt to build both off a general cache. That way I don't have to understand two different cache implementations that are supposed to do the same thing. Now when a new cache implementation is made in a particular domain, there will be no need to review the code that does the caching since developers should already be familiar from working with it in the past. I don't see how it could be "both a source of bugs and an obstacle to reading" unless done improperly.
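A minimal sketch of that idea, assuming a simple in-memory cache (all type and function names here are illustrative, not from the thread):

```typescript
// One general cache implementation, reused per domain.
interface User { id: string; name: string }
interface Application { id: string; title: string }

class Cache<K, V> {
  private store = new Map<K, V>();

  constructor(private load: (key: K) => Promise<V>) {}

  async get(key: K): Promise<V> {
    const hit = this.store.get(key);
    if (hit !== undefined) return hit;
    const value = await this.load(key); // cache miss: fetch and remember
    this.store.set(key, value);
    return value;
  }

  invalidate(key: K): void {
    this.store.delete(key);
  }
}

// Stub loaders standing in for real data sources.
const fetchUser = async (id: string): Promise<User> => ({ id, name: "example" });
const fetchApplication = async (id: string): Promise<Application> => ({ id, title: "example" });

// Both domain caches share one implementation, so there is only one
// caching behaviour to understand, test, and review.
const userCache = new Cache<string, User>(fetchUser);
const appCache = new Cache<string, Application>(fetchApplication);
```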
While it is important to separate concerns where applicable (say, user-related functionality versus caching), generalization most of the time just adds complexity.
jQuery promotes bad development patterns, AND brings a cost for end users. Developers will do themselves and future devs a favor by not choosing to use it. Is it really gatekeeping to expect people to be aware of commonly used 10-year-old features?
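For reference, a quick sketch of the kind of long-standing native APIs that cover most everyday jQuery usage (the selectors and URL are placeholders):

```typescript
// $('.item') -> native selector API
const items = document.querySelectorAll<HTMLElement>(".item");

// $(el).addClass('active')
items.forEach((el) => el.classList.add("active"));

// $('#save').on('click', handler)
document.querySelector("#save")?.addEventListener("click", () => {
  console.log("clicked");
});

// $.ajax / $.getJSON -> fetch
fetch("/api/items")
  .then((res) => res.json())
  .then((data) => console.log(data));
```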
I assume developers who share YMNNJQ are just sick of inheriting messy projects built with jQuery. I know I am.
You've never inherited a terrible, unmaintainable React or Angular app? I have on multiple occasions, and in fact my team is dealing with one right now. I'd prefer a jQuery mess to an Angular/React mess any day of the week. With old-school jQuery apps, at least I can get them to build and can debug the control flow.
A couple of years ago I inherited a client's static site that was written in React before Next.js took off. It is a real accomplishment to make a static site as unnecessarily unmaintainable and convoluted as that one was.
Part of my screening process is making sure something like that doesn't fall in my lap again. Even static sites aren't safe these days.
If you do web development as a job, it's your responsibility to stay up to date on what best practices are. With ES2015 being several years ago now, jQuery wouldn't be a learner's starting point today.
Those that developed bad habits during jQuery's heyday a decade ago may still be writing bad code. And it's someone else's job to clean that up. That's the circle of life, and it's hardly exclusive to jQuery users.
The article explains why jQuery doesn't promote bad development patterns. That is a consequence of bad development and a misunderstanding of what jQuery is.
Could it be used as a Linux desktop environment, i.e., as an alternative to Gnome?
I would love to be able to contribute improvements to the desktop environments I use, but I don’t have the time to learn languages that aren’t applicable to my daily work.
Yea, far more web developers know PHP than Ruby or Python.
Python devs don't necessarily know web development, and Ruby is pretty old-school (I don't mean that in a bad way, it's just not what they teach these days and not as applicable outside of web development).