
That's already been the reality for most of this century. OpenJDK, Go, Rust, Docker, npm/yarn, etc. all provide up-to-date Debian, Red Hat, etc. packages for what they offer. There's zero advantage to sticking with the distribution specific versions of those packages which are typically out of date and come with distribution specific issues (including stability and security issues).

Debian's claims of adding value in terms of security and stability on top of those vendor-provided packages are IMHO dubious at best. At best, they ship security patches with significant delays while trying to keep up with their stable release channels. At worst, they botch the job, ship fixes years late, or introduce new bugs while repackaging the software (I've experienced all of that at some point).

When it comes to supporting outdated versions of e.g. JDKs, there are several companies specializing in exactly that, which actually work with Oracle to provide patched, tested, and certified JDKs (e.g. Amazon Corretto, Azul, or AdoptOpenJDK). Of course, for Java, licensing the test suite is also a thing. Debian is probably not a licensee, given the weirdly restrictive licensing for that. Which implies their packages don't actually receive the same level of testing as the aforementioned ways of getting a supported JDK.

On development machines, I tend to use things like pyenv, jenv, sdkman, nvm, etc. to create project-specific installations. Installing project-specific stuff globally is just unprofessional at this point and completely unnecessary. Also, aligning on the same versions of runtimes, libraries, tools, etc. with your colleagues on Mac, Windows, and miscellaneous Linux distributions is probably a good thing, especially when that also lines up with what you are using in production.
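For illustration, each of these managers reads a small pin file in the project root. The file names below are what the tools actually look for; the version numbers are made up for the sketch:

```shell
# Per-project version pins, picked up automatically by the respective
# managers when you enter the project directory (versions are examples).
echo "3.11.8" > .python-version      # read by pyenv
echo "20.11.1" > .nvmrc              # read by nvm (activated via `nvm use`)
echo "java=17.0.10-tem" > .sdkmanrc  # read by sdkman (activated via `sdk env`)
```

Checking these files into version control is what keeps the whole team, plus CI, on identical versions.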

Such development tools of course have no reason to exist on a production server. This is why Docker is so nice: you pre-package exactly what you need at build time rather than installing run-time dependencies just in time at deploy time and hoping that will still work the same way five years later. A clean separation of infrastructure deployment and software deployment, and understanding that these are two things that happen at separate points in time, is core to this. Debian package management is not appropriate for the latter.

Shipping tested, fully integrated, self-contained binary images is the best way to ship software to production these days. You sidestep distribution specific packaging issues entirely that way, along with all of the subtle issues that appear when those distributions are updated. If you still want Debian package management, you can of course use it inside a Docker image.
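As a sketch of that build-time packaging, a minimal Dockerfile might look like this (the base image tag and file paths are hypothetical):

```dockerfile
# Everything the service needs is resolved at build time, not deploy time.
FROM eclipse-temurin:17-jre

# The application and its vendored dependencies are baked into the image.
COPY build/libs/myapp.jar /opt/myapp/myapp.jar

ENTRYPOINT ["java", "-jar", "/opt/myapp/myapp.jar"]
```

The resulting image is the single artifact you test and promote through environments; the host OS then only needs to provide a container runtime.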



> That's already been the reality for most of this century. OpenJDK, Go, Rust, Docker, npm/yarn, etc. all provide up-to-date Debian, Red Hat, etc. packages for what they offer. There's zero advantage to sticking with the distribution specific versions of those packages which are typically out of date and come with distribution specific issues (including stability and security issues).

The advantage is the very reason one would choose Debian to begin with — an inert, unchanging, documented system.

A large part of this problem seems to be that users somehow install a system such as Debian whose raison d'être is inertia, only to then complain about the inertia, which makes one wonder why they chose this system to begin with.

> Debian's claims of adding value in terms of security and stability on top of those vendor-provided packages are IMHO dubious at best. At best, they ship security patches with significant delays while trying to keep up with their stable release channels. At worst, they botch the job, ship fixes years late, or introduce new bugs while repackaging the software (I've experienced all of that at some point).

Evidently they add value in terms of stability, but methinks many a man misunderstands what “stable” means in Debian's parlance. It does not mean “does not crash”; it means “is inert, unchanging”, which is important for enterprises that absolutely cannot risk that something stop working on an upgrade.

> Shipping tested, fully integrated, self-contained binary images is the best way to ship software to production these days. You sidestep distribution specific packaging issues entirely that way, along with all of the subtle issues that appear when those distributions are updated. If you still want Debian package management, you can of course use it inside a Docker image.

Not at all for the use case that Debian and RHEL attempt to serve — these are systems that for good reasons do not fix non-critical bugs but rather document their behavior and rule them features, for someone might have come to rely upon the faulty behavior, and fixing it would break such reliance.


That's why most shops deploy Docker containers: it's not convenient at all for them to have Debian, Red Hat, etc. repackage the software they deploy or be opinionated about what versions of things are supported. For such users, the OS is just a runtime, and it just needs to get out of the way.

Ten years ago, we were all using Puppet, Chef, and whatnot to customize our deployment infrastructure to run our software. That's not a common thing anymore for a lot of teams, and I have not had to do anything like that for quite some time. A lot of that work, by the way, involved working around packaging issues and distribution-specific or distribution-version-specific issues.

I remember looking at the Puppet module for installing ntp once and being horrified at the hundreds of lines of code needed to run something that simple, because of all the differences between platforms. Even simple things like going from one CentOS version to the next were non-trivial because of all the automation's dependencies on stuff that had changed in some way (I remember doing the v5-to-v6 migration at some point). Dealing with madness like that is a PITA I don't miss at all.

There's definitely some value in having something inert and unchanging for companies that run software for a long time. Pretty much all the solutions I mentioned have LTS channels; e.g. if you want Java 6 or 7 support, you can still get that. And practically speaking, when that support runs out, I don't see how Debian would be positioned to provide it in any meaningful way. The type of company caring about such things would likely not be running Debian but some version of Red Hat or something similarly conservative.


> It does not mean “does not crash”; it means “is inert, unchanging”, which is important for enterprises that absolutely cannot risk that something stop working on an upgrade.

But would enterprises accept being forever stuck with any bugs that aren't security related? Even RHEL backports patches from newer kernels while maintaining kABI.


We're talking about entities that run COBOL code from the '60s and are too afraid to update or replace it, for fear that something break.

There's a reason why most enterprise-oriented systems take inertia quite seriously — it is something their customers, who lose considerable capital on even minor downtime, greatly desire.


> Debian's claims of adding value in terms of security and stability on top of those vendor-provided packages are IMHO dubious at best.

That’s not true. The idea is that the distribution is tested and stable as a whole, and replacing something like OpenJDK can cause a lot of breakage in other packages.

There is a reason why enterprise distributions provide support only for the limited set of packages that they ship.


It depends. If you install a statically linked version from a third party, it won't create many headaches. That is kind of the point of vendoring and static linking: not making too many assumptions about what is there and what version it is. It works great at the cost of a few extra MB, which in most cases is a complete non-issue for the user.

Debian self-inflicts this breakage by trying to share libraries and dependencies between packages. That both locks you into obsolete stuff and creates inflexibility. Third parties actively try not to have this problem. Debian is more fragile on this front than it technically needs to be.

That is kind of the point of the article: whether or not to vendor is a hot topic within Debian exactly because of this.


> There's zero advantage to sticking with the distribution specific versions of those packages which are typically out of date and come with distribution specific issues (including stability and security issues).

Uh, other than "apt install foo" versus "ok, let's go search for foo on the internet, skip that spam listing that Google sold ad space to, ok no I am on foo.net, let's find the one that corresponds to my computer…yeah amd64 Linux, rpms? no, I want debs…download, dpkg -i…oh wait I need libbar".



