... from the "what can we learn from Windows" dept.
Application Binary Interfaces, that is. Today's article is about the question of... should you be able to just ship compiled versions of your application and count on the OS to... just run them? (... admittedly, this involves more than "ABIs of libraries"; it's also about distributions. But... the general concept is similar.)
By the way, if this entire question sounds confusing to you, you probably haven't been exposed to Linux enough.
One of my formative experiences with this topic was when I was trying to run Veles, the binary analysis tool, on a Linux laptop. (It's really cool, by the way; you can turn any binary file into a 3D cube and analyze what's in it just by looking at patterns!)
They have a Linux, 64 bit (Ubuntu 16.04) package... and a Windows package.
Any guesses on which one was easier to get working on my Linux, 64-bit (Debian) laptop?
(Hint: the process involved installing Wine.)
... but binaries are stupid!
The thing is that... Windows is vastly better than Linux at keeping old binaries running. (I don't know the BSDs well, but... they seem to be even bigger fans of "compile from source"?)
Of course, this is not supposed to be a problem, because this is just not the way Linux is supposed to work. Which is... just... either compile from source yourself, or (more likely) wait for someone to package it up for your distribution and provide binaries. Once the distribution itself changes, they'll just recompile your code; as long as sources are available, it's all good! It's only Evil Proprietary Software people who need to worry about binary compatibility... since they can't recompile.
So... if you want to run a Linux binary on a different Linux system, you're just... doing it wrong.
This is exactly why no one is using Snap, Flatpak or AppImage; they're solving problems that... oh wait.
It's all supposed to be just
$ make install
Well, after you install all the dependencies.
Including the dev packages.
(Unless, of course, some of them are the wrong version.)
But Gentoo, Guix, Nix etc. do the recompiles seamlessly! Just run emerge and... well, the thing is, you still need to put in the work as a packager to figure out how to compile these packages.
The fact that you're compiling them yourself doesn't change the fact that there needs to be someone for each distro and each piece of software who figures out how to compile that particular package for that particular distro.
This is work that just shouldn't exist... or, at least, that we should try to minimize. Yes, it's technically possible to do, but by the same line of reasoning we could convince everyone that compilers are useless and that we should just write everything in assembly, for each architecture separately, because there just aren't that many kinds of CPU anyway.
It's just that Linux people value binary compatibility less, so not much effort goes into making it better.
Compare this to Windows and COM; the latter is basically just a standardized way of doing vtables, and generally of working around the fact that C++ doesn't have a reasonably stable binary interface. As a side effect, you can use any programming language you want!
Even for things like Python, which is supposed to work the same across platforms: for Windows apps, you just bundle the whole Python runtime (there is no one "system" version anyway, and you can ship actual binaries!), while on Linux you rely on the package manager, which involves less bloat but breaks as soon as the package you want wasn't built for your exact distribution... thanks to Python's similarly bad binary compatibility story.
At this point, I'm not entirely sure how to fix this. (Apart from shipping statically linked binaries, which works because the kernel itself does have a stable ABI!) I do think it should start with the general goal though... that it should be an actual objective to have stable ABIs and cross-distribution shippable binaries. We might just come up with the actual technical solutions later.