crab

@blacklight With the sudden popularity of things like Nix and the development of built-in container support in systemd I do believe there's a better future for Linux dependency management here, I just hope we'll actually get there and won't spend another decade stuck in Docker hell.

That just leaves the problem of GUI toolkits being so bad that everybody would rather use Electron.

Fabio Manganiello

@operand You've raised some valid points, but I feel like some of them require a bit more analysis. For example:

> you have to defer to 50 different library versions packaged by 50 different distros who may apply 50 different patches to your software and potentially take months to get a new version through.

I can speak from first-hand experience here. I'm the main developer of Platypush, which has hundreds of optional integrations and add-ons. Every time I push out a new release, a CI/CD pipeline does the following:

1. Builds a .deb package that targets Debian stable/oldstable, one that targets the current Ubuntu LTS, an .rpm that targets the current Fedora release, and a PKGBUILD that targets Arch.

2. Spawns a container for each of these distros, installs the packages, starts the app and runs all the tests (of course that means having high test coverage).

3. If all is green, then packages are automatically uploaded to the relevant repos.

Is it ideal? Definitely not - I'm one of those folks who has been waiting for an end to the fragmentation problem on Linux for two decades. But today there are ways to at least mitigate it: in my case, 2-3 weeks invested in building a CI/CD pipeline that produces packages for ~90% of the most common distro configurations out there, and it's fine if the remaining ~10% build from source, run a container or install a Flatpak. It's unlikely that I'll have to touch that pipeline again in the future. It's not ideal, but it's also not impossible to get a piece of software packaged for the most common distro configurations - not to the point where we have to give up entirely on solving the dependency-sharing problem on bare metal, anyway.

> Luckily containers are getting better at build times, sharing dependencies and file sizes nowadays

I see some progress in those areas indeed, but it literally took years to go from "let's start wrapping everything into containers" to "let's figure out a clever and standardized way to avoid replicating the same Alpine/Ubuntu base dependencies across dozens of containers on the same box". Layers are already a step in that direction, but I'd love it if the burden of layering and managing shared dependencies wasn't put on the app developer/packager.

> the development of built-in container support in systemd

I love systemd-nspawn. I already migrated many of my production Docker containers to systemd a while ago and I haven't looked back (many of them were started through a systemd service anyway, so I've removed a pointless intermediary). I also see a lot of potential in Podman. But I would love to see a solution that isn't bound to Linux+systemd. The unfortunate reality is that most of the devs out there use macOS, and Docker became so popular because it lets them easily build, run and test on the fly an Ubuntu or Fedora base image that runs their software in the same environment as production, directly on top of their hardware or even in an IntelliJ tab, without having to configure and run CentOS 6 VMs like many of us used to do until a few years ago. A new container system must be just as Mac/Windows-friendly as Docker currently is to succeed, or many developers will just think along the lines of "now I have to install a Linux VM again just to run my containers".
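As a concrete illustration of that kind of migration, Podman's "Quadlet" units (Podman >= 4.4) let systemd own the container lifecycle directly, with no separate daemon in between. A minimal sketch, with a placeholder image and port - drop a file like this into `~/.config/containers/systemd/myapp.container` and systemd generates the service for you:

```ini
[Unit]
Description=Example app, managed by systemd instead of the Docker daemon

[Container]
# Placeholder image and port mapping - substitute your own.
Image=docker.io/library/nginx:stable
PublishPort=8080:80

[Service]
Restart=always

[Install]
WantedBy=default.target
```

After a `systemctl --user daemon-reload`, the container starts, stops and logs like any other systemd unit.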

> That just leaves the problem of GUI toolkits being so bad that everybody would rather use Electron.

I still believe that a Web-based (or Web-like) solution is the best way out there. There are plenty of JavaScript developers, but a skilled Qt/Gtk developer is as rare as hen's teeth - and there's a reason: those frameworks are hard to learn and even harder to master.

The main problem is that JavaScript hasn't grown organically: it has grown with a bunch of frameworks thrown at every problem and the ECMAScript standard trailing behind, to the point that it basically has no standard library for anything (even parsing a query string or a cookie, or making an HTTP request), and everything is solved with fat node_modules folders full of frameworks that reinvent the wheel hundreds of times.

Electron would have no reason to exist in a world where building a hello-world Web app were doable with a few lines of vanilla JavaScript. That was possible 20 years ago, and it was still possible 10 years ago (you just had to add a <script> tag for jQuery), but it's no longer the case now.
