If you still think that distribution #packaging is not necessary and upstreams should be doing that, let me remind you that downstream packagers are the ones who report that your latest release doesn't actually include the changes from your latest release:
The killer feature in this version is support for GIT_CRATES. Please test and let me know how it works for you (yeah, I know you can't because of workspace package data, sigh, that's coming up next).
It took some effort, but it can guess the fancy paths for GIT_CRATES, remove the variable if it's stale, or add one if it's missing.
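For reference, a GIT_CRATES entry in an ebuild looks roughly like this (a sketch from memory of the cargo.eclass format; the crate name, repository, and commit below are made-up placeholders):

```shell
# Hypothetical example: keys are crate names, values are
# "<git repository>;<commit>;<build path>", with %commit%
# substituted by the eclass when fetching.
declare -A GIT_CRATES=(
	[mycrate]='https://github.com/example/mycrate;0123456789abcdef0123456789abcdef01234567;mycrate-%commit%'
)
```

These are the "fancy paths" that the tool tries to guess from the crate metadata.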
Is there a tool that would tell me if I'm going to lose any followers/follows when migrating to another #Mastodon server? In other words, something that would check whether the destination server defederates any of the servers my followers and follows are on.
I ran out of ideas, so it's time to publish v1.0.0. That said, I consider this an open-ended article, with more cases possibly being added in the future.
"Problems faced when downstream testing Python packages"
I've pushed the #cargo.eclass optimizations to #Gentoo now, and I've released #pycargoebuild 0.7 that takes advantage of them!
Note that the `-i` mode will update ebuilds to use the `@` separator in the CRATES list, but it will not replace `$(cargo_crate_uris)` with `${CARGO_CRATE_URIS}` for you; that one you have to do manually.
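The remaining manual step can be done with a one-liner; this is just a sketch, with `my-pkg.ebuild` standing in for a real ebuild:

```shell
# Create a stand-in ebuild line to demonstrate the edit.
printf 'SRC_URI="$(cargo_crate_uris)"\n' > my-pkg.ebuild
# Swap the old function call for the new variable reference in place.
sed -i 's/$(cargo_crate_uris)/${CARGO_CRATE_URIS}/' my-pkg.ebuild
cat my-pkg.ebuild   # now: SRC_URI="${CARGO_CRATE_URIS}"
```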
I'm looking for an example of "non-obvious" timeout-related test failure in a #Python package. That is, something that initially doesn't look like a timeout at all but after deeper investigation turns out to be caused by some operation taking longer than expected.
Whenever I see human-operated railway stations, I feel a bit nostalgic. While I don't feel that there's a real need to replace these with automation, at the same time I realize these people could be doing something other than being stuck at a post for hours.
We definitely have the technical means to replace people with machines there. What we're lacking are sociopolitical means of letting people enjoy their life without working.
Unpopular opinion: planned obsolescence is not only pursued by corporations that benefit from it. It's also pushed forward by unaffiliated #OpenSource developers.
If you write inefficient and resource-hungry software (because "your time is valuable" and "space is cheap"), you push it forward. If you choose a newfangled programming language with portability issues, you push it forward. If you straightaway reject support for old hardware, you push it forward.
When everyone around you is like "you're obsolete, #Flatpak is the thing, #snap is the thing, upstream #packaging is the thing"…
You can just shrug and tell them you don't care. You're not on corporate payroll, you don't have to do whatever cool kids do, you don't have to make IT worse for easy profit or push forward the agenda of planned obsolescence.
You just do things the way your users want them done. Uncool kids are important too.
@mgorny I've been happily using #Gentoo for years now, and in my experience it is actually a lot more stable than more 'user friendly' distros.
Sure, it takes a while to get a system running (depending on the amount of optimization), but once it runs, it has felt really stable, as everything is built from source, against its dependencies, instead of living on its own island.
Also, underdogs do sometimes become the cool kids at some point :)
Rewriting everything in #RustLang improves quality, right?
I suppose that's why I've spent the whole morning figuring out cryptography regressions, just to discover they are due to a random function being rewritten from #Python to Rust:
Is there better work to spend your time on than adding support for a pointless rename of a #NIH #PEP517 backend that was used by fewer than a dozen packages? The first one (by the backend's author, of course) just made a release purely for the purpose of switching to the rename, while the rest still block on the old package.
#AnyIO may sound like a good idea at first -- supporting multiple backends with no extra effort. However, from Linux distribution perspective it means "this package is now blocked on #trio", and well, trio's not very actively developed, it's not handling new #Python versions well, and it tends to get pinned to old releases, on top of everything.
It's like that "two for the price of one" sale when you only need one and the expiration date is short.
If anyone still has any doubt that using the Internet in #Python package tests is bad, here's a new example: two packages failing because they resolved some domains, and the DNS records changed (probably).
#PyPy 3 support in #Gentoo has been limited to one variant, 3.9 right now, about to change to 3.10. I don't think supporting multiple variants permanently is worth the effort, especially since PyPy is barely catching up with our supported CPython versions (3.10 being the lowest right now). However, I think supporting multiple interpreter versions (i.e. pypy3.9 and pypy3.10) for use with venv may make sense.
A wild boar is considered dangerous, and some parents would rather not see their children in forests. A female boar may weigh 95 kg and run at 40 km/h. At impact, that's a kinetic energy (E=mv²/2) of 5.9 kJ. The probability of being attacked is very small.
An average car weighs around 1.5 t, and drives in urban areas at 50 km/h. That's 145 kJ of kinetic energy. Cars are everywhere, and accidents are frequent.
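The two figures above follow directly from E = mv²/2; a quick sketch to check the arithmetic:

```python
# Check the kinetic-energy figures from the post, E = m * v**2 / 2.
def kinetic_energy_kj(mass_kg: float, speed_kmh: float) -> float:
    """Kinetic energy in kilojoules for a mass moving at the given speed."""
    v = speed_kmh / 3.6  # convert km/h to m/s
    return mass_kg * v ** 2 / 2 / 1000  # joules -> kilojoules

print(round(kinetic_energy_kj(95, 40), 1))   # boar: 5.9 kJ
print(round(kinetic_energy_kj(1500, 50)))    # car: 145 kJ
```

So the car carries roughly 25 times the kinetic energy of a charging boar.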
@mgorny While I agree on the whole that we should strive to limit car use and focus more money on public transport infrastructure, I'm not sure why you picked a boar for the comparison. Is something going on in Poland that makes it relevant?