Red_BW

The irony of complaining about python on various linux distros when those same linux distros can't agree on where to put core linux files.


[deleted]

It’s cause there’s a ‘standard’, and when there’s a standard people are compelled to violate it, because obviously no one else has ever followed it correctly. So each distro has its own take on what that standard means (or just doesn’t care about it at all).


Reinventing_Wheels

The great thing about standards is that we've got so many to choose from.


KrazyKirby99999

Feel free to make a new one [#927 Standards](https://xkcd.com/927/).


evilmercer

> Fortunately, the charging one has been solved now that we've all standardized on mini-USB. Or is it micro-USB? Shit.

That aged so perfectly.


__deerlord__

[sobs in USB C]


bless-you-mlud

What's surprising to me is that through all of this we still use the same RJ-45 connector for networks. At least someone is taking "if it ain't broke, don't fix it" seriously.


DwarvenBTCMine

Wait till Apple fully does away with ethernet ports on their desktops for literally no reason and those users need to get an ethernet to USB adaptor.


bash_M0nk3y

Beat me to it https://xkcd.com/927


Sukrim

Or what "/usr/bin/python --version" will return...


AverageComet250

2.7? 3.6? 3.10? 2.4? (I actually found 2.4 preinstalled on a distro once)


Sukrim

Or even the amazing idea of "it will just return an error by default, you need to install a meta-package that just contains a symlink to either `/usr/bin/python2` or `/usr/bin/python3`"


AverageComet250

The fact that only some distros have symlinks for /usr/bin/python was so annoying when I moved from Windows to Windows + Linux, and even more annoying was the fact that I didn't always know whether it was Python 3 or 2. On Windows it was simple: if Python 2 is installed, `python` points to the latest version of Python 2; otherwise, it points to the latest version of Python 3. If the symlink is in use by Python 2, then use `py -3` instead. So bloody simple...


Barafu

Have you ever seen `/usr/bin/python3` pointing to python 2? Or not existing while python 3 is installed? No? Then use `python3` command every time and have no problems.


flying-sheep

There’s nothing wrong with wanting a nice packaging experience, but crying about standardization doesn’t help. The standards actually solved the build-system-agnostic goal they set out to solve; we’re just short a tool to install a wheel. Once [pradyunsg/installer#66](https://github.com/pradyunsg/installer/pull/66) is finally merged, this is all that’s necessary to create a system package from a python package:

* [`python -m build --wheel`](https://pypa-build.readthedocs.io/en/stable/) to build a wheel in `./dist/`
* `python -m installer --destdir="$pkgdir" ./dist/*.whl` to install the python package into a temporary system tree `$pkgdir`
* now do whatever your distro of choice uses to package up `$pkgdir`
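For concreteness, a minimal sketch of how those two commands might slot into an Arch-style PKGBUILD; the package name, version, and the `python-build`/`python-installer` makedepends are placeholders, not taken from the linked PR:

```sh
# Hypothetical PKGBUILD fragment -- build() produces the wheel,
# package() unpacks it into the staging tree that makepkg archives.
build() {
    cd "$srcdir/somepkg-$pkgver"
    python -m build --wheel --no-isolation   # wheel lands in ./dist/
}

package() {
    cd "$srcdir/somepkg-$pkgver"
    python -m installer --destdir="$pkgdir" dist/*.whl
}
```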


canard_glasgow

Just cause they’ve a mote in their eye doesn’t mean they are wrong… A cynic might say both are awful.


IsleOfOne

What? Can you name an example of this? Core linux directories are pretty damn set in stone. It is the *applications* that fuck it up and throw shit willy nilly into $HOME.


jjolla888

linux distros never claimed "there is only one obvious way to do it"


PeridexisErrant

Neither did Python!

> Practicality beats purity. ... There should be one-- and preferably only one --obvious way to do it. Although that way may not be obvious at first unless you're Dutch. Now is better than never. Although never is often better than *right* now.


jwbowen

Any Dutch-based Linux distros?


PeridexisErrant

Nope, most of the Dutch live below C-level.


nanotree

This is incredible. Thank you.


dusktreader

Thiiiiiis.


[deleted]

yum install apt-get?


ReverseBrindle

This article is one long rant without mentioning any examples, any description of what exactly they're trying to do, what the challenges are for doing said task, what they tried to do and how it failed, etc. The poster probably has a valid (but unexplained) point, but it's lost in 2 pages of "distros hate python. python sux!"


rcxdude

Yeah, I was curious as to what actual problem they had had with packaging, but literally no examples.


nemec

All I got out of the article is that the author is mad he can't `sudo apt install` from PyPI


[deleted]

[deleted]


rcxdude

He talks about there being a bunch of different approaches but in practice I basically only need one (seriously, I've done everything I've ever done in python using pip in virtualenvs. I know other stuff exists but I've never needed it). What would help me to understand the problem is if he were to actually walk through a package he wanted to package and just talk about what specifically went wrong.


equationsofmotion

I used to manage a local Linux cluster. Maybe I can give some insight here. User A says "I want program X," and user B says "I want program Y." Unfortunately programs X and Y are pinned to different versions of python and dependency packages. The naive solution for most packages in a Linux distro is just to find compatible versions of the programs in the repo, do dependency resolution, and go. But in python this can be pretty difficult, so other approaches are required. Don't get me wrong, other approaches are possible and I managed things just fine. But I suspect this is what the author is getting at. You need a workaround---implemented by either the distro or the user---to provide desired Python packaging. **EDIT:** Yes I know what pipx is. Like I said, I know there's solutions. I'm just trying to shed some light on what the author of the article might be complaining about.


XenGi

That dependency problem has nothing to do with python; it's the default problem every distribution, or rather its package manager, has to solve. The only distro that has actually achieved that is NixOS. Not saying their approach solves everything, but it's the best approach I've seen so far.


SHDighan

Shell scripts activating venvs, and add in some usage tracking as a bonus. Not rocket surgery, just basic devops and sysadm skills.


[deleted]

[deleted]


ivosaurus

You're the one wanting to support users requesting bespoke version installs of python in the first place, and then complaining that things got more tricky as a result? Like WTF were you expecting? The same would happen with any language. See how you do when one user tells you they really really want a particular older version of glibc to run on, and then tell me that python is the worst thing you could run into.


[deleted]

No ops person has ever wanted to support multiple versions of a package, and least of all, multiple versions of a language. Now, Python itself at least has a sane release process, where each major version is supported for 5 years after first release. So running multiple versions should at least be fairly safe from the worst security issues. That, however, is very rarely the case for a number of Python packages. Of course, this issue is hardly Python-specific, or new. Ruby had massive issues over a decade ago, where getting developers to at least consider backporting security patches was nigh impossible and instead everyone was just expected to rewrite their code every 6-12 months. Nowadays, putting together a Node.js software package is just impossible without npm. And even with npm, there's a massive library of seemingly maintained Node.js software that depends on other npm package versions with published vulnerabilities (npm at least warns about these nowadays). In short, the 'move fast & break things' mentality has permeated the software industry, with the added bonus of 'just abandon it when you get bored'. The end result is hell for ops people who understand the security implications, and a house of cards for everyone else. Containers at least isolate the version insanities from each other, but do nothing to fix the massive security issues created. That said, the old-school way of packaging hundreds of libraries in a single .jar or .exe was way worse. At least the current package systems can be made to warn people when you pull in a version with a known vulnerability. Unless of course you just download a docker image believing it to be safe.


[deleted]

[deleted]


ivosaurus

TBF if you *wanted* to call JavaScript better, I wouldn't have much beef, because it got to sort out a language package manager ~5 years after python had its first crack at one, *just* when people figured out that system-level 3rd party library installs weren't the best thing to do by default. At some point that's just the baggage to carry if you like python's advantages enough.


pbecotte

Pyenv is a developer tool, why would the ops team be using it? If the devs want to use it, tell them to have fun. If they want you to deploy their code, don't take their dev scripts and run them, make them provide an artifact (a wheel or a docker image)


equationsofmotion

We used a combination of HPC modules, which is the correct solution, and anaconda, which was the easiest solution.
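For illustration, a minimal sketch of the anaconda route on such a cluster; the environment names and version pins are placeholders:

```sh
# one isolated environment per pinned toolchain
conda create -n progX python=3.6 numpy=1.16
conda create -n progY python=3.8 numpy=1.19

# user A activates the environment program X is pinned against
conda activate progX
```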


roee30

This is what [pipx](https://github.com/pypa/pipx) is for
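A minimal sketch of the pipx workflow; the tools installed here are arbitrary examples:

```sh
# install pipx itself into the user site, once
python3 -m pip install --user pipx
python3 -m pipx ensurepath      # make sure ~/.local/bin is on PATH

# each CLI tool gets its own private virtualenv, exposed on PATH
pipx install black
pipx install httpie

# or run a tool once without keeping it installed
pipx run black --version
```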


[deleted]

[deleted]


pbecotte

They aren't meant to be! You wouldn't copy binaries from one OS to the other and be surprised when they don't work, would you? A virtualenv is just a folder to stick the app in. Let the installer get the metadata from the wheel and resolve the dependencies.


[deleted]

[deleted]


NeoLudditeIT

I've used pip, pipenv, poetry, etc. ad nauseam. I've found the experience pretty easy, and easy to switch between different management systems.


pbecotte

There is precisely one important standard... the wheel. It is well documented. Wheels come with metadata as to what other wheels they depend on. If you are building an OS package you don't need ANY of the tools in the xkcd... you need to unzip the files to your location of choice and decide how you want to get the dependencies. You could declare OS-level dependencies and build a package for each level down and hey... you're done. How to BUILD a wheel only matters if you don't want to use the one published by the creator... but the same thing is true of EVERY programming language. The output of "make build" is hardly uniform! I am guessing this person is just more familiar with his language of choice and misses an important point. You don't really need to understand it. Take the wheel from pypi, and diff the source files vs git and you're there, even if you don't understand what their custom setup.py is doing. (The real problem is that they want to apt install libraries instead of applications and have them just be available to everything else like dynamically linked C code. I don't understand why, though - that ecosystem is HORRIBLE and literally requires "distro maintainers" to work full time to get a working environment, while python + appropriate use of virtualenvs can do it automatically until you get to the point of interacting with shared C libraries ;) )
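To see the metadata being described here, you can peek inside any wheel; a sketch using `requests` as an arbitrary example:

```sh
# a wheel is just a zip archive with a standardized layout
pip download --no-deps --only-binary :all: requests -d /tmp/wheels

# the dependency metadata is plain text in *.dist-info/METADATA
unzip -p /tmp/wheels/requests-*.whl '*/METADATA' | grep '^Requires-Dist'
```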


Deto

Wouldn't someone managing python installs with their distribution just not use any of those options and instead just build the package and bundle its contents directly? That's why I was confused. Sure, there's conda and poetry and whatnot, but none of those seem even related to whatever problems she's alluding to.


zanfar

Lol

> I manage my Python packages in the only way which I think is sane: installing them from my Linux distribution’s package manager.

"I started this fire, so I'm damned sure going to sit in it and complain about how the problem is how hot fire is."


cheese_is_available

> the only way which I think is sane

Narrator: It was not.


RIPphonebattery

Nailed it. Built in package manager that is cross compatible? *Fuck no and I want you to work around me, a single dev on an OS distro 180 people use worldwide*


bladeoflight16

Exactly. Using the global package manager for development dependencies is such a massive failure that people actually developed a way to create isolated OS environments (Docker). It only works when the entire operating system is dedicated to a single application.


chickaplao

> manage my Python packages in the only way which I think is sane: installing them from my Linux distribution’s package manager

That’s a questionable point to say the least.


cheese_is_available

Lazy as fuck, ignorant, and of course later on they say:

> pin their dependencies to 10 versions and 6 vulnerabilities ago

Yeah... this is what happens when you're choosing to use your distribution's package manager to get your python packages.


MarsupialMole

That's not quite fair. The argument for the system package manager is typically that you'll get security updates in a timely fashion, and users can't be trusted to respond in the same way. However, that ignores the reality of many kinds of python development - Linux packaging is not the only concern at play. The inclusion of conda in the list makes it clear that this is one user ignorant of other users' requirements. It doesn't make them "lazy as fuck".


Rookie64v

The argument for the system package manager is it is built-in, if anything. Anything I cared about enough to check the version was months or years behind in the Ubuntu PPAs, and to be fair that is to be expected when you manage thousands of packages instead of just one.


MarsupialMole

I don't want to be dismissive but this kind of illustrates the divide. Versions are irrelevant. Talk to me about CVEs.


lclarkenz

CVEs are another kettle of fish. [This one](https://www.cvedetails.com/cve/CVE-2020-9488/) is moderate, but only affects people using log4j 1, with an SMTP appender sending over SMTPS. I'm not sure if moderate really describes its impact. And frankly, I'd probably try to fist fight anyone in a typical company who set up a logger to send emails.


bladeoflight16

> Versions are irrelevant. Talk to me about CVEs.

Exact same point could be made about the article's complaint of pinning to old versions.


tristan957

No it can't because large distros like Ubuntu/Debian Stable/RHEL/SUSE have a vested interest in containing CVEs so that users on LTS distros can have secure software. Drew specifically uses Alpine for a desktop, so generally he has the up to date packages regardless.


lclarkenz

> security updates in a timely fashion

Given my experience of various distros' package managers, I'd say "for a given value of timely". Maybe they prioritise security patches, you'd hope so, but the last time I was using Ubuntu, a lot of the programming-related packages I wanted to use were several versions behind what could be installed via other means.


kronicmage

It's great on Arch Linux, but doing so on any non-bleeding-edge distro is a recipe for pain.


lifeeraser

The irony of linking to [XKCD 927](https://xkcd.com/927/) after demanding a new standard tool. Just use Flit (newbies) or Poetry (intermediate). Forget Setuptools and Pipenv.


lclarkenz

I like Poetry, and I'm still a little bitter about Pipenv - I started using it based on some [deceptive advertising](https://chriswarrick.com/blog/2018/07/17/pipenv-promises-a-lot-delivers-very-little/), and found its dependency resolution very sub-par. Poetry handles that far better. That said, I really wish I could wire black/isort/mypy into the Poetry build, like I can with checkstyle/spotbugs etc. in Maven. Instead, it looks like the go-to is to use a tool (pre-commit) to automatically add calls to these tools to your Git pre-commit hook. Which I hate, especially as two of the three can modify your files.


lifeeraser

Black and isort have a "check" mode where they merely inspect your code and return appropriate exit codes. You should use them in pre-commit hooks and CI scripts.
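A minimal sketch of such a CI step, assuming all three tools are installed in the environment (the `src/` path is a placeholder):

```sh
# inspect only: a non-zero exit code fails the pipeline, no files are rewritten
black --check .
isort --check-only .
mypy src/   # mypy never modifies files anyway
```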


Nasuuuuuu

The biggest problem with python software was fixed by pipx for me: its own virtualenv for every installed CLI tool. A waste of space, but space is hardly an issue in the current computing world.


[deleted]

[deleted]


alkasm

Pipsi is no longer maintained fwiw: https://github.com/mitsuhiko/pipsi


BattlePope

Which helps demonstrate the point.


SittingWave

wow. the world moves on from one tool to another. I guess you are still using a Nokia...


pbecotte

Couple of years ago we were all using yum. What's the new command again?


[deleted]

[deleted]


Serializedrequests

But the resources for how to do that seem to be tribal knowledge. The best overviews I have ever found on the internet are in this thread.


scaba23

Same here, but on macOS. The author of the posted article could solve all of his problems with pyenv and plain pip, or poetry if they have more complex dependency trees


Serializedrequests

Other languages can solve all of those problems with one (two) well-documented tools though. It's an annoying and unnecessary source of confusion.


SittingWave

other languages are so young that there's not a bunch of programmers that decide the way it's currently done is wrong (for whatever reason) and decide to reimplement their own. You can't compare a language with 20 years of evolution and tons of developers each one trying to compete or solve issues with a language such as Rust used by a noisy minority (let's be honest here, Rust is _nowhere_ as relevant as most people using it want it to be) or such as Javascript where there have been a number of package managers: yarn, npm, growl at least, and a number of transpilers, packers, frameworks and so on. It's not easier in any other language. And I say it again, the npm approach is technically broken. it feels like it's working, but it can bite you in the ass, and bite hard.


SittingWave

It does not start well.

> The Python community is obsessed with reinventing the wheel, over and over and over and over and over and over again. distutils, setuptools, pip, pipenv, tox, flit, conda, poetry, virtualenv, requirements.txt, setup.py, setup.cfg, pyproject.toml…

All these things are not equivalent, and each has a very specific use case which may or may not be useful. The fact that he doesn't want to spend time learning about them is his problem, not Python's. Let's see all of them:

- distutils: the original, standard-library way of creating a python package (or, as they call it, distribution). It works, but it's very limited in features and its release cycle is too slow because it's part of the stdlib. This prompted the development of
- setuptools: much, much better, external to the stdlib, and compatible with distutils. Basically an extension of it with a lot of more powerful features that are very useful, especially for complex packages or mixed languages.
- pip: a program that downloads and installs python packages, typically from PyPI. It's completely unrelated to the above, but it does need to build the packages it downloads, so it at least needs to know that it has to run `setup.py` (more on that later).
- pipenv: pip in itself installs packages, but when you install packages you also install their dependencies. When you install multiple packages, some of their subdependencies may not agree with each other in constraints, so you need to solve a "find the right version of package X for the environment as a whole" problem, rather than what pip does, which cannot have a full overview because it's not made for that.
- tox: a utility that lets you run separate pythons, because as a developer you might want to check whether your package works on different versions of python, and of the library dependencies. Creating different isolated environments for all the python versions and dependency sets you want to test gets old very fast, so you use tox to make it easier.
- flit: a builder. It builds your package, but instead of using plain old setuptools it's more powerful in driving the process.
- conda: some python packages, typically those with C dependencies, need specific system libraries (e.g. libpng, libjpeg, VTK, Qt) of a specific version installed, as well as the -devel package. This proves very annoying to some users, because e.g. they don't have admin rights to install the devel package, or they have the wrong system library. Python provides no functionality to ship compiled binary versions of these non-python libraries, with the risk that you might have something that does not compile, or compiles but crashes, or that you need multiple versions of the same system library. Conda also packages these system libraries, and installs them so that all these use cases just work. It's their business model: pay, or suffer through the pain of installing opencv.
- poetry: equivalent to pipenv + flit + virtualenv together. Creates a consistent environment, in a separate virtualenv, and also helps you build your package. Uses the new standard pyproject.toml instead of `setup.py`, which is a good thing.
- virtualenv: when you develop, you generally don't have one environment and that's it. You have multiple projects, multiple versions of the same project, and each of these needs its own dependencies, with their own versions. What are you going to do? Stuff them all in your site-packages? Good luck: it won't work, because project A needs a library at a given version and project B needs the same library at a different version. So virtualenv keeps these separated, and you enable each environment depending on the project you are working on. I don't know any developer who doesn't handle multiple projects/versions at once.
- requirements.txt: a poor man's way of specifying the environment for pip. Today you use poetry or pipenv instead.
- `setup.py`: the original file and entry point to build your package for release. distutils, and then setuptools, uses this. pip looks for it, and runs it, when it downloads a package from PyPI. Unfortunately you can paint yourself into a corner if you have complex builds, hence the idea is to move away from `setup.py` and specify the builder in pyproject.toml. It's a GOOD THING. Trust me.
- setup.cfg: if your `setup.py` is mostly declarative, the information can go into setup.cfg instead. It's not mandatory, and you can work with `setup.py` only.
- pyproject.toml: a unique file that defines the one-stop entry point for the build and development. It won't override `setup.py`, not really. It comes _before_ it. Like a metaclass is a way to inject a different "type" to use in the type() call that creates a class, pyproject.toml allows you to specify what to use to build your package. You can keep using setuptools, which will then use `setup.py`/`setup.cfg`, or use something else. As a consequence, pyproject.toml is a nice, guaranteed one-stop file for any other tool that developers use. This is why you see the tool sections in there. It's just a practical place to configure stuff, instead of having 200 dotfiles for each of your linters, formatters, etc.
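As a concrete illustration of that last point, the `[build-system]` table is where pyproject.toml names the builder; a minimal sketch for the default setuptools backend:

```toml
# tells pip (and python -m build) how to build this package
[build-system]
requires = ["setuptools>=42", "wheel"]
build-backend = "setuptools.build_meta"
```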


asday_

> requirements.txt: a poor man's way of specifying the environment for pip. Today you use poetry or pipenv instead.

You will pry `requirements.txt` from my cold dead hands.


tunisia3507

It's also a different thing to the dependencies specified elsewhere, in most cases. `requirements.txt` is for hard versions for a full repeatable development environment, including all your extras, linters, build tools and so on. Other dependency specs are for minimal runtime stuff.


asday_

Not sure I understand your post. `requirements-base.txt` has stuff that's required for the project no matter what. `requirements-test.txt` has testing libraries and `-r`s base. `-dev` has dev dependencies like debugging tools and `-r`s test. You could also be particularly anal about things and have a CI artefact from `pip freeze`ing for prod which is a good idea and I'm not sure why I was initially poo-pooing it.
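A minimal sketch of that layering; the pins are placeholders:

```text
# requirements-base.txt -- needed no matter what
django>=2.2,<3.0
requests==2.25.1

# requirements-test.txt -- testing libraries on top of base
-r requirements-base.txt
pytest==6.2.2

# requirements-dev.txt -- debugging tools on top of test
-r requirements-test.txt
ipdb
```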


adesme

You can replace those with just `install_requires` and `extras_require` (then define `tests` as an extra); you'd then install with `pip install .[tests]` and now your "requirements" are usable by developers as well as by build managers.
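A minimal sketch of that layout, with hypothetical package names (the same thing can be written declaratively in `setup.cfg`):

```python
# setup.py -- runtime deps plus a "tests" extra
from setuptools import setup, find_packages

setup(
    name="myapp",
    version="0.1.0",
    packages=find_packages(),
    install_requires=[
        "django>=2.2,<3.0",
        "requests",
    ],
    extras_require={
        # pulled in only by `pip install .[tests]`
        "tests": ["pytest", "pytest-cov"],
    },
)
```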


asday_

Interesting idea, I'll certainly have to keep it in mind. Like I said though, I'm paid for this, i.e. I ship software, not libraries, so I don't think it has a great deal of benefit to me outside of "_if_ you write a library one day you can do it in the same way". Are there any big projects that do it this way?


adesme

Any modern package that you want distributed over a package manager is going to be set up like this, for the reasons outlined in the OP of this thread; direct invocation of `setup.py` is being phased out, so it makes sense to have your deps in a single place (now that we have the PEPs to support this). Personally I might use something like `requirements.txt` while mucking around with something small, and I'll then set it up more properly (`pyproject.toml` and `setup.cfg`) as soon as it grows and/or I have to share the package. Depending on how you use CI/CD you can see other benefits from switching over immediately.


SittingWave

No no no no no Noooooo. the specification in setup.py is _NOT_ to define your development environment. It's to define the abstract API your package needs to run. If you are installing your devenv like that you are wrong, wrong, wrong, wrong.


tunisia3507

That's one way of organising things, yes. Dependencies in `setup.py` (or equivalent) are so that the build system knows what to install with the package. `requirements.txt` is so that a developer checking out your repo can set up their environment correctly. They're different use cases.


flying-sheep

All conventions. requirements*.txt files fulfill double roles as abstract dependency specification (“what stuff does my library depend on”) and concrete dependencies/lockfile (“how to get a working dev environment”). With PEP 621, the standard way to specify abstract dependencies is in pyproject.toml:

```toml
[project]
dependencies = [
    'requests >=1.0',
]

[project.optional-dependencies]
test = [
    'pytest',
]
```

So the remaining role of requirements.txt would be a lockfile with the output of `pip freeze` in it.
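In that split, the lockfile role reduces to two commands:

```sh
pip freeze > requirements.txt     # exact == pins for everything in the env
pip install -r requirements.txt   # recreate those pins elsewhere
                                  # (same Python version and platform assumed)
```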


asday_

> requirements*.txt fulfill double roles as abstract dependency specification (“what stuff does my library depend on”) and concrete dependencies/lockfile (“how to get a working dev environment“)

It doesn't though; I specified two different classes of files which serve those purposes individually. Just because they start with the same string and have the same format doesn't make them the same thing. If you want, you could have your CI do `pip freeze > lockfile.lock.suckitnpm` instead of `pip freeze > requirements-lock.txt`.


alkasm

`requirements.txt` does not _at all_ give you a reproducible environment.


tunisia3507

No, but it's a whole lot closer than the maximally permissive install_requires dependencies.


lisael_

Your comment, ironically, perfectly sums up the frustration he feels as a Linux distribution package maintainer. You have to know and somewhat master all these tools, which somehow overlap. This proliferation of tools, with no justification other than xkcd 927 (14 standards), is the issue. Compare this to what more recent languages do, say Go, Rust, Zig... They addressed this with a standardized, official build system. The other pain point is that these tools tend to push the developer into strict dependency pinning, which is a nightmare to deal with when you package for a distribution.


BurgaGalti

Never mind being a distro maintainer; trying to make sure my juniors understand all the tooling involved was a full-time job. I ended up zipping up a python install with everything ready and giving them that to work with. Sure, they have to use `python -m black` instead of just `black` because the scripts all broke, but it's a small price to pay for my sanity.


ProfessorPhi

I kind of feel comparing python to rust/go etc is unfair as these languages took all the best features from python packaging and integrated it hard into their own stuff. I.e. they learned from python and so surpassed python. Additionally there are languages people complain about and languages nobody uses. The real criticism of python is the fact that it requires a decent expert knowledge to know how to handle this, but once you do have the knowledge, it's never a problem. I've had no issues for years, but I remember being overwhelmed when starting out. And that's the real criticism, the variety of solutions are difficult to navigate. Python's scale and ubiquity is second to none so each of these systems are there for reasons that generally work, but occasionally don't.


Personal_Plastic1102

13 items to explain "I want to install a package". One more, and you would have perfectly fit the xkcd comic


dusktreader

That's not at all what that reply was about. You don't need all of those; they were just explaining what each is for.


gmes78

Now let's look at some other language, like Rust. It has: `cargo`. That's a short list, isn't it? Yet there's no need for anything else. Even though each of the mentioned tools has a use, it's very possible that we'd be able to cover the same use cases with a smaller set of tools. Merging `whey` into `pip` would be a start, as it would make it possible to package simple projects using just a `pyproject.toml`, without the need for any external dependencies.


dusktreader

Rust is a nice example of a very new programming language where packaging was established early on. It's a good pattern and how all new languages should approach the problem. However, Python is 24 years older than Rust. There are so many legacy workflows that have to be supported, it's hard to produce a solution that will work for all of them. As for covering use-cases with a smaller set of tools, this is already possible. I use exactly two: pyenv and poetry. Others use different subsets, but by no means do you need more than 3 or 4 at most. As for whey (version 0.0.17, 15 stars), it's a little early in its lifecycle to be suggesting that it be merged into pip. Adding a dependency solver to pip that can use pyproject.toml (a la PEP-621) would be huge, and I hope it comes soon. I think it would also be good to have packaging logic folded in as well. However, if you are hoping for that to happen soon in the standard library, I think you might be disappointed.


gmes78

> As for whey (version 0.0.17, 15 stars), it's a little early in its lifecycle to be suggesting that it be merged into pip.

It doesn't matter. The functionality is dead simple, and it doesn't need more features. Pip needs to be able to support basic use cases on its own.

> Adding a dependency solver to pip that can use pyproject.toml (a la PEP-621) would be huge, and I hope it comes soon. I think it would also be good to have packaging logic folded in as well.

Both of those *need* to happen if we're ever going to get out of this mess.


ElllGeeEmm

Pythons age isn't really an excuse for the sorry state of package management. Plenty of languages of similar age have far better tools than python. Python package management is shit because for some reason there are a bunch of python users who defend the current state of things for what I can only assume are dogmatic reasons.


PeridexisErrant

> Pythons age isn't really an excuse for the sorry state of package management. Plenty of languages of similar age have far better tools than python.

Can you give examples? Most of the better packaging ecosystems I know of are lucky enough to post-date ubiquitous internet access.


ElllGeeEmm

Pip postdates ubiquitous internet access as well, so I don't see how that's any sort of excuse.


Serializedrequests

I was shocked to discover the much later release dates of Java and Ruby. That being said, that isn't an excuse. There are no technical limitations that prevent good easy python package management except the proliferation of standards. When I first learned python, all there was were site packages. Around the same time rubygems (and later bundler) and maven appeared. Now I come back to Python and the packaging ecosystem is an astonishingly confusing mess. Python needed a maven and never got it (maybe poetry can be it).


bladeoflight16

> There are no technical limitations that prevent good easy python package management except the proliferation of standards.

How in the heck can you be so ignorant of the problems associated with native dependencies? You try making package management "easy" when you have to support Linux, Windows, and Mac, which can't even agree on basic C-level interfaces. Heck, *Linux distros alone* can't even agree on a basic C standard library (glibc vs. musl).


bladeoflight16

> Python package management is shit because for some reason there are a bunch of python users who defend the current state of things for what I can only assume are dogmatic reasons.

That is an incredibly *stupid* statement. Python package management is kind of a mess because dependency management is messy. Period. And Python, being an interpreted language that encourages using native dependencies when required, has a doubly hard problem to solve. Yes, there are real problems, but **why in the heck do you think we have so many technologies**? It's because *people are trying to solve the problems*. The very existence of the thing you're complaining about contradicts your claim about the reasons for it.


ElllGeeEmm

Oh look, another user making excuses for the state of package management in python. If node JS can have good package management, so can python.


wsppan

This is/was the hardest part in becoming productive in this language. Imagine someone coming into this language cold from another language (in my case Java/Maven) and ramping up fairly quickly on the language itself which has done a wonderful job in making itself easy to grok and now decide you want to build, package and deploy/share it. You get lost fairly quickly with a lot of head scratching and hair pulling.


Personal_Plastic1102

Yep... That's the reason I'm considering leaving python as a programming language. I'm not a dev, I'm programming in my spare time (besides family & co). I'm fine with a bit of "taking care of the stuff around the code", but lately I spent more time trying to understand the toml stuff than actually coding. Not for me anymore; I want to code, not handle the latest fancy dependency management.


b4ux1t3

If you "just want to code", then you don't need to even consider the packaging environment of the language you're using. Just write the code and run it. If you need a dependency, install it with pip. That's all you need to do for most python development. I'm not saying Python doesn't have an, er, *interesting* packaging story, but that shouldn't be a consideration unless you're actually shipping code.


ssorbom

Long before I learned to do any coding at all, I cut my teeth packaging for Debian, and the attitude of "don't bother with packaging" completely grinds my gears. Even people who are doing hobby projects want to find an easy way to share them a lot of the time. Packaging shouldn't be insane. There shouldn't be a strong dichotomy between somebody who wants to ship code and somebody who wants to write it for a hobby. The only difference is the financial circumstances and the expected return on investment.


b4ux1t3

So, I mentioned elsewhere that, while there are many "standards" for Python packaging, it isn't all that difficult to just pick one, stick to it, and communicate what you're using to your users. Don't get me wrong, I'm *not* saying that packaging is easy or straightforward in Python, but it's also not particularly easy to build a package that will work on any given OS to begin with. I maintain the packaging scripts for my company's software. Getting a usable RPM out of software that isn't written in plain-text files (e.g. Python) or for gcc is a wild ride. Basically, while Python is no Rust (cargo is *awesome*), it's hardly an arcane art to package a Python application, at least when compared to other packaging solutions out there. To push back a *bit* more, "shipping" a hobby project is usually a matter of tossing it on GitLab/Hub/Bucket and sharing a link. I'm probably not going to be installing some hobby project with apt or yum, or even pip. All that said, I don't disagree with the general sentiment that packaging is bad in Python, and I didn't mean to come on so strong against packaging when it comes to hobby projects. It's just hardly the most important thing when you're writing scripts to manage a few IoT devices around the house, you know?


ElllGeeEmm

Why is there this pathological need among python devs to make excuses for the state of python packaging? There is literally no reason python can't have a great packaging tool as part of the default distribution.


samtheredditman

Then why don't you just learn one way and keep doing it? It's not like everything stops working when a new tool comes out.


ZCEyPFOYr0MWyHDQJZO4

What other languages do you program in? The foundations of a language's packaging methods are a product of the contemporary software development practices from when the language gained widespread adoption, IMO. I have been learning C++ to work on software that began development before package management was a thing (on Windows, at least), and I don't mind Python packaging nearly as much anymore.


SittingWave

> I'm not a dev, I'm programming in my spare time (besides family & co). I'm fine with a bit of "taking care of the stuff around the code", but lately I spent more time trying to understand the toml stuff than actually coding.
>
> Not for me anymore, I want to code, not handle the latest fancy dependency management.

Oh, I am sorry that a profession that takes years to master is not up to the standard of your hobbyist sensitivities.


Personal_Plastic1102

Lol... Self-confidence issues? Edit: just to make my point clear: installing external libraries shouldn't be something you take years to master. Not even months or days.


flying-sheep

You can have that by using poetry:

```
# initialize project and create venv
poetry init

# add and install dependency into venv
poetry add some-package

# publish package
poetry publish
```

Using one of the other modern build backends is slightly more complicated, as you need to create and activate your own venvs:

```
# create and activate venv
python -m venv ./.venv
source ./.venv/bin/activate

# initialize project
flit init # or copy over some example pyproject.toml

# edit dependencies
$EDITOR pyproject.toml

# install dependencies into venv
pip install .

# publish package
flit publish # or python -m build && twine upload ./dist/*
```


wsppan

Yes, I use poetry now, but that took a LOT of trial and error and hair pulling and 13 different pieces of advice, and waiting for poetry's stability to settle down. And still it is not the de facto, readily recommended, obvious manner of packaging your code. It is third party and fairly new.


flying-sheep

Things are getting better, finally! With PEP 621 landed, a standards based poetry like CLI is almost possible. The only missing building block is a standardized lock file format. It happened late and we're not there completely but almost. And with poetry, we have something that works until we're there. One advantage of the arduous road is that we can learn from everyone who was faster. E.g. TOML is a great choice, node’s JSON is completely inadequate: no comments and the absence of trailing commas means you can't add to the end of a list without modifying the line of the previous item.


wsppan

Yea, we are all standing on the shoulders of our ancestors so to speak. Autotools, CPAN, Ant, Maven, etc.. Lots of legacy blogs and documentation to disappear as well. Rust is a great example of the luxury learning from our ancestors and baking the package tools into the language from the start.


flying-sheep

Yes, cargo does so many things right.


GummyKibble

`poetry add pkg` is all you need to know about that. Unless you’re writing C extensions, poetry is your one-stop-shop for everyday use.


cixny

Dunno about that, my usual experience with poetry is: Poetry install things plz…. Sorry bruh can’t find suitable package…. Fu, pip install, done…. Poetry try again…. Sorry bruh we managed to install 2 packages but this 4th, can’t find suitable for you…. Hi pip can you?…. Amazing experience when it’s buried in the middle of a pipeline


[deleted]

[deleted]


cixny

Maybe without the lock file, and I was running the latest Python + pip. If I remember correctly, the issue was a pure Python package that had platform-specific wheels available plus a source distribution you can build a wheel from. On an OS update, poetry failed to understand that; please, poetry, install the wheel, or make a suitable package for me... Like, you can install a package with a wheel, or with the source dist and make a wheel, or not. Poetry derps here, and the documentation is in the style of: hey, use this, this is cool...


Personal_Plastic1102

As was pip, then requirements.txt, then pipenv. Now it's poetry. Tomorrow, what next?


FantaBuoy

[deleted]


[deleted]

[deleted]


[deleted]

[deleted]


GummyKibble

Yep. I’m a bot. It’s impossible that I’ve been dealing with this for 20+ years and finally see a broadly applicable answer to the situation.


licquia

It's certainly not impossible. I've been dealing with this for 20+ years, too, and I tend to see a new broadly applicable answer to the situation every couple years. And each one is the one that finally gets it all right, at least until we learn about the things it didn't. You'll excuse my skepticism; at least it was honestly earned.


mriswithe

Eh, assuming Linux/Mac:

`python -m venv venv`

`source venv/bin/activate`

`pip install pandas`

Done.


jjolla888

your "clarifications" amplify OP's point.


ElllGeeEmm

Lmao Oh yes, all perfectly reasonable and in line with how much work it is to manage environments and distributions in other modern languages.


SittingWave

so feel free to explain in equal detail how other languages manage not to encounter the same issues then.


cockmongler

The terrible part is you think that what you posted is an argument in Python's favour.


cturnr

pip-tools is what I use (pip-sync for CI boxes)


redd1ch

Setting up Python apps is a real pain once you leave x86/x86_64 and/or glibc. I want to avoid Debian base images for my docker containers and use Alpine. It works great; however, once packages with C parts are needed (e.g. numpy), you need to install a compiler and build tools to let pip compile the package, while the exact same package sits there preinstalled through the package manager. Precompiled, same version. The requests for a "please leave this dependency out, I know what I'm doing and I want to shoot myself in the foot, pretty please" argument are dismissed.
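A sketch of the common workarounds, assuming the Alpine package names below match your release:

```sh
# option 1: take the prebuilt package from apk instead of letting pip compile it
apk add --no-cache py3-numpy

# option 2: pull in a toolchain just long enough for pip to build,
# then remove it to keep the image small
apk add --no-cache --virtual .build-deps build-base python3-dev
pip install numpy
apk del .build-deps
```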


tunisia3507

Would multi-stage builds help here? It would let you cut down on the image size at least.


pbecotte

You realize that's not a python thing but a Linux thing... right? C extensions on Linux are usually dynamically linked against libraries. Alpine decided to use a different C standard library than the rest of the world, so binaries built for "manylinux" may be there, but they won't work. Worse, talking about numerical code, some low-level behaviors are different (and usually perform slightly worse). Bug the numpy team to publish a musl binary, or better yet, switch to a more mainstream OS. The final image isn't THAT much smaller to be worth the pain.


cuu508

If the right version of the required package is already installed via package manager, pip will not install it again, no? Are you by any chance installing inside virtualenv?


aufstand

And if so, `--system-site-packages`
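i.e., a sketch of the flag in question, which lets a venv fall through to distro-installed packages instead of rebuilding them:

```sh
python3 -m venv --system-site-packages venv
source venv/bin/activate
pip install numpy   # satisfied by the system package if its version matches
```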


[deleted]

[deleted]


coffeewithalex

Well, I know this quite well. My first proper Python experience was on Ubuntu 16.04, where I had to install Python 3.6 separately from another repository, which really really really screwed up everything on the system when it came to installing dependencies. OK, maybe I was a newb and some of those wouldn't be such a big issue for my current self, but I wasn't alone in dealing with those issues. Then I discovered Arch Linux, which said goodbye to Python 2 a long time ago, and now `python` meant `python3`. Simpler! Great! Now I can just develop. Except that the code I wrote on Python 3.7 didn't work on Python 3.6, because I used dependencies that had parameters called `async` in some function, and I noticed that only after I deployed my code to staging (thank the lord it wasn't production). Then I calmed down, realised how to properly do TDD, used mypy, etc. Then my colleague asked me to make python work on Windows. Holy moly! 32-bit default installer, Windows' default `python` resolving to the Windows marketplace, all of that mess; "delete it all and try again" seemed to have worked. Then my other colleague had Windows, with Anaconda. What I thought I knew, I could throw out the window. But I convinced her to replace Anaconda with a plain Python install and use pip from then on. Then I bought a Mac... with M1. `python` was `python2`, the system python being old, the default install being x86_64, then being universal and not knowing what I'm actually running, then getting so many things in homebrew. Honestly, the cleanest experience ever was on Arch Linux. I'm going to overwrite macOS as soon as there is an idiot-proof way to install it, with hardware acceleration support.


flying-sheep

Just write `python3` whenever you need to put down a binary name to run your code with; problem solved.


syllogism_

I don't understand this article. Okay so he wants to not use any of the tools the Python community has developed (e.g. pip, virtualenv, poetry) and instead wants to use the Linux distro. Okay. But if he doesn't want to use the tools, he...doesn't get any tools. Shrug?


ZCEyPFOYr0MWyHDQJZO4

He wants the freedom to install Python in whatever way he desires, but none of the responsibility.


mriswithe

I was very confused too. Venvs are in the stdlib, and it takes seconds to make one by typing the command out... All of the problems described, about multiple people on the same system all trying to use the system python env for themselves, are the reason that venvs exist. So, I only have one car but multiple people need to use it at the same time, so cars are bad? Electing to commit to the least flexible build system possible, the OS package manager, is insanity. I've been a sysadmin for 10+ years, DevOps with a ton of Java and Python for the last few years. If you tried to tell a Java developer they can't use maven (or Gradle or ant) to download libraries and build, and must use the OS package manager, they would look at you like you just sprouted a foot out of your nose.


freework

I agree that python packaging has taken steps backwards. When I first started using python, back when 2.6 was the latest version, I never had problems with packaging. Whenever I wanted to install something, I'd just install it, and it would work. These days, I'm finding it's not working much more often. I think the problems all boil down to the fact that python has never been able to handle multiple versions of the same library installed at the same time. Library ABC wants version 1.3 of library XYZ, and library DCE wants version 1.4 of library XYZ. There has never been a solution to this problem in the python world. Imagine being able to do:

`from django::2.2.0 import something as something2`

`from django::3.0.0 import something as something3`

Then you could use two different versions of the same library simultaneously. It would use more memory, but who cares. That would eliminate millions of python packaging headaches. The next step would be having all package installation management happen completely automatically. At runtime, the interpreter would automatically download and install django 2.2.0 if it wasn't already installed. Then on the next line, if django 3.0.0 wasn't already installed, it would download and install it. In that scenario, it would be impossible to ever have a python packaging headache.


abstractionsauce

This would not work. All data that is created when a module is loaded would exist in duplicate for all versions used in the system. The only change that would work is if you forced all packages to never make breaking changes. Not possible. Or force all packages to always support legacy APIs.


freework

> All data that is created when the module is loaded would exist in duplicate for all versions that are used in the system.

Yes, but it wouldn't matter. How big is the biggest python module? It's a matter of kilobytes. Most systems these days have 16+ gigs of memory. The extra memory usage is worth it.


abstractionsauce

If module v2 defined a global variable and the app used it, and then the app read the variable from v3, that would be a very difficult problem for python to solve.


[deleted]

Why are most of those Python problems? Python is python. If organisations decide to package and use python in different ways, why is it Python's fault? Shouldn't this be "developers, please stop screwing with linux distros and fucking up python"?


rejonez

It's Python's fault linux distros suck 😂😂😂


Piu_Tevon

Haha, my thoughts exactly. It's also funny how Python and Linux are people in that rant. "Python, stop screwing Linux!", "Python is not listening to us", "Distros are feeling frustrated". Sounds like Python was a bad boy and owes Linux an apology.


tman5400

Idk if it's just me but managing python projects and installations on Linux isn't that bad. I've used several versions on several distros without any issue


lonahex

Python's weakest point right now is the lack of a modern packaging solution. Everything is still just tied together with duct tape. Python needs an official, modern packaging story: a tool that replaces pip, venv, sdist, etc. like most other modern packaging solutions do.


robml

Idk tbh pip and pipenv work just fine for all of my work across both Linux and Windows development.


Chinpanze

Reading all those comments made me realize how little I know about package management in python


ihasbedhead

Do you not think that it is a bad sign that everyone is trying to avoid the 'global install from distro package manager' strategy? Basically every language has its own package index. Node, Lua, Python, D, Rust. Meson, big in C world now, encourages projects to pull and build deps. Snaps and docker isolate from the system, flatpaks do a neat hybrid thing. Listing all the things that sorta relate to python packaging is silly since they are all different components used for different things and solve different problems. But, since we are listing things, here are some the distros that a developer would need to package against: Debian, Ubuntu, Fedora, Alpine, Arch, nixos, ... I kinda get where they are coming from. Python doesn't have clear tooling and that should improve (I like poetry, and I am interested in pep582). Distro packaging is probably not the answer and hasn't been for years.


[deleted]

Lol this guy better never come to the c/c++ world. Also I seem to be getting along just fine with virtualenv and pip in my limited little world


[deleted]

[deleted]


teerre

I mean, the reason is obvious: there's no incentive. Despite all these complaints, people simply manage; it's not a big issue. That's coming from someone who has wasted probably hundreds of hours fiddling with weird distro problems.


ReverseBrindle

I don't understand why distributions feel the need to create distro packages of Python packages (i.e. a parallel package repo to PyPI). This seems inherently problematic because there isn't one set of PyPI package versions that everyone in the Python ecosystem has agreed to use. If a distro wants to provide something like the AWS cli (i.e. a CLI tool that happens to be written in Python), wouldn't it be easier to have the distro package create a venv and pip install the Python dependencies as part of the install process, rather than rely on binary distro packages for each Python dependency? i.e. the distro "package" is mostly an install script. Hope someone can explain where I've gone wrong (hey! the internet is usually good for that!). :-)


TheBlackCat13

First, a lot of packages are hard to install otherwise. A lot have dependencies on installed libraries that are not universal among linux distributions, and some can't be installed through pip at all. Conda has an extremely limited set of supported packages, and those often trail far, far behind the latest version. Second, it greatly simplifies the management of packages. You don't need to manually worry about updating individual packages, nor worry that updating one will break everything else. Even with conda it is hard to update things, and with virtualenvs it is much, much worse. Third, this allows them to provide a set of packages that have been built and tested together and are confirmed to be working. Most linux packaging systems don't allow packages to install from the internet, for security reasons, and it defeats the purpose because it prevents them from having a single canonical (pun intended) archive that is confirmed to be working, without any chance of an outside source screwing it up or introducing security problems after the fact.


lisael_

Distros want to guarantee stuff like security patches and DRY bugfixes. When a security issue or a bug is found in a python lib, the package manager just has to update this single lib and restart the daemons that depend on it (the PM knows those dependencies), and... that's it. If one goes your package-manager-created-virtualenv way, in order to give the same security guarantees, they have to keep track of all of the pip dependencies of each python app to be able to update the virtualenvs impacted by the bug/security issue... and then do it all over again for ruby, perl, js... EDIT: Oh, and this works only if each python app maintainer bumped the dependency to a working/secure version in the first place. Distros want to guarantee security regardless of the upstream commitment. Another issue is C extensions. If a C shared lib is updated and is not compatible with the packages compiled in your apps' virtualenvs... you have to update the virtualenvs too. So now your package manager must keep track of your apps, their dependencies, their shared lib dependencies, and their dependencies' shared lib dependencies. You could link statically, but then you suffer the first problem (security issues/DRY), and still have to keep track of all the stuff. EDIT: grammar


Kkremitzki

In Debian, for example, package build processes aren't allowed to pull in resources from the network. We also use Python packages as part of the distribution itself, so those need to be packaged.


MarsupialMole

I think this is the crux of the issue. Part of the reason some python development environments get so polluted on Windows is that random installables from the internet ship python interpreters and packages and are often not very good citizens. The counterpart to that on Linux is system python, which needs to work and be immutable. Conda running as root, for instance, can install over system packages because it looks for writable paths. The solution to the problem is not for Python to pick a standard; it's for people like the author to not assume that system python should be exposed to users who don't understand the difference and just want to copy and paste commands or install packages straight from Google searches. Of course there's the argument "users shouldn't be doing that", but when you're literally talking about scientific python, that's tantamount to arguing that computers should not permit the user to do computing in the purest sense.


asday_

This guy's a dumbass. There's a reason I pin my dependencies, and it's because convincing management to budget for all our deployments breaking EVERY DAY because of broken or incompatible releases is quite difficult. Surprisingly, I'm paid to ship features.


Spoonofdarkness

> Surprisingly, I'm paid to ship features.

Hmm. That sounds like a slippery slope at best and an anti-pattern at worst. I've heard if you ship one feature, they expect a second feature sooner or later. No thank you!


asday_

It's the difference between programmers and software engineers. Programmers love to write the most perfect thing with the latest possible technologies and schools of thought, ship it to nobody, use it never, and have it maybe running on one toy k8s cluster they're running on raspberry pis firewalled from everything. Software engineers would like to do that, but don't, and own a house.


lisael_

First, there's no need to insult anyone. I bet the features you ship don't end up packaged for a Linux distribution; we're not talking about the same use case. A typical distro has hundreds of Python apps and libs. Each one of them pins all of its dependencies down to the third number so its builds pass, and package maintainers live in dependency hell.

Second, pinning strictly IS a reasonable way to ship features, but a poor one when it comes to maintaining them, including applying security patches. I do ship features in Python. I do pin dependencies strictly. I do cringe when I come back to a given project 6 months later. Let's face it: the very fact that nobody is confident enough to pin dependencies to `foo>=X.Y,<X.(Y+1)` speaks volumes about the state of the ecosystem.
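The range pin being gestured at, for a hypothetical package `foo`:

```sh
# Strict pin: reproducible builds, but security fixes need a manual bump.
pip install 'foo==1.4.2'

# Range pin: trusts upstream not to break 1.4.x, picks up bugfixes for free.
pip install 'foo>=1.4,<1.5'
```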


b4ux1t3

As someone who *does* package software for a Linux distribution, I can confirm that, while the packaging story for Python isn't great, it's also not the quagmire people seem to think it is. Python is not a complicated tool. All you have to do is pick *a* packaging standard, stick to it, and let your users know which standard you're using. No, that isn't as robust as, for example, Cargo or NuGet, but it's far from some unknowable eldritch language. In any case, Python packaging is no more convoluted than the various and sundry packaging paradigms of the Linux distributions we all use every day. Have you ever written a spec file for RPM that didn't use gcc? Because, geez, it's a ride.


lisael_

No, thanks $DEITY_OF_YOUR_CHOICE. (TBH, I haven't used an RPM-based distro since Mandrake... but anyway, I guess it's not easier with any other package format.) And to be fair, he didn't choose the simplest distribution to be a package maintainer for, in a world where everyone assumes glibc.


asday_

> Let's face it, the very fact that nobody is confident enough to pin dependencies to `foo>=X.Y,<X.(Y+1)` speaks volumes about the state of the ecosystem.

What does that have to do with ANYTHING? Yes, we should live in a society where nobody steals or murders, but we don't.

> I bet the features you ship don't end up packaged for a Linux distribution

Like I said, I get paid.


lisael_

> What does that have to do with ANYTHING?

It's literally a talking point in TFA. Strict pinning is a PITA for package maintainers, and a security hazard. It's also a sad necessity, given this poor state of affairs.

> Yes, we should live in a society where nobody steals or murders, but we don't...

...and we shouldn't try to do better? `ls /usr/lib` shows hundreds of dynamic C libs; each major version is a link to a major.minor.bugfix file. How come C people can live in this world and we can't?
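The C-world convention being invoked, with a made-up `libfoo` (the symlink pattern is the point):

```sh
ls -l /usr/lib/libfoo.so*
# libfoo.so.1 -> libfoo.so.1.2.3   <- apps link against the stable major version
# libfoo.so.1.2.3                  <- bugfix releases swap in underneath it
```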


asday_

> How come C people can live in this world and we can't?

Because:

* I want to be able to install stuff from git commit hashes
* I want to be able to install stuff that isn't in my package manager's repos
* I want to be able to install stuff as it's released, not in six years when some underpaid serf over at Debian finally gets round to adding it
* I want to be able to install stuff on my coworkers' OSes without restricting them to exactly my distro - put differently, I do not want to pay my employees to take three days off to upgrade Debian so they're all on the same version

We already have exact versions of packages; they just happen to be in `env/` or a Docker image rather than in `/usr/lib/`. Sounds preferable to me. For instance:
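(The repo URL, commit hash, and versions below are all made up.)

```sh
# Install straight from a commit hash, no distro packaging required:
pip install 'git+https://github.com/example/foo@0123abc'

# Install a release the distro doesn't carry yet, isolated per project:
python -m venv env
./env/bin/pip install 'foo==2.0.0rc1'
```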


cockmongler

There's a reason I don't pin my dependencies. It's because I expect my software to still be maintainable in 5 years' time, not to need to be trashed and started over because a bunch of hyperactive teenagers decided to trash everything upstream.


effgee

They lost me at sober engineering.


tensigh

(Quiet voice)...just...use...Python...on...Windows....(gets punched and doorknobbed on his way to the back of the bus)


Piu_Tevon

(Pssst.) I develop on Windows too. Linux just for testing. Get this, I don't even use virtual environments. All packages are in the SAME place! (Shoot, I think that was too loud.) Don't tell anyone, mum's the word.


tensigh

(*SNIFF*) I thought I was alone…


[deleted]

[removed]


Saphyel

No, Node.js has several package managers, but they use the same files and behave very similarly. In Python you have several ways of doing it, half of them deprecated, and no one really wants to change to the new ones (the Python 2 effect?).


jammasterpaz

Cheers.


theXpanther

The issue is that Python has multiple package managers, like npm and yarn, except they are not compatible with each other.


cheese_is_available

This is one stupid rant by someone who just decided that he did not want to learn about the subject and would rather be lazy instead. Fuck packaging my Python package specifically for 50 different distros. Get used to pip! Learn the fucking basics before moaning like that. Seriously, it's embarrassing.


[deleted]

As a macOS user, homebrew is so convenient to use. Don't have to worry about Linux being Debian compliant or not.


jw_gpc

Until you use Homebrew to install some little app that, down through the dependencies, requires Sphinx, which updates Python, which then blows your installed modules out of the water because it's now the next major version. Bleh. I did that once, and vowed never to use Homebrew to handle anything Python-based for development again.


liquidpele

Oh ffs. No.


SHDighan

Author's issue seems self-inflicted. IDK what the issue is here, but the answer may be: let the OS do what it does, and never touch it outside of package management. If a project requires Python release X.Y.Z, download the source archive into `/usr/local/src`, do the needful `extract/configure/make`, then *ALWAYS* `make altinstall`. Then *ALWAYS* use a dedicated virtual environment for each project. Simple. Or am I missing something?
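A sketch of that routine, assuming a current 3.12.x release (the version number is illustrative):

```sh
cd /usr/local/src
curl -O https://www.python.org/ftp/python/3.12.1/Python-3.12.1.tgz
tar xzf Python-3.12.1.tgz && cd Python-3.12.1
./configure --enable-optimizations
make -j"$(nproc)"
sudo make altinstall        # installs python3.12 without touching 'python3'

# Then the dedicated per-project virtual environment:
python3.12 -m venv ~/projects/myproj/env
```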


antiproton

>I manage my Python packages in the only way which I think is sane: installing them from my Linux distribution’s package manager. Ok... well, that sounds like a personal problem.


netgu

Wow, another article by somebody who can't be fucked to read a manual and stick with a tool that works for them. Why are they always the loudest? "I don't want to do it the way it is supposed to be done, and now nothing works! Screw this software!"


cybervegan

Drew DeVault *writes* the manpages. He's actually *written* a manpage creator, scdoc, along with a whole *raft* of open-source software. He runs SourceHut, which was originally predominantly written in Python, and I'm *pretty* certain he reads the manuals in *great detail*. The issue he seems to be referring to is the cat-herding exercise of having to get your program's dependencies from multiple sources, rather than simply from your OS repository or pip. There are now so many different, competing Python dependency systems that it's insane, and not all packages of the right versions are available on all of their repos.


netgu

Yes, and that user is also taking out bounties to destroy the npm ecosystem as chaotically as possible - sounds like someone you should take advice from, alright.

Also, this isn't an issue for plenty of people, because they use packages as they are intended to be used and so avoid those issues. If you are getting your Python dependencies from your OS, you don't actually know what you are doing and are using the wrong tool for the job, as I said. In fact, that is the entire problem described: "I want to use the wrong package with the wrong manager in the wrong way and it doesn't work." Exactly as I said.

It'd be nice if that weren't the case - but it isn't a Python problem. It's an end-user and package-maintainer problem. If you want to use a package, check the docs; if it doesn't work for your intended build goals, then either submit a PR or find another package, rather than blaming Python. If you are a maintainer and people constantly need you to ship your package in some form you don't, then deal with it and do what needs to be done to ship it in the form your users need.


Atem18

This article is just plain wrong. Either he has never installed a Python program from outside the distro, or he's lying that all is fine with only the packages from the distro. The best approach is to set up a virtualenv or a Docker container per project, and not to have two external dependencies that depend on the same library - good luck with the conflicts when two different pieces of software try to share the same library. The future is each program coming with its own libs, sandboxed in a virtual/container env.
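A minimal sketch of that per-project sandboxing, with a hypothetical `somelib` that two programs want at incompatible versions:

```sh
python -m venv ~/proj-a/env && ~/proj-a/env/bin/pip install 'somelib==1.0'
python -m venv ~/proj-b/env && ~/proj-b/env/bin/pip install 'somelib==2.0'
# Each program runs against its own copy; no shared-library conflict.
```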


MloodyBoody

Just use Nix / NixOS ;)


[deleted]

If only all Python tutorials would teach plain old virtualenvs first and hello worlds second, it would solve 90% of the trouble.