npm is objectively worse. Base pip packages aren't getting hijacked.
Maybe I’m misremembering, but didn’t pip have its own security concerns earlier this year?
I believe that was just name squatting.
It’s less the name squatting and more pip not supporting a certain PyPI resolution order: https://github.com/pypa/pip/issues/8606
For example, I have A, B and C in my requirements.txt but I want to install C from my own private PyPI. Everything works fine until someone uploads a package named C to the public PyPI, and then suddenly I’m not installing my private package anymore.
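Concretely, the failure mode looks something like this (the private index URL below is made up): you point pip at your own index with
pip install -r requirements.txt --extra-index-url https://pypi.internal.example/simple
but pip treats the extra index as a peer of public PyPI rather than giving it priority, so whichever index serves the "better" version of C is the one you end up installing.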
Yeah, I remember now. The name squatting was from people putting malicious packages under misspelled names of well-known packages, like "requets" instead of requests.
That's not a controversial opinion. I'd say it's worse than pip. At least pip doesn't put nag messages on the console or fill up your hard drive with half a gigabyte of small files. OP is confused.
npm is so good that there are at least 3 alternatives, and every package's install instructions use a different one.
About the only good thing about npm is that I can use one of the superior alternatives. Using npm is almost always a headache as soon as you start working with a decent number of packages.
In my experience npm is not great, but it does work most of the time. I just tried installing a bunch of stuff using pip and NONE of them worked. Python is backwards-compatibility hell: Python 2 vs 3, dependencies missing, important libraries being forked and not working anymore. If the official installation instructions are 'pip install X' and it doesn't work, then what's the point?
npm has A LOT of issues, but generally when I do 'npm i' it installs things and they work.
But the main point is that cargo is just amazing :)
P.S. Never used Ruby.
Well there’s your problem lol.
Don’t use 2 for anything, it’s been “dead” for almost 4 years.
The problem is that Python 2, and modules for 2, still tend to worm their way in somehow. I always use python3 -m pip, because I never trust that "pip" alone is going to be Python 3's pip, and I think that's what the people who have lots of trouble with pip aren't doing.
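A quick sanity check shows why that habit matters:
pip --version              # reports whichever interpreter the pip on your PATH belongs to
python3 -m pip --version   # always the pip bundled with your python3
If the two disagree, a bare "pip install" is putting packages somewhere you're not looking.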
Valid point.
I force everything to 3 and don’t accept any 2.
And in fairness, there were some moderate breaking changes between 3.6 and 3.8.
It would be weird to have python2-pip installed if you don't have python2 installed; pip should be python2-pip by default on most systems.
I… dunno, are you suggesting that sometimes pip2 is the default and that that somehow mixes 2 and 3 modules? Pip 2 should install into Python 2's directory and pip 3 into Python 3's. The only times I have had messy Python environments are when I mix pipenv, conda and/or pip, and when people install into the main Python with specific versioning. Use a virtual env, for God's sake; that's what npm does.
Ahh the blissful ignorance of not having to manage tech debt
No, I just don’t ignore it for 4 years.
The bliss is in having management that actually DOES manage the debt instead of ignoring it until it shits the bed
I don't think it's fair to blame pip for some ancient abandoned packages you tried to use.
The issues I had:
packages installing but not working due to missing dependencies
packages installing but not working due to broken dependencies
packages not building and failing with obscure errors
one package was abandoned and using Python 2.7
If a 'pip install X' completes successfully but X doesn't work, it's on pip. And when it fails it could tell you why. Cargo does.
packages installing but not working due to missing dependencies
This is the fault of the package author/maintainer.
packages installing but not working due to broken dependencies
Sometimes the fault of the package author/maintainer. Sometimes this is the fault of a different package you're also trying to use in tandem. Ultimately this is a problem with the shared-library approach Python takes, and it can be 'solved' by vendoring within your own package.
packages not building and failing with obscure errors
Assuming the package is good, this is a problem with your build system. It's like complaining a makefile won't run because your system doesn't have gcc installed.
one package was abandoned and using Python 2.7
Unfortunately there's a ton of this kind of stuff. I suppose you can blame PyPI for this; they should have some kind of warning for essentially abandoned projects.
Hmm, I personally haven't seen that kind of issue myself. I also tend not to use random packages from random authors, though, so that might help.
The main issue with JS is that every 6 months someone comes up with the next great tool that misses half the basic features and dies after 6 months when someone comes up with the next great tool. But at least the old, tested solution still works, unlike in Python where the main goal seems to be breaking backwards compatibility as often as possible.
I’m still rocking the fuck out of PHP (8) 😘
pnpm is already very well established, and it's not completely different from npm either, so they didn't have to reinvent the wheel; they just made some things much better.
Python, on the other hand, is just a mess: a thousand tools, all with some overlap in what they're trying to achieve, because they didn't have the balls to make pip an all-in-one solution. There are 2 great alternatives that do almost everything, though: poetry and pdm. I read a spot-on analysis in this article; maybe it can help you make a choice.
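For what it's worth, the day-to-day poetry workflow is only a few commands (project and package names here are just placeholders):
poetry new myproject    # scaffolds a pyproject.toml
poetry add requests     # records the dependency and updates poetry.lock
poetry install          # recreates the environment from the lock file
pdm's commands are very similar, which is part of why the choice between the two mostly comes down to taste.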
This is great, thanks. Will definitely read even though I don't do much work in python. It's good to know how NOT to do things.
lol what. Node does a new major release every six months. And you're shit talking python? There's probably never going to be another major version change, and minor versions have several years of support
In like 10 years of python development I don't think I've ever been mad about breaking changes in python.
I'd personally take PECL over npm and I loathe PECL.
Composer, though, is excellent.
Sorry but nah. At my last job we had a couple of different Python microservices. There was pipenv, venv, virtualenv, poetry, Pipfile.lock, requirements.txt (which is only the top level???), just pure madness.
Apparently all this shit is needed because python wants to install shit globally by default? Are you kidding?
Well, we also had a couple node microservices. Here's how it went: npm install. Done.
Afraid you fucked something and want a clean environment? Here's how you do it with node: delete node_modules/. Done.
Want a clean python env? Uhhhhhhhh use docker I guess? Maybe try reinstalling Python using homebrew? (real actual answers from the python devs who set these up)
Well what's currently installed? ls node_modules, or use npm ls if you want to be fancy.
In python land? Uhhhhhh
Let's update some dep–WHY AREN'T PYTHON PACKAGES USING SEMVER
So yeah, npm may do some stuff wrong, but it seems like it does way more shit right. Granted I didn't really put in the effort to figure out all this python shit, but the people who did still didn't have good answers. And npm is just straightforward and "works".
"But JS projects pull in SOOOO many dependencies" Oh boohoo, you have a 1TB SSD anyway.
Apparently all this shit is needed because python wants to install shit globally by default?
None of that was needed. It was just used because nobody at your company enforced a single standard for developing your product.
Afraid you fucked something and want a clean environment? Here's how you do it with node: delete node_modules/. Done.
rm -rf venv/. Done.
Want a clean python env? Uhhhhhhhh use docker I guess?
python -m venv venv
Well what's currently installed? ls node_modules, or use npm ls if you want to be fancy. In python land? Uhhhhhh
pip freeze. pip list if you want it formatted.
Let's update some dep–WHY AREN'T PYTHON PACKAGES USING SEMVER
Janky, legacy python packages will have random versioning schemes. If a dependency you're using doesn't follow semver, I would question why you're using it and seek out an actively maintained alternative.
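Put together, the whole npm-equivalent ritual is about four lines (assuming a requirements.txt at the repo root):
python3 -m venv venv && source venv/bin/activate
pip install -r requirements.txt
pip list            # what's currently installed
rm -rf venv/        # the node_modules nuke, when you want a clean slate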
I'm honestly surprised that someone using Python professionally appears to not know anything about how pip/venv work.
The points you think you are making here are just very clearly showing that you need to rtfm…
More like rtfms. I really didn't feel like learning 20 different tools for repos my team didn't touch very often.
I really don't see the hassle… just pick one (e.g. pip/venv) and learn it in like half a day. It took college student me literally a couple hours to figure out how I could distribute a package to my peers that included compiled C++ code using pypi. The hardest part was figuring out how to cross compile the C++ lib. If you think it's that hard to understand I really don't know what to tell you…
Sure, for a new project. But when inheriting code I'm not in a position to pick.
The point is that the state of python package managers is a hot fucking mess compared to npm. Claiming that "npm is just as bad" (or worse) honestly seems ridiculous to me.
(And isn't pip/venv the requirements.txt one? Completely flat, no way to discern the difference between direct dependencies and sub-dependencies? No hashes? Sucks when it's time for updating? Yeah, no thanks, I'd like a proper lock file. Which is probably why there are a dozen other tools.)
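One of those dozen tools, pip-tools, is basically the missing lock-file layer; a rough sketch, assuming your direct deps live in a requirements.in:
pip install pip-tools
pip-compile --generate-hashes requirements.in   # writes a fully pinned requirements.txt, hashes included
pip-sync                                        # makes the environment match it exactly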