This is hilarious, but now I'm wondering: what would a saner package manager look like?
I'd say `pip` is saner, though not by much, as its support for private registries is very bad and seems designed to facilitate supply-chain attacks. I've heard a lot of good things about `cargo` but haven't used it enough myself to have a strong opinion.
The lack of a standard library is really the worst offender. Most of a given `node_modules` directory is filled with middleware to handle JS's lack of everything.
Compared to other languages it's still very barebones – but admittedly some of the bloat is also because the JS world is kinda set in its ways. I still see people use jQuery for basic selector queries and SASS for basic CSS variables.
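For what it's worth, the platform covers both of those now. A minimal sketch of the built-in replacements (the selector, class, and variable names here are made up for illustration):

```typescript
// jQuery's $(".card a") -> the built-in Selectors API
const links = document.querySelectorAll<HTMLAnchorElement>(".card a");
links.forEach(a => a.classList.add("external"));

// SASS's $accent-color -> native CSS custom properties,
// readable in stylesheets as var(--accent-color) and settable from JS too:
document.documentElement.style.setProperty("--accent-color", "#6a5acd");
```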
Another factor is that developers these days assume that users have fast unmetered connections. Loading 800 kB of minified gzipped JS from ten different domains is seen as no big deal. When the cost of adding piles of dependencies is considered nil there's no impetus to avoid them.
It's saner, not perfect. With virtualenvs it does basically what you describe except that it re-downloads everything for every virtualenv, but that does not typically matter much since it's not downloading a billion dependencies.
With NPM there's no choice but to have hundreds of duplicates installed for every project; that's not just inefficient, it's a security, maintainability, and auditability nightmare.
The standard library thing is a really valid point, but how do you avoid recursive dependencies? Do you just not allow library packages to depend on anything?
Is it? It's very bare-bones in my experience; I could never bring myself to use it until they make it a more fully fledged tool, like the `cargo` you mentioned, yes.
Other package managers, like NuGet, throw errors if all dependencies on a package cannot be met by a single version.
This is probably a result of it copying all libraries into the same output directory, and of the fact that .NET cannot load two different versions of the same library, so it's more an application restriction.
The downside is that packages often can't use newer features without blocking the users of that library, and that utility libraries have to maintain strong backwards compatibility so applications can use the latest version while dependent libraries target an older one. Often applications keep using older versions with known security issues.
Damn, sounds like a big headache x.x
`npm` downloads every dependency recursively. If `a` depends on `d (= 1.2.3)` and `b` depends on `d (= 1.2.4)`, then both versions of `d` get downloaded into `a` and `b`'s respective `node_modules`.
All other package managers I'm aware of resolve dependencies into a flat list, then download, and you can only have one version of the same package on your system.
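Roughly, the nesting works because Node resolves an import by walking up from the requiring file and taking the first `node_modules` that contains the package. A simplified sketch of that lookup (assumptions: this ignores `package.json` "exports" maps, file extensions, and the require cache):

```typescript
import * as fs from "node:fs";
import * as path from "node:path";

// Walk up the directory tree looking for node_modules/<name>;
// the nearest copy wins, which is why a/node_modules/d and
// b/node_modules/d can hold two different versions at once.
function resolvePackageDir(fromDir: string, name: string): string | null {
  let dir = path.resolve(fromDir);
  for (;;) {
    const candidate = path.join(dir, "node_modules", name);
    if (fs.existsSync(candidate)) return candidate;
    const parent = path.dirname(dir);
    if (parent === dir) return null; // reached the filesystem root
    dir = parent;
  }
}

console.log(resolvePackageDir("node_modules/a", "d")); // a's own copy, if any
```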
You mean npm duplicates even if the two dependency versions are compatible?
That couldn't be, right? Otherwise, if you installed two packages that rely on different, incompatible versions of another package, one of the two would break. Reading a bit, it seems they should check for "satisfiability"; I found some really interesting things on the topic while looking around.
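"Satisfiability" here just means: is there one version that matches every range the dependents ask for? A quick sketch of that check using the npm `semver` package (the ranges and versions below are made-up examples):

```typescript
import semver from "semver";

const requestedRanges = ["^1.2.3", "~1.2.0"]; // what a and b ask of d
const available = ["1.2.3", "1.2.4", "1.3.0"]; // versions on the registry

// A flat resolver must pick a version satisfying EVERY range...
const compatible = available.filter(v =>
  requestedRanges.every(r => semver.satisfies(v, r))
);
console.log(compatible); // [ "1.2.3", "1.2.4" ]
// ...and errors out (NuGet-style) if this list is empty. npm instead
// just installs a separate nested copy per conflicting dependent.
```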
By default, yes, unless you explicitly use the "peer dependency" system, which isn't the default. The "default" naive implementation is for every package in your `node_modules` to have a `node_modules` of its own, all the way down recursively. There are tricks nowadays to deduplicate packages with the exact same version, but not to automatically detect "compatible" versions and use those instead (in my experience nothing would work if that was the case; deleting `package-lock.json` causes way too many issues due to the... uh, let's call it "brave" approach of JS devs to stability).
Correct. This is intended behavior, which is solved in several ways:
- `1.2.3` should not break something built against `1.1.2`. JS and NPM's cascade of stupid implementations bred a culture of "move fast and break things", but that's not the norm in any other commonly used ecosystem.
- `glibc6` is `glibc6`, not `glibc_string (1.2.3)` + `glibc_memory (2.6.5)` + `glibc_fs (1.5.3)` + `glibc_stdio (1.9.2)` + `glibc_threads (6.1.0)` + ... Internally `glibc6` is a bunch of modules, but they get bundled into one package specifically to simplify dependency management.
- Not being able to install two versions of the same package sounds restrictive, but it's a HUGE security benefit: if `glibc6 (1.2.3)` is vulnerable to CVE-2024-1, then updating to `glibc6 (1.2.4)` secures your entire system at once. With NPM, though, you have to either wait for every. single. dependency on that vulnerable package down your tree to recursively update, or patch those versions yourself (at your own risk because, again, small version changes often break things since developers think that NPM's dependency model means they don't have to actually provide stability guarantees).
Wow, awesome explanation! I think I understand now.
IDK any full-time JS or Node developers but they seem like they're lazy and all have ADD. Packages developed for years still on version 0.x, packages depending on deprecated packages that were replaced by core functionality, packages still using CommonJS format (which I actually like better unfortunately), and popular packages without an update for 3 years. It feels like the entire ecosystem is for hobbyists only and businesses are like, "Cute language, but not for us."
Ryan Dahl, the creator of Node, literally saw the npm problem(s) before incidents like this happened, and created Deno to fix his mistakes. And fix them he did! The Deno import system is incredible; it's basically the only reason I use Deno. You can just import URLs directly, and Deno vendors (a.k.a. caches) them. Deno has an equivalent to npmjs.com (deno.land/x), but anyone can import straight from GitHub, make their own npmjs.com equivalent, or import from their own private server. So if a company wants reliability, they can mirror deno.land while also avoiding unpublishing.
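For example, a minimal sketch of a URL import (the pinned std version is just an illustration; any HTTPS URL works, including a private mirror, and `deno vendor` can snapshot remote imports into your repo):

```typescript
// Fetched over HTTPS on first run, then served from Deno's local cache.
import { delay } from "https://deno.land/std@0.177.0/async/delay.ts";

await delay(100);
console.log("dependency came straight from a URL, no package manager needed");
```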
Yes, that's really nice! Even though I haven't touched it in a long time, I remember messing around with it as soon as it came out a few years ago. There's also nest.land among the alternative repositories; I find their concept interesting.
Have a look at Nix.