this post was submitted on 11 Aug 2025
570 points (98.6% liked)

Programmer Humor

[–] dejected_warp_core@lemmy.world 14 points 2 days ago* (last edited 2 days ago) (2 children)

I used to struggle with this, until I realized what's really going on. To do conventional web development, you have to download a zillion node modules so you can:

  • Build one or more "transpilers" (e.g. Typescript, Sass support, JSX)
  • Build linters and other SAST/DAST tooling
  • Build packaging tools, to bundle, tree-shake, and minify your code
  • Use shims/glue to hold all that together
  • Use libraries that support the end product (e.g. React)
  • Furnish multiple versions of dependencies in order for each tool to have its own (stable) graph

All this dwarfs any code you're going to write by multiple orders of magnitude. I once had a node_modules tree that clocked in at over 1.5GB of source code. What I was writing would have fit on a floppy disk.

That said, it's kind of insane. The problem is that there are no binary releases, nor fully vendored/bundled packages. The entire toolchain, except nodejs and npm, is downloaded as source in its entirety, for every such project you run.

In contrast, if you made C++ or Rust developers rebuild their entire toolchain from source on every project, they'd riot. Or, they would re-invent binary releases that weekend.

[–] catnip@lemmy.zip 5 points 2 days ago

Boy do I have news concerning Rust :p

[–] silasmariner@programming.dev 7 points 2 days ago

And if you made JavaScript developers use compatible versions for everything they'd riot. And also every build would fail for, like, at least a week

[–] dormedas@lemmy.dormedas.com 54 points 3 days ago (7 children)

Feels like a lot of “not inventing the wheel” - which is good? There are plenty of good wheels out there.

[–] Shayeta@feddit.org 71 points 3 days ago (1 children)

But I don't NEED a wheel, I just need a tarp to put over this metal frame on my patio, and for some reason the tarp manufacturer attaches wheels and plane wings to it!?

[–] jol@discuss.tchncs.de 28 points 3 days ago (1 children)

The package comes with all the bells and whistles but the final build only contains the tarp, if you import it right and tree shake it.

[–] marlowe221@lemmy.world 16 points 3 days ago

This person nodes

[–] ICastFist@programming.dev 12 points 3 days ago (1 children)

"Yes, I'd like a wheel. I don't want to invent it. Why, of course, give me the full package of wheel, axis, rotor, engine, fuel tank, windshield, mirrors, tire, front panel, brakes. This wheel will be great for me manually spinning cotton!"

[–] fmstrat@lemmy.nowsci.com 14 points 3 days ago (1 children)

The problem is "I need function, library with 1000 functions has function, include." Library's 823rd function turns out to have a vulnerability.

[–] dohpaz42@lemmy.world 21 points 3 days ago (2 children)

Until those wheels contain malware and spyware.

[–] dejected_warp_core@lemmy.world 4 points 2 days ago (2 children)

You say that, but I've watched the JS community move from one framework and tool suite to the next quite rapidly. By my recollection, I've seen a wholesale change in popular tooling at least four times in the last decade. Granted, that's not every developer's trajectory through all this, but (IMO) that's still a lot.

[–] scarilog@lemmy.world 2 points 2 days ago

Is this why pip packages are called wheels...?

[–] fahfahfahfah@lemmy.billiam.net 33 points 3 days ago (5 children)

And this is why tree shaking exists.

[–] candyman337@lemmy.world 15 points 3 days ago (2 children)
[–] NewDark 42 points 3 days ago (1 children)

If you only import 1% of a module's code, only the code that's actually used gets compiled. Tree shaking is the removal of dead code paths that are never reached.
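A toy model of what a bundler's tree shake does, to make the above concrete: starting from the entry point's imports, keep only the exports that are statically reachable and drop everything else. The module and export names here are invented for illustration (in the spirit of the tarp-with-wheels metaphor above); real bundlers like Rollup or esbuild do this over ES module import graphs.

```javascript
// Each export lists the other exports it internally depends on.
const tarpExports = {
  cover: [],         // the one thing the app actually wants
  wheels: ["axle"],  // unused, and it drags `axle` along with it
  axle: [],
  wings: [],
};

// Keep only exports reachable from what the entry point imported.
function shake(exportGraph, imported) {
  const kept = new Set();
  const visit = (name) => {
    if (kept.has(name)) return;
    kept.add(name);
    for (const dep of exportGraph[name]) visit(dep);
  };
  for (const name of imported) visit(name);
  return [...kept].sort();
}

console.log(shake(tarpExports, ["cover"])); // [ 'cover' ] — wheels, axle, wings dropped
```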

[–] candyman337@lemmy.world 13 points 3 days ago

Ah ok gotcha

[–] zea_64@lemmy.blahaj.zone 14 points 3 days ago (1 children)

Dead code elimination but with a different name for some reason

[–] QuazarOmega@lemy.lol 4 points 3 days ago (1 children)
[–] mathiouchio@sh.itjust.works 2 points 2 days ago

We ARE all apes


The one on the right is also packages in node_modules that you're actually using and specifically requested.

[–] Munrock@lemmygrad.ml 8 points 3 days ago (1 children)

Except in the picture on the left, someone's actually reading it.

Something's gone wrong if you're looking in the node_modules folder.

[–] invertedspear@lemmy.zip 5 points 3 days ago (1 children)

Sometimes you gotta monkey patch that library because they won’t accept your pull requests to fix that bug.

[–] joyjoy@lemmy.zip 3 points 3 days ago

At least you can monkeypatch it.
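For anyone who hasn't had to do this: a minimal sketch of the monkey patch being described. `lib` here stands in for a hypothetical third-party module whose bugfix upstream won't merge; you overwrite the broken function at runtime, keeping a reference to the original.

```javascript
// Stand-in for a third-party library shipping an off-by-one bug.
const lib = {
  lastChar: (s) => s[s.length], // bug: always returns undefined
};

// Monkey patch: swap in a corrected implementation, deferring to the
// original for the edge case it already handled.
const original = lib.lastChar;
lib.lastChar = function (s) {
  return s.length ? s[s.length - 1] : original(s);
};

console.log(lib.lastChar("node")); // "e"
```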

[–] mesamunefire@piefed.social 18 points 3 days ago (2 children)

Very true.

Python feels like that sometimes too. Except it has a much bigger standard library, which is much better than node modules.

[–] CameronDev@programming.dev 21 points 3 days ago (3 children)

Rust as well. Seems to just be a modern language thing.

[–] PhilipTheBucket@piefed.social 18 points 3 days ago* (last edited 3 days ago) (1 children)

I sort of have a suspicion that there is some mathematical proof that, as soon as it becomes quick and easy to import an arbitrary number of dependencies into your project along with their dependencies, the size of the average project's dependency tree starts to follow an exponential growth curve, increasing every year without limit.

I notice that this stuff didn't happen with package managers + autoconf/automake. It was only once it became super-trivial to do from the programmer side, that the growth curve started. I've literally had trivial projects pull in thousands of dependencies recursively, because it's easier to do that than to take literally one hour implementing a little modified-file watcher function or something.

[–] CameronDev@programming.dev 14 points 3 days ago (1 children)

It's certainly more painful to collect dependencies with cmake, so it's not worth doing if you can hand-roll your own easily enough.

The flip side is that by using a library, it theoretically means it should be fairly battle-tested code, and should be using appropriate APIs. File watching has a bunch of different OS specific APIs that could be used, in addition to the naive "read everything periodically" approach, so while you could knock something together in an hour, the library should be the correct approach. Sadly, at least in rust land, there are a ton of badly written libraries to wade through... 🤷

[–] PhilipTheBucket@piefed.social 10 points 3 days ago (2 children)

Yeah. I have no idea what the answer is, just describing the nature of the issue. I come from the days when you would maybe import like one library to do something special like .png reading or something, and you basically did all the rest yourself. The way programming gets done today is wild to me.

[–] MTK@lemmy.world 7 points 3 days ago

Wait until OP finds out about interpreters and compilers.

[–] TootSweet@lemmy.world 16 points 3 days ago (4 children)

Be the change you want to see in the world, people. Don't use any Node (or Rust or Python or Java or whatever) modules that have more dependencies than they absolutely, positively, 100%, for real have to. It's really not that hard. It doesn't have to be this way.

[–] CameronDev@programming.dev 15 points 3 days ago* (last edited 3 days ago)

Too late, is_even_rs now depends on tokio

[–] who@feddit.org 16 points 3 days ago (1 children)

This applies to developers, too.

External dependencies put end users at risk, so I avoid them as much as possible. If that means I have to rethink my design or write some boring modules myself, then so be it.

[–] kibiz0r@midwest.social 11 points 3 days ago (1 children)

Depends on the use case, and what you mean by “external dependencies”.

Black box remote services you’re invoking over HTTP, or source files that are available for inspection and locked by their hash so their contents don’t change without explicit approval?

Cuz I’ll almost entirely agree on the former, but almost entirely disagree on the latter.

In my career:

I’ve seen multiple vulns introduced by devs hand-writing code that doesn’t follow best practices while there were packages available that did.

I have not yet seen a supply chain attack make it to prod.

The nice thing about supply chain attacks though: they get publicly disclosed. Your intern’s custom OAuth endpoint that leaks the secret? Nobody’s gonna tell you about that.

[–] who@feddit.org 6 points 3 days ago* (last edited 2 days ago)

I didn't think I would have to spell this out, but when I wrote "as much as possible", I was acknowledging that some libraries are either too complex or too security-sensitive to be reasonably homebrewed by the unqualified. (Perhaps "as much as reasonably possible" would have been better phrasing.) Where the line lies will depend on the person/team, of course, but the vast majority of libraries do not fall into that category. I was generalizing.

And yes, some third-party libs might get so much public scrutiny as to be considered safer than what someone would create in-house, depending on their skills. But safety in numbers sometimes turns out to be a false assumption, and at the end of the day, choosing this approach still pushes external risks (attack surface) onto users. Good luck. It hardly matters to the general point, though, because most libs do not have this level of scrutiny.

Let's also remember that pinning dependencies is not a silver bullet. If I didn't trust someone to follow "best practices", I don't think I would trust their certification of a third-party library hash any more than I would trust their own code.

With all that said, let me re-state my approach for clarity:

  • I minimize dependencies first. Standard libraries are great for this.
  • When something more cannot reasonably be avoided, I choose very carefully, prioritizing the safety of my users over my own convenience.
  • Sometimes that means changing my original design, or spending my time learning or building things that I hadn't planned to. I find the results to be worth it.
[–] InvalidName2@lemmy.zip 13 points 3 days ago (5 children)

Which sounds like great, practical advice in a theoretical perfect world!

But the reality of the situation is that professionals are usually balancing a myriad of concerns and considerations, using objective and subjective evaluations of what's required of us. Quite often inefficiency, whether in the form of programmatic complexity, disk storage, or otherwise, takes a relatively low priority compared to everything else we need to achieve if we want happy clients and a paycheck.

[–] kautau@lemmy.world 7 points 3 days ago

Lol yeah working in enterprise software for a long time, it's more like:

  1. Import what you think you need, let the CI do a security audit, and your senior engineers to berate you if you import a huge unnecessary library where you only need one thing
  2. Tree shake everything during the CI build so really the only code that gets built for production is what is being used
  3. Consistently audit imports for security flaws and address them immediately (again, a CI tool)
  4. CI

Basically just have a really good set of teams working on CI, in addition to the backend/frontend/UX/security/infrastructure/whatever else teams you have

[–] ICastFist@programming.dev 5 points 3 days ago

cries in legacy systems

[–] joyjoy@lemmy.zip 4 points 3 days ago

Also C programmers using glibc
