this post was submitted on 09 Oct 2025
402 points (96.7% liked)
The article is very much off point.
The main issue is the software crisis: hardware performance follows Moore's law, while developer performance stays roughly constant.
If the memory of your computer is counted in bytes without an SI prefix and your CPU has maybe a dozen or two instructions, then it's possible for a single human being to comprehend everything the computer is doing and to program it very close to optimally.
The same is not possible if your computer has subsystems upon subsystems, and even the keyboard controller has more processing power and complexity than all the computers of the Apollo program combined.
So to program exponentially more complex systems, we would need an exponentially larger software development budget. But since it's really hard to scale developer headcount exponentially, we've been using abstraction layers to hide complexity, to share and re-use work (no need for everyone to re-invent the templating engine), and to have clear boundaries that allow for better cooperation.
That was the case well before Electron. Compiled languages started the trend, languages like Java or C# deepened it, and modern middleware and frameworks only accelerated it.
The OP complains about the chain "React → Electron → Chromium → Docker → Kubernetes → VM → managed DB → API gateways". But they don't consider that even if you run "straight on bare metal", there's a whole stack of abstractions between your code and its execution. Every major component inside a PC nowadays runs its own dedicated OS that neither the end user nor the developer of ordinary software ever sees.
But the main issue always comes back to the software crisis. If we had infinite developer resources, we could write optimal software. We don't, so we can't, and thus we add abstraction layers to improve ease of use for developers, because otherwise we would never ship anything.
If you want to complain, complain to the managers who don't allocate enough resources, to the investors who don't want to dump millions into the development of simple programs, and to the customers who aren't OK with simple things but want modern cutting-edge everything in their programs.
In the end it's sadly really the case: memory and performance get exponentially cheaper, while developers are still mere humans and their performance stays largely constant.
So which of these two values SHOULD we optimize for?
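The asymmetry described above can be sketched with a quick back-of-the-envelope calculation (the two-year doubling period is an assumption for illustration, not a precise figure):

```python
# Rough sketch with assumed numbers: hardware throughput doubling
# every ~2 years (Moore's-law-style) vs. flat developer output.
def hardware_factor(years: float, doubling_period: float = 2.0) -> float:
    """Growth factor after `years`, doubling every `doubling_period` years."""
    return 2 ** (years / doubling_period)

for years in (2, 10, 20):
    # Developer output is taken as constant (x1) over the same span.
    print(f"{years:2d} years: hardware x{hardware_factor(years):.0f}, developers x1")
```

After twenty years that's a factor of about a thousand on the hardware side and nothing on the human side, which is why trading machine efficiency for developer productivity keeps winning.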
The real problem with software quality is not abstraction layers but "business agile" (as in "business doesn't need to make any long-term plans but can cancel or change anything at any time") and a lack of QA budget.
I agree with the general idea of the article, but there are a few wild takes that kind of discredit it, in my opinion.
"Imagine the calculator app leaking 32GB of RAM, more than older computers had in total" - well yes, the memory leak went on to waste 100% of the machine's RAM. You can't leak 32GB of RAM on a 512MB machine. Correct, but hardly mind-bending.
"But VSCodium is even worse, leaking 96GB of RAM" - again, 100% of available RAM. This starts to look like a bad faith effort to throw big numbers around.
"Also this AI 'panicked', 'lied' and later 'admitted it had a catastrophic failure'" - no it fucking didn't, it's a text prediction model, it cannot panic, lie or admit something, it just tells you what you statistically most want to hear. It's not like the language model, if left alone, would have sent an email a week later to say it was really sorry for this mistake it made and felt like it had to own it.
32GB swap file or crash. Fair point that you'd want to restart the computer anyway, even with 128GB+ of RAM. But a calculator taking two years off your SSD's life is still not great.
It's a bug and of course it needs to be fixed. But the point was that a memory leak leaks memory until the system is out of memory or the process is killed. So saying "it leaked 32GB of memory" is pointless as a measure of severity.
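A minimal sketch of why the 32GB number is arbitrary (the `LeakyCalculator` class and its history list are hypothetical stand-ins for whatever allocation the real app never releases):

```python
# A leak doesn't stop at some fixed size: it grows with every operation
# until memory runs out or the process is killed. The headline number
# is just "however much RAM the machine happened to have".
class LeakyCalculator:
    def __init__(self):
        self._history = []  # never cleared: this is the "leak"

    def add(self, a, b):
        self._history.append((a, b))  # retained forever, one entry per call
        return a + b

calc = LeakyCalculator()
for i in range(100_000):
    calc.add(i, i)
print(len(calc._history))  # prints 100000; the only cap is the machine's RAM
```

Run it long enough on a 512MB machine and it "leaks 512MB"; on a 32GB machine it "leaks 32GB". Same bug, same severity.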
It's like claiming that a puncture on a road bike is especially bad because it leaks 8 bar of pressure instead of the 3 bar a mountain bike tire might leak. In reality both punctures leak all the pressure in the tire, and in the end you have a bike you can't use until you fix the puncture.