this post was submitted on 08 Sep 2025
49 points (73.3% liked)

Technology


cross-posted from: https://programming.dev/post/37090037

Comments

[–] x1gma@lemmy.world 5 points 1 day ago (1 children)

On the other hand, detrimental reliance is a tort, and if someone is relying on an app for a specific safety function, the app could be civilly liable if it fails its function in some way.

Yes, if the app were any kind of official tool.

Imagine if you had this attitude about an insulin use tracker/calculator, that sometimes gave wildly wrong insulin dose numbers.

Yes, and that's why regulations exist for those kinds of things, to prevent exactly that. There is no such regulation for the ICE tracker.

Maybe down the road it's decided that aiding and abetting ICE is a crime, and that providing misinformation, intentionally or unintentionally, is a criminal act. App developer dude could be criminally liable if he knew, or ought to have known, that he had vulnerabilities. You know, in your New Nuremberg trials that you are going to get sometime in the next decade or so.

If such a regulation were to happen down the road, app developer dude would be forced to either comply or stop operations.

[–] Bane_Killgrind@lemmy.dbzer0.com 1 points 20 hours ago (1 children)

Wouldn't need so much regulation if things were just well reasoned and fit for purpose. Or if they stopped merely pretending to be.

[–] x1gma@lemmy.world 2 points 12 hours ago (1 children)

No matter how well reasoned, allegedly fit for purpose, or how much something pretends to be, we shouldn't be trusting those promises, especially not from people we don't know. That doesn't end well, whether it's the free candy van or cybersecurity. Trust like that has been responsible for a lot of attacks across various vectors and for plenty of projects going wrong.

[–] Bane_Killgrind@lemmy.dbzer0.com 1 points 10 hours ago

Well yeah, but that just requires a consensus on what is trustworthy. Some things are trustworthy, and you need some way to identify them if you are going to protect yourself.

But that just shifts the blame to the user, who is a non-expert, and we don't really have good ways to identify safe software products. There's stuff like CSA certification for physical products. It's short-sighted to say "well, if you don't know, use nothing," because that's not going to happen.