this post was submitted on 26 Mar 2025
443 points (91.4% liked)

Technology

[–] liv@lemmy.nz 2 points 2 days ago* (last edited 2 days ago) (1 children)

That's really interesting. Its output to this prompt totally ignored the biggest and most obviously detrimental effect of this problem at scale.

Namely, that emotional dependence will give the big tech companies that own these AIs increased power over people.

It's not as if these concepts aren't widely discussed online; everything from Meta's emotional manipulation experiments and Cambridge Analytica through to the meltdowns Replika owners had over changes to the algorithm is relevant here.

[–] interdimensionalmeme@lemmy.ml 2 points 2 days ago (1 children)
[–] liv@lemmy.nz 1 points 2 days ago (1 children)

Sort of, but I think "influence over emotional states" is understating it and just the tip of the iceberg. It also makes the problem sound passive and accidental. The real problem will be overt control, as a logical extension of the kinds of trade-offs we already see people make about, for example, data privacy. With the Replika fiasco, I bet heaps of those people would have paid good money to get their virtual love interests de-"lobotomized".

[–] interdimensionalmeme@lemmy.ml 1 points 2 days ago

I think this power to shape the available knowledge (removing it, paywalling it, gating access to it discriminatorily, leveraging it, and finally manipulating it for advertising, state security, and personal reasons) is why it should be illegal to privately own ML/AI models of any kind. Drive them all underground and let only the open ones profit from sales in public.