submitted 11 months ago by L4s@lemmy.world to c/technology@lemmy.world

YouTube Video Recommendations Lead to More Extremist Content for Right-Leaning Users, Researchers Suggest: New research shows that a person’s ideological leaning might affect what videos YouTube’s algorithms recommend to them. For right-leaning users, video recommendations are more likely to come from channels that share political extremism, conspiracy theories and otherwise problematic content.

[-] vexikron@lemmy.zip 17 points 11 months ago

They optimize recommendations to a large degree to induce anger and rage, because those emotions are the most effective drivers of platform engagement.

Facebook does the same.

[-] PoliticalAgitator@lemm.ee 1 point 11 months ago

We also have no idea what measures they take to stop the system being manipulated (if any).

The far-right could be working to ensure they're recommended as often as possible and if it just shows up as "engagement" or "impressions" on their stats, YouTube is unlikely to fight it with much enthusiasm.

this post was submitted on 16 Dec 2023
263 points (96.1% liked)
