submitted 1 year ago by L4s@lemmy.world to c/technology@lemmy.world

Very, Very Few People Are Falling Down the YouTube Rabbit Hole | The site's crackdown on radicalization seems to have worked. But the world will never know what was happening before that.

[-] Duamerthrax@lemmy.world 100 points 1 year ago

Weird. YouTube keeps recommending right-wing videos even though I've purged them from my watch history and always selected Not Interested. It got to the point that I installed a third-party channel blocker.

I don't even watch many left-leaning political videos, and even those are only tangentially political.

[-] nutsack@lemmy.world 37 points 1 year ago* (last edited 1 year ago)

I think if you like economics or fast cars, you'll also get radical right-wing talk videos. If you like guns, it's even worse.

Nah. Cars and money have nothing to do with it. I've never once gotten any political bullshit, and those two topics are 60% of what I watch.

[-] nutsack@lemmy.world 12 points 1 year ago* (last edited 1 year ago)

I made a fresh Google account specifically to watch daily streams from one stocks channel (the guy is a liberal), and I got cars, guns, and right-wing politics in the feed.

My general-use account's suggestion feed is mostly camera gear, leftist video essays, and debate-bro drama.

[-] Edgelord_Of_Tomorrow@lemmy.world 3 points 1 year ago* (last edited 1 year ago)

Oh, you like WW2 documentaries about how liberal democracy crushed fascism strategically, industrially, scientifically and morally?

Well you might enjoy these videos made by actual Nazis complaining about gender neutral bathrooms!

[-] afraid_of_zombies@lemmy.world 2 points 1 year ago

I started getting into atheist programs, and within a month I was getting targeted ads trying to convert me.

[-] MrScottyTay@sh.itjust.works 1 points 1 year ago

I started getting into motorsport recently. I just get the ID video essay on racing and videos similar to Top Gear, like Overdrive. I don't get any right-wing stuff or guns. But I'm also in the UK, so it probably factors that in too. For Americans maybe it's like, "ah, other Americans that like fast cars also like guns, here you go."

[-] Kuya@lemmy.world 19 points 1 year ago

I've been watching tutorials on jump rope and kickboxing. I do watch YouTube Shorts, but lately I'm being shown Andrew Tate stuff. I didn't skip it quickly enough, and now 10% of what I see is right-leaning, bot-created content. Slowly, gun-related videos, self-defense, and Minecraft are taking over my YouTube Shorts.

[-] doggle@lemmy.dbzer0.com 13 points 1 year ago

Kickboxing to Andrew Tate is unfortunately a short jump for the algorithm to make, I guess.

If you don't already, you can view your watch history and delete things.

I do that with anything not music related, and it keeps my recommendations extremely clean.

[-] ech@lemm.ee 2 points 1 year ago

I like a few Minecraft channels, but I only watch it in private tabs because I know yt will flood my account with it if I'm not careful. There is no middle ground with The Algorithm.

[-] MrScottyTay@sh.itjust.works 2 points 1 year ago

Yeah, it's skewed too much by recent viewing. Even if you're subscribed to X number of channels about topic Y, if you just watched one video on topic Z, then say goodbye to Y — you only like Z now.

[-] spacebirb@lemmy.world 6 points 1 year ago

I know everyone likes to get conspiratorial about this, but it's really just trying to get your attention any way possible. There are more popular right-wing political videos, so the algorithm is more likely to suggest them. These videos also get lots of views, so again, they're more likely to be suggested.

Just ignore them and watch what you like.

[-] Duamerthrax@lemmy.world 3 points 1 year ago

I've already said I installed a channel blocker to deal with the problem, but it's still annoying that a computer has me in its database as liking right-wing shit. If it were limited to just YouTube recommendations, it would be nothing, but we're on a slow burn to a dystopian hell. Google has no reason not to use its personality profile of me elsewhere.

I made this comment elsewhere, but I have a very liberal friend who's German, likes German food, and is into WWII-era history. Facebook was suggesting neo-Nazi groups to him.

I watch a little Flashgitz and now I'm being recommended FreedomToons. I get that some people who like Flashgitz are going to be terrible, but I shouldn't have to click Not Interested more than once.

[-] doggle@lemmy.dbzer0.com 5 points 1 year ago

I'm sure YouTube hangs on to that data even if you delete the history. I would guess that since you don't watch left-wing videos much, their algorithm still thinks you're politically right of center. Although I would have expected it to just give up recommending political channels altogether at some point — I hardly ever get recommendations for political stuff, and right-wing content is the minority of that.

[-] Duamerthrax@lemmy.world 4 points 1 year ago

I watch some left-wing stuff, but I prefer my politics in text form. There's too much dramatic music and manipulative editing, even in things I agree with. The algorithm should see me as center-left if anything, but because I watch some redneck engineering videos (which I ditch if they do get political), it seems to think I should also like transphobic videos.

[-] Brokkr@lemmy.world -2 points 1 year ago

Indicating "Not Interested" still counts as engagement on your part. The algorithm therefore serves you more content like that so you will engage more.

You can try blocking the channel, which has mixed results for the same reason, or closing youtube and staying away from it for a few hours on that account.

[-] Duamerthrax@lemmy.world 8 points 1 year ago* (last edited 1 year ago)

If clicking Not Interested increases the likelihood of getting more of the same, that's all the more reason to run ad blockers.

The channel blocker is a third-party tool. It just hides the channel from view, so Google shouldn't know I'm using it.
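The idea behind a client-side blocker like that is simple: it never tells YouTube anything, it just filters recommendations locally against a blocklist before they're shown. A minimal sketch of that filtering step (the channel names and the plain-object feed are hypothetical — a real extension would match DOM nodes on the page instead):

```javascript
// Local blocklist — never sent to the server, so it generates no
// engagement signal the recommender could learn from.
const blockedChannels = new Set(["ExampleRagebaitChannel", "SpamClipsDaily"]);

// Keep only videos whose channel is not on the blocklist.
function filterRecommendations(videos, blocked) {
  return videos.filter((video) => !blocked.has(video.channel));
}

// Hypothetical recommendation feed, standing in for the page's DOM.
const feed = [
  { title: "Engine teardown, part 3", channel: "GarageTime" },
  { title: "You won't BELIEVE this", channel: "ExampleRagebaitChannel" },
  { title: "Redstone computer tour", channel: "MinecraftBuilds" },
];

const visible = filterRecommendations(feed, blockedChannels);
// visible now holds only the GarageTime and MinecraftBuilds entries
```

Because the filtering happens after the feed is delivered, the server still *recommends* the blocked channels — it just never gets a click, a skip, or a "Not Interested" signal back.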

[-] jrburkh@lemmy.world 6 points 1 year ago

I don't know if this is accurate or not, but it's the most nonsensical design I've heard of in a while. If engaging with something to say "I don't want to see this" results in more of that content, the user will eventually leave the platform. I'm having this exact problem with my Google feed right now: I keep clicking Not Interested, yet I continue getting similar content. Consequently, I'm increasingly leaning toward disabling the feed entirely, because I'm tired of seeing shit I don't care about. I'm getting angry just thinking about it.

[-] Brokkr@lemmy.world 1 points 1 year ago* (last edited 1 year ago)

I can only offer my own experience as evidence, but this is what I was advised to do (stop engaging by not selecting anything), and it worked. Before that I kept getting tons of stuff I didn't want to see, but it stopped within a few days once I stopped engaging with it. And I agree — it is infuriating.

Since I got this advice from someone else, I assume it has worked for others too.

this post was submitted on 01 Sep 2023
444 points (93.2% liked)
