YouTube Video Recommendations Lead to More Extremist Content for Right-Leaning Users, Researchers Suggest

New research shows that a person’s ideological leaning might affect what videos YouTube’s algorithms recommend to them. For right-leaning users, video recommendations are more likely to come from channels that share political extremism, conspiracy theories, and otherwise problematic content.

  • shalafi@lemmy.world · 8 points · 1 year ago

    LiberalGunNut™ here! (Yes, we exist.) I do not experience this. Bear with me a moment…

    I consume loads of gun-related content on YouTube: historical, gunsmithing, basic repair, safety, reviews, testing, whatever. My favorite presenters are apolitical, or at least their presentations are.

    My recommendations should be overrun with right-wing bullshit. Yet they are not. My recommendations are more of the same, and often include interesting and related media. I may stray off into other fringe areas like prepping, but even that doesn’t get radical, and my feed comes back to center in a hurry.

    Can someone explain what I’m seeing here?

    As a side note, I do experience this with the default “news” tab in Edge. Yes, it’s 95% crap, but I sometimes see real news I want to follow up on. But fuck me, one time I clicked on a GenA vs. GenB article and got flooded with them. My Android feed does the same thing: I clicked on a couple of stories about wild pigs, flooded. Hummingbird story? Flooded.

    But I’m not getting this on YouTube. 🤷🏻‍♂️

    • serial_crusher@lemmy.basedcount.com · 2 points · 1 year ago

      Subjective biases can play a huge part in stuff like this. The researchers behind this story had to go through a bunch of YouTube channels and decide whether each one constitutes extremist right-wing content or not.

      I think it’s a safe assumption that if you took the people consuming that content and asked them whether the video they just watched was right-wing extremist content, most of them would say no.

      So it’s possible that you don’t think you’re being overwhelmed with right-wing extremist content, but somebody else looking at your viewing history might think you are.
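
      One standard way researchers check that kind of labeling subjectivity is inter-rater agreement. Below is a minimal sketch (not this study’s actual method; the raters and labels are made up) using Cohen’s kappa, which discounts the agreement two raters would reach by chance alone:

      ```python
      # Hypothetical illustration: two raters label the same six channels.
      # Raw agreement looks decent, but kappa (chance-corrected) is much lower.
      def cohens_kappa(labels_a: list[str], labels_b: list[str]) -> float:
          assert len(labels_a) == len(labels_b)
          n = len(labels_a)
          observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
          # Expected agreement if each rater labeled independently
          # at their own base rates.
          categories = set(labels_a) | set(labels_b)
          expected = sum(
              (labels_a.count(c) / n) * (labels_b.count(c) / n)
              for c in categories
          )
          return (observed - expected) / (1 - expected)

      rater_1 = ["extremist", "not", "not", "extremist", "not", "not"]
      rater_2 = ["extremist", "not", "extremist", "not", "not", "not"]
      print(cohens_kappa(rater_1, rater_2))  # 0.25, despite 4/6 raw agreement
      ```

      Which rater is “right” about the disputed channels is exactly the judgment call conclusions like this rest on.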

    • dexa_scantron@lemmy.world · 2 points · 1 year ago

      They don’t run the same algorithm for everyone. Some people get put into test groups that are served ‘nicer’ algorithms that don’t try to make you angry, so the company can measure the effect on its revenue.
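
      For what it’s worth, here’s a minimal sketch of how that kind of test-group assignment is commonly done, via deterministic hash bucketing (the experiment name, holdout fraction, and group labels are hypothetical, not anything YouTube has published):

      ```python
      # Hypothetical sketch of hash-based experiment bucketing.
      import hashlib

      def assign_group(user_id: str, experiment: str,
                       holdout_fraction: float = 0.05) -> str:
          """Deterministically map a user to an experiment arm.

          Hashing (experiment, user_id) gives each user a stable, roughly
          uniform position in [0, 1), so the same user always sees the same
          variant and metrics like revenue can be compared across arms.
          """
          digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
          position = int(digest, 16) / 16 ** len(digest)  # uniform in [0, 1)
          return "calmer_ranker" if position < holdout_fraction else "default_ranker"

      print(assign_group("user-12345", "recs-anger-holdout"))  # e.g. 'default_ranker'
      ```

      Because assignment is per-user and deterministic, two people can see persistently different recommendation behavior without either of them ever knowing an experiment is running.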