YouTube shorts disproportionately promotes alt-right content according to this experiment
TLDR if you don't wanna watch the whole thing:
Benaminute (the YouTuber here) creates a fresh YouTube account and watches every recommended short without skipping. They repeat this 5 times, each time changing their location to a random city in the US.
Below is the number of shorts after which alt-right content was recommended. Left wing/liberal content was never recommended first.
Houston: 88 shorts
Chicago: 98 shorts
Atlanta: 109 shorts
NYC: 247 shorts
San Francisco: never (Benaminute stopped after 250 shorts)
There was, however, a certain pattern to this. First, non-political shorts were recommended. Then AI Jesus shorts started to appear (either AI Jesus talking to you, or an AI narrator reading verses from the Bible). After that came non-political shorts by alt-right personalities (Jordan Peterson, Joe Rogan, Ben Shapiro, etc.). Finally, explicitly alt-right shorts started to be recommended.
What I personally found both disturbing and kinda hilarious was in the case of Chicago. The non-political content in the beginning was a lot of Gen Alpha brainrot. Benaminute said that this seemed to be the norm for Chicago, as they had observed this in another similar experiment (which dealt with long-form content instead of shorts). After some shorts, there came a short where AI Gru (the main character from Despicable Me) was telling you to vote for Trump. He was going on about how voting for "Kamilia" would lose you "10000 rizz", and how voting for Trump would get you "1 million rizz".
In the end, Benaminute, along with Miniminuteman, proposes a hypothesis to explain this phenomenon: alt-right content might incite more emotion and thus rank higher in the algorithm. They say the algorithm isn't necessarily left wing or right wing, but that alt-right creators have better understood the methodology of capturing and growing an audience.
I like YouTube (with adblocker), but shorts are pretty trashy. It's mostly shorts of women as close to naked as YouTube's rules allow, who have purposefully given themselves exaggerated camel toes and set the short's thumbnail to show it off and grab people's attention. And it's just a front to get you to their OnlyFans.
True, but the comparison lies more in the fact that, according to her fanbase, she can seemingly do no wrong. They gobble up anything she says like a nest of hungry baby birds.
I see Rogan’s army of dudebros as being no different, only less intelligent.
Alt-right videos are made to elicit outrage, hate, and shock, which our lizard brains react to more strongly (as potential danger) than to positive videos spreading unity and love. It's all about getting as many eyeballs on the video to make money, and this is the most effective way to do it.
Are people making clickbait/ragebait articles about climate change? Are people seeking out clickbait about climate change?
I don't need to be constantly reminded of climate change, but an old "friend" is constantly telling me about the politics of video games he doesn't even have a system to play with.
I feel like they have a hard time defining alt-right. If you type in "is drinking coffee alt-right" there's an article; same for playing video games or driving cars.
I realized a while back that social media is trying to radicalize everyone, and it might not even be entirely the fault of the oligarchs who control it.
The algorithm was written with one thing in mind: maximizing engagement time. The longer you stay on the page, the more ads you watch, the more money they make.
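As a toy sketch (everything below is invented for illustration, not YouTube's actual system), the objective boils down to something like this:

```python
# Toy engagement-maximizing ranker. All names and numbers are invented;
# the point is what the objective does NOT include.
from dataclasses import dataclass

@dataclass
class Short:
    title: str
    predicted_watch_seconds: float  # the model's guess at how long you'll stay

def rank_feed(candidates: list[Short]) -> list[Short]:
    # No term for accuracy, quality, or viewer wellbeing. More predicted
    # seconds -> more ads served -> higher rank. That's the whole rule.
    return sorted(candidates, key=lambda s: s.predicted_watch_seconds, reverse=True)

feed = rank_feed([
    Short("calm educational explainer", 12.0),
    Short("outrage bait you can't look away from", 58.0),
])
print([s.title for s in feed])  # the ragebait ranks first
```

Nothing in that rule knows or cares what the content says; it only knows what keeps you watching.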
This is pervasive, and even if educated adults tune it out, there are always children, who get Mr. Beast and thousands of others trying to trick them into liking, subscribing, and following.
This is something governments should be looking at controlling. Propaganda created for the sole purpose of making money is still propaganda. At this point, I think any site that uses an algorithm to feed each user a personalized stream is compromised.
The problem is education. It's a fool's game to try to control human nature, which commodifies everything, and you will always have commercials and propaganda.
What is within our means is to strengthen education on how to think critically and understand your environment. This is where we have failed, and I'll argue there are people actively destroying this for their own gain.
Educated people are dangerous people.
It's not 1984. It's Brave New World. Aldous Huxley was right.
I think we need to do better than just say "get an education."
There are educated people that still vote for Trump. Making it sound like liberalism is some result of going to college is part of why so many colleges are under attack.
From their perspective I get it: many Trump voters didn't go to college, so when they hear that, they just assume brainwashing.
We need to find a way to teach people to sort out information, to put their immediate emotions on pause and search for facts, etc., not just the kind of "education" where you regurgitate talking points from teachers, the TV, or the radio as if they're matters of fact. The whole education system is pretty tuned around regurgitation, even at the college level. A lot of the culture of exploration surrounding college (outside of the classroom) is likely more where the liberal viewpoints come from, and we'd be ill-advised to assume the right can't destroy that.
sites feeding content that use an algorithm to personalize feeds for each user are all compromised.
Not arguing against this at all because you’re completely correct, but this feels like a key example of governments being too slow (and perhaps too out of touch?) to properly regulate tech. People clearly like having an algorithm, but algorithms in their current form are a great excuse for tech companies to throw their hands up in the air and claim no foul play because of how opaque they are. “It only shows you what you tell it you want to see!” is easy for them to say, but until consumers are given the right to know exactly how each one works, almost like nutrition facts on food packaging, we’ll never know whether they’re telling the truth. The ability of a tech company to have near-unlimited control, with no oversight, over what millions of people are looking at day after day is clearly a major factor in what got us here in the first place.
Not that there’s any hope for new consumer protections during this US administration or anything, but just something I had been thinking about for a while
The people where I live are -- I guess -- complete morons because whenever I try to check out Youtube without being logged in, I get the dumbest of dumb content.
But as another weird data point: I once suggested my son check out a Contrapoints video which I found interesting, and about a year later she told me she wanted to get a surgery -- I don't exactly remember which kind, as I obviously turned immediately into a catatonic far-right zombie.
I use YouTube and don't get much far-right content. My guess is it's because I don't watch much political content. I use a podcatcher and websites for that. If I watched political content, it might show me some lurid videos promoting politics I disagree with because that tends to keep viewers engaged with the site/app longer than if they just showed videos consistent with the ideology I seek out. That gives people the feeling they're trying to push an ideology.
I made that up without any evidence. It's just my guess. I'm a moderate libertarian who leans Democratic because Republicans have not even been pretending to care about liberty, and for whatever reason it doesn't recommend the far-right crap to me.
I've been happy with BlockTube for blocking channels or single videos. I also use YouTube Shorts Redirect for automatically converting shorts into regular videos.
I don't know if any of you still look at memes on 9gag. It once felt like a relatively neutral place, but the site has slowly pushed right-wing content over the last few years and is now infested with alt-right and even blatantly racist "memes" and comment sections.
Feels to me like astroturfing on the site to push viewers and posters in some political direction. As an example: during the US election, the war in Palestine suddenly became a recurring theme, depicting the Biden admin and Jews as "bad actors" and calling for Trump; after the election it became a flood of content about how Muslims are bad people and we shouldn't intervene in Palestine...
Real talk: I've been using YouTube without an account and with some ad blocking stuff installed. Based on what I'm seeing, I'm pretty sure the algorithm's datapoint for me is "He was born with a penis and is ok with that."
When I lose my better judgement and start scrolling shorts like an idiot, it's fight videos (IRL, movie scenes, UFC and boxing), auditing, Charlie Kirk and right-wing influencers, and the occasional clip from Shoresy on the basis of "he might be Canadian too, idk".
It is noticeably weird, and I've brought it up with my kid, who uses an account, is nothing like what YouTube believes me to be, and whose shorts feed is very different.
We do both get that guy who opens Pokemon cards with a catchy jingle, though.
I check friends' Snapchat stories from time to time, and Snapchat suggests public stories on the same page. I think Snapchat has the same sort of singular data point on me that "this account is likely a straight man", because most of what they show me are sports clips, woman influencers in revealing clothing, and right-wing influencers talking about culture war stuff. I never view any of that sort of stuff, but it still shows up any time I try to check my friend's stories. I guess I view public stories so infrequently that they just give me a default generic man feed.
Yeah, I've gotten more right wing video recommendations on YouTube, even though I have turned off my history. And even if I turned on my history, I typically watch left wing videos.
I keep getting recommendations for content like "this woke person got DESTROYED by logic" on YouTube. Even though I click "not interested", and even "don't recommend channel", I keep getting the same channel, AND video recommendation(s). It's pretty obvious bullshit.
You'd think a recommendation algorithm should take your preferences into account - that's the whole justification for tracking your usage in the first place: recommending relevant content for you...
The aim is not to recommend relevant content. The aim is to recommend content you will engage with. That may be because you're interested, or it may be because it's ragebait that you will hate but watch anyway.
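To make that concrete, here's a toy sketch with invented weights (not YouTube's actual model) of why an engagement objective can't tell interest from outrage:

```python
# Hypothetical engagement score -- the weights are made up for illustration.
# Note what's missing: there is no "did they enjoy it?" term at all.
def engagement_score(watch_fraction: float, commented: bool, shared: bool) -> float:
    return 1.0 * watch_fraction + 0.5 * commented + 0.5 * shared

# Watched to the end because you liked it:
loved_it = engagement_score(watch_fraction=1.0, commented=False, shared=False)
# Watched to the end seething, left an angry comment, shared it to complain:
hate_watched = engagement_score(watch_fraction=1.0, commented=True, shared=True)

print(loved_it, hate_watched)  # 1.0 vs 2.0 -- the hate-watch scores higher
```

A full watch plus an angry comment and an outraged share scores higher than a full watch you quietly enjoyed, so on this metric the ragebait wins.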
Anything but the subscriptions page is absolute garbage on that site. Ideally get an app to track your subs without having to have an account. NewPipe, FreeTube etc.
You are correct. He used a VPN for several US locations in the video. He then compared what content was shown in different regions of the US to see if everyone sees the same thing or if the content is radically different depending on where you are.
My point is, if youtube customizes the feeds based on the IPs, then the youtube accounts used are not really "fresh" but there's already some data entered into the profiles upon their creation.
Almost no corporation benefits from operating in a liberal/left country; they are harder to exploit and profit off of. Why would they promote things like worker protection, parental leave, unions, reducing their own rights to favor society, paying for healthcare, etc.?
On a true crime video: "This PDF-File game ended himself after he was caught SAing this individual.... Sorry, YouTube forces me to talk like that or I might get demonetized." Flagged for discussing suicide.
On PragerU: "The Transgender Agenda is full of rapists and freaks who will sexually assault your children, they are pedophiles who must be dealt with via final solution!" Completely fucking acceptable!
That statement about murdering a hitchhiker has no edits, but simply saying that slavery is bad took three edits. Says everything you need to know about Penis Prager.
The view farming in shorts makes it even harder to avoid as well. Sure, I can block the JRE channel, for example, but that doesn’t stop me from getting JRE clips from probably day-old accounts which just have some shitty music thrown on top. If you can somehow block those channels, there’s new ones the next day, ad infinitum.
It’s too bad you can’t just disable the tab entirely, I feel like I get sucked in more than I should. I’ve tried browser extensions on mobile which remove the tab, but I haven’t had much luck with PiPing videos from the mobile website, so I can’t fully stop the app.
Commenting on stuff definitely strengthens it, but I wouldn't know if a shadow ban changes that. I don't think there's much difference if you are shadowbanned or not, you're still interacting with the content.
I think the explanation might be even simpler - right wing content is the lowest common denominator, and mindlessly watching every recommended short drives you downward in quality.
yeah i created a new youtube account in a container once and just watched all the popular/drama suggestions. that account turned into a shitstorm immediately
these days i curate my youtube accounts making liberal use of Not interested/Do not recommend channel/Editing my history and even test watching in a container before watching it on my curated account
this is just how "the algorithm" works. shovel more of what you watch in your face.
the fact that they initially will give you right-wing, conspiracy-fueled, populist trash right off the bat is the concern
Man, that seems like a lot of work just to preserve a shitty algorithm that clearly isn't working for you... Just get a third-party app and watch without logging in.
I was gonna say this. There's very little liberal or left-leaning media being made, and what there is is mostly made for a female or LGBTQ audience. Not saying that men cannot watch those, but there's not a lot of "testosterone"-infused content with a liberal leaning (one of the reasons Trump won was this), so by sheer volume you're bound to see more right-leaning content, especially if you are a cisgender male.
Been considering creating content myself to at least stem the tide a little.
I think some of it is that liberal media is more artsy and creative, which is more difficult to just pump out. Creation is a lot more difficult than destruction.
Really? As someone who dislikes both mainstream extremes (I consider myself libertarian), I see a lot more left-leaning content than right-leaning content. I wouldn't be surprised if >75% of the content I watch comes from a left-leaning creator; not because I seek it out, but because young people into tech tend to lean left, and I'm into tech.
Do these companies put their fingers on the scale? Almost certainly
But it’s exactly what he said that brought us here. They have not particularly given a shit about politics (aside from no taxes and let me do whatever I want all the time). However, the algorithms will consistently reward engagement. Engagement doesn’t care about “good” or “bad”; it just cares about eyes on it, clicks, comments. And who wins that? Controversial bullshit. Joe Rogan getting Elon to smoke weed. Someone talking about trans people playing sports. Etc.
This is a natural extension of human behavior. Human behavior occurs because it serves a function: I do X to achieve reinforcement, whether that’s attention, access to something, escape, or automatic (sensory) reinforcement.
Attention maintained behaviors are tricky because people are shitty at removing attention and attention is a powerful reinforcer. You tell everyone involved “this person feeds off of your attention, ignore them”. Everyone agrees. The problematic person pulls their bullshit and then someone goes “stop it”. They call it negative reinforcement (this is not negative reinforcement. it’s probably positive reinforcement. It’s maybe positive punishment, arguably, because it’s questionable how aversive it is).
You get people to finally shut up and they still make eye contact, or non verbal gestures, or whatever. Attention is attention is attention. The problematic person continues to be reinforced and the behavior stays. You finally get everyone to truly ignore it and then someone new enters the mix who doesn’t get what’s going on.
This is the complexity behind all of this. This is the complexity behind “don’t feed the trolls”. You can teach every single person on Lemmy or Reddit or wherever to simply block a malicious user, but tomorrow a dozen or more new and naive people will register who will fuck it all up.
The complexity behind the algorithms is similar. The algorithms aren’t people but they work in a similar way. If bad behavior is given attention the content is weighted and given more importance. The more we, as a society, can’t resist commenting, clicking, and sharing trump, rogan, peterson, transphobic, misogynist, racist, homophobic, etc content the more the algorithms will weight this as “meaningful”
This of course doesn’t mean these companies are without fault. This is where content moderation comes into play. This is where the many studies finding that social media leads to higher irritability, more passive-aggressive behavior, and lower empathy could have led us to regulate these monsters into doing something to protect their users against the negative effects of their products.
If we survive and move forward, in 100 years social media will likely be seen the way we look at tobacco now: an absolutely dangerous thing that it was absurd to allow to exist in a completely unregulated state with zero transparency as to its inner workings.
With Milo (Miniminuteman) in the thumbnail, I thought the video was going to insinuate that his content was part of the alt-right stuff. Was confused and terrified. Happily, that was not the case.
I'll get downvoted for this, with no explanation, because it's happened here and on reddit.
I'm a liberal gun nut. Most of my limited YouTube time is watching gun-related news and such. You would think I'd be overrun with right-wing bullshit, but I am not. I have no idea why this is. Can anyone explain? Maybe because I stick to the non-political, mainstream guntubers?
The only thing I've seen start to push me to the right was watching survival videos. Not some "dems gonna kill us all" bullshit, simply normal, factual stuff about how to survive without society. That got weird fast.
I've noticed most firearms channels steer well clear of politics, unless it's directly related to the topic at hand, I think partly to appeal to an international audience.
I do think the algorithm puts firearms and politics into very separate categories, someone watching Forgotten Weapons probably isn't going to be interested in political content.
Yeah, I don't think I've ever seen alt-right nonsense without actively looking for it. Occasionally I'll get recommended some Joe Rogan or Ben Shapiro nonsense, but that's about it.
I consider myself libertarian and a lot of my watch time is on Mental Outlaw (cyber security and dark web stuff), Reason (love Remy and Andrew Heaton videos), and John Stossel, but other than that, I largely avoid political channels. I watch a fair amount of gun content as well.
If I get recommended political stuff, it's usually pretty mainstream news entertainment, like CNN or Fox News. Even the crypto nonsense is pretty rare, even though I'm pretty crypto-positive (not interested in speculation though, only use as a currency and technical details).
If you're seeing alt-right crap, it's probably because you've watched a lot of other alt-right crap.
I have had the opposite experience. I watch a few left-leaning commentary channels. Sam Seder, my boy Jesse Dollomore. If I watch a single video about guns (with no apparent ideological divide), within a single refresh I'm getting Shapiro and Jordan Peterson videos. I'm in a red Western state. My subscriptions are mostly mental health, tech, and woodworking. I have to delete history if I stray even a little bit.
My watch history would peg me as NOT a Republican. YouTube's shorts feed will serve me:
excerpt from youtuber's longer video
tiktok repost from like, the truck astrology guy or "rate yer hack, here we go" guy, etc
Artificial voice reading something scraped from Reddit with Sewer Jump or Minecraft playing in the background
Chris Boden
Clip from The West Wing
Clip from Top Gear or Jeremy Clarkson's Farm
"And that's why the Bible tells us that Jesus wants you to hate filthy fucking liberals."
"Do not recommend channel." "The downvote button doesn't even seem to be a button anymore but I clicked it anyway." "Report video for misinformation and/or supporting terrorism." But the algorithm keeps churning it up.
Instagram is probably notably worse. I have a very established account that should be very anti that sort of thing, and it keeps serving up idiotic guru garbage.
TikTok is by far the best in this respect, or at least it was before recent weeks.
A couple of years ago, I started two other Instagram accounts besides my personal one. I needed to organize and have more control of what content I want to see at times I choose. One was mostly for combat sports, other sports, and fitness. The second one was just food.
The first one, right off the bat, showed me girls with OnlyFans accounts in the discovery page. Then after a few days, they began showing me right-wing content and alpha-male garbage.
The second one, the food account, showed alternative holistic solutions: stuff like 10 different accounts of people suggesting I consume raw milk. Then they started sending me a mix of people who eat only meat, and vegans.
It's really wild what these companies show you to complete your profile.
I saw a TikTok video about how Instagram starts the redpill/incel stuff early for young people; then, when they become failures in life, it pushes the guru stuff for "guidance".
The EU and even China have at least made an attempt at holding these companies accountable for the algorithm, but the US and Canadian governments just sat there and did nothing.
Agreed 100%. Whenever I'm in India, I get Hindu nationalist content A LOT. I briefly attempted the dislike/don't recommend thing, but nope! I was getting absolutely spammed with stuff like this regardless. I just disabled shorts after that.
If I see any alt-right content, I immediately block the account and report it. I don't see any now. I go to YouTube for entertainment only. I don't want that trash propaganda.
Same. I watched one Rogan video in like 2019, and it was like opening a floodgate. Almost immediately, almost every other recommendation was some right-wing personality's opinion about "cancel culture" or "political correctness." It eventually calmed down once I started blocking those channels and anything that looked like it might lead to that kind of content. I can only imagine what would pop up now.
Don't get me wrong, I prefer if platforms don't take a political stance at all. That's the reason why I use platforms like Lemmy.
I am simply just pointing out that conservative ideologies have been oppressed online and in the media for the greater part of a decade. Funny to see how the left are losing their minds now that they get a little taste of it themselves.
I noticed my feed almost immediately changed after Trump was elected. I didn't change my viewing habits. I'm positive YouTube tweaked the algorithm to lean more right.
Crazy stuff. So not only does YouTube make you generally dumber, it's now pushing the audience toward more conservative viewpoints because of the "emotional engagement" that keeps 'em watching. And YouTube probably sells more premium subscriptions that way. fuck google!
Just scrolling through shorts on a given day, I'm usually recommended at least one short by someone infamously hostile to the LGBTQIA+ community. I get that it could be from people with my interests hate-watching, but I don't want to be shown any of it. (Nearly all of my YouTube subscriptions are to LGBTQIA+ creators. I'm not subscribed to anyone who has ever even mentioned support for right leaning policies on anything.)
I didn't watch the video, but it's YT shorts; you just swipe like TikTok. The few ways to curate the algorithm are to swipe away quickly, click the "not interested" button, downvote, or delete watched shorts from your history. If you don't interact with any of this and watch the full length of the video, the algorithm is gonna assume you like this kind of content. They'll also introduce you to content you've never watched before to gauge your interest; a lot of the time it's not even related to what you currently watch, and if you don't do any curation, they're gonna feed you that exact type for a while. I don't know exactly how they manage the curation, but that's the gist of it from my experience. My feed has 0 politics, mostly cats. I control the feed strictly, so I get what I demand.
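As a rough illustration of the curation loop described above (all weights and names are invented; the real system is opaque, which is rather the point):

```python
# Toy model of the feed-curation signals: watching to the end boosts a
# topic, a fast swipe or "not interested" cuts it. Weights are made up.
topic_scores: dict[str, float] = {}

def update(topic: str, watch_fraction: float, not_interested: bool = False) -> None:
    score = topic_scores.get(topic, 1.0)
    if not_interested:
        score *= 0.3                   # explicit signal: strong downweight
    elif watch_fraction < 0.2:
        score *= 0.8                   # quick swipe-away: mild downweight
    else:
        score *= 1.0 + watch_fraction  # watched most of it: big boost
    topic_scores[topic] = score

update("politics", watch_fraction=1.0)  # watched to the end, even in disgust
update("politics", watch_fraction=0.1)  # one later swipe barely undoes it
update("cats", watch_fraction=1.0)
print(topic_scores)  # politics: 1.0 * 2.0 * 0.8 = 1.6; cats: 2.0
```

A single full watch, even a hate-watch, outweighs several later swipe-aways, which is why the kind of strict curation this commenter describes takes constant effort.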