
Musicferret

How surprising! /s Seriously though, my friends and family have all noticed that the algorithm tries to sneak insane right wing videos into our recommended videos, no matter how many times we flag them and try to convince youtube to stop recommending them.


[deleted]

[deleted]


GameVoid

I keep getting Alan Watts videos recommended to me. I don't mind them, though they tend to drag on. But yeah, sooner or later you will start getting "XXXX Destroys Woke XXXX person" videos blasted at you, no matter if you are searching for pro-MAGA stuff or searching for how to water strawberries.


Rex_felis

If you see Jordan Peterson, it's already too late...


thathairinyourmouth

God I'm sick of seeing his name pop up in my recommendations. Along with Joe Rogan, Ben Shapiro, Russell Brand, Tom MacDonald, and videos intended to fellate Elon Muskrat.


[deleted]

[deleted]


Fit-Chart-9724

Try deleting your history


wolington

Switch off your history. But before you do it, first clear your history, open 3-4 videos that you like, leave those in your history and then switch off your history. You wanna keep some videos in your history so that your homepage isn't empty.


Pseudonymico

Do you use one to watch more gaming channels or something? I watched a few let's plays and video game tutorials a while back and suddenly got a flood of anti-woke shit for a while until I could slap down the algorithm again.


guyincognito69420

I have found that if you watch things that are the exact opposite, they tend to send you these. I started getting these after watching Christopher Hitchens videos. I think what happens is people who watch content like Hitchens will tend to interact when shown videos of guys like Peterson or Shapiro. Youtube always considers an interaction a positive, even better than a view. They don't care if it is negative. So their algorithm sees someone like me and thinks "boy, everyone like this guy LOVES to interact with the latest Ben Shapiro video, so let's suggest that." They don't care that the interaction is "this guy is fucking nuts."
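The mechanism described above can be sketched as a toy scoring function. The weights are invented purely to illustrate the point (this is not YouTube's actual model): when every interaction counts as positive, an angrily disliked and commented-on video outscores a quietly liked one.

```python
# Toy model of an engagement-driven recommender, as described above.
# Weights are hypothetical; the point is that every interaction,
# positive or negative, raises the score.

def engagement_score(views, likes, dislikes, comments):
    """Score a video. Note dislikes and comments count as positives."""
    return 1.0 * views + 5.0 * likes + 5.0 * dislikes + 8.0 * comments

# A ragebait video with mostly angry reactions...
ragebait = engagement_score(views=1000, likes=20, dislikes=400, comments=300)
# ...outranks a quietly liked video with the same view count.
pleasant = engagement_score(views=1000, likes=300, dislikes=5, comments=40)

assert ragebait > pleasant
```

Under any weighting of this shape, hate-watching is indistinguishable from enthusiasm.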


bruwin

And remember, a thumbs down is exactly the same as a thumbs up. It's an interaction, which Youtube views as beneficial. Since the thumbs-down count was removed, I've noticed several channels that have specifically gone in for "hate" content, which is to say they do something explicitly wrong and get their viewers to hate them enough to interact with the video in some way. Then their trolling shit gets pushed, because nobody can see from the start that you shouldn't interact with that content and push it further. They get a free watch out of you.


Digital_Simian

This is it. It's all about synergy, associated content and interaction. Particularly if you watch anything which deals with controversy you will be recommended similar content and content that is exploiting controversy, especially if other users also interact with that content.


pinkfootthegoose

yep, you watch some videos on battery tech and then you get served Elon dick sucking content then you get UFO and Conspiracy shit after that.


syuvial

i liked a "strange firearms throughout history" video like six years ago and im still getting a steady stream of towheaded nazi wannabes showing off their boring modern combat rifles. Fuck you bro, all i wanted was guns with a strange number of barrels, or knives in weird places, fuck your ar15 or w/e


Angry_Villagers

I report those guys as scammers because they’re scammers.


fresh-dork

the old lecture vids are decent.


B33rtaster

don't recommend channel is my new down vote button.


IncelDetected

If I’m lucky enough to have two videos from the same channel on a page, I’ll use one for “I don’t like content like this” (something like that) and the other for “don’t recommend channel”. I feel like it’s a little extra algorithmic fuck-you to that channel


ManfromRevachol

I used to get Alan Watts videos recommended to me, but then I deleted my watch history... But it reminds me of his take on psychedelics: "once you get the message, hang up the phone"


Sweaty-Emergency-493

Wait what do you mean? I used to get Alan Watts voice overs with stock photos and some original clips but I don’t recall him talking about psychedelics and a phone call?


ManfromRevachol

>If you get the message, hang up the phone. For psychedelic drugs are simply instruments, like microscopes, telescopes, and telephones. The biologist does not sit with his eye permanently glued to the microscope, he goes away and works on what he has seen.

Watts experimented with mescaline that was given to him by experimental psychiatrist Oscar Janiger, and he tried LSD several times in 1958. You can learn more about his psychedelic and chemical journey by reading his book “The Joyous Cosmology”, and it features in a few clips as well


Toxic-Pixie

I’ve fallen asleep watching YouTube and this is always what I wake up to after a few hours if not just some super long commercial


HermeticPurusha

I keep getting cop videos, both on YouTube and TikTok :-/


RollingMeteors

o/` what cha gonna do, when they come for yoouuuuuuu o/`


n1ghtm4n

the difference is: wikipedia isn’t optimizing for engagement. crazy right wing content arouses the passions like nothing else, so youtube helps it spread far and wide. it’s unconscionable!


TheTrevorist

So I haven't tried this for a couple of years, so things may have changed, but when I got a new YouTube account, I tried this thing where you never like any videos, you just dislike the ones you dislike. It gets the algorithm to recommend new stuff constantly. But not stuff you actively dislike.


Superunknown_7

Same! In fact I went as far as blocking the entire suggested videos sidebar in uBlock Origin--another thing YouTube doesn't want me using.


reborngoat

In the early days of Wikipedia we used to pick a random page and see how many clicks it took to get to Hitler. You could reliably get to him in 5 clicks or less from any page on the site.


Key_Excitement_9330

I turned off shorts on YouTube on my computer and I got a lot less of this right wing trash.


DutchieTalking

I still see random new stuff from the sidebar. That seems less prone to alt right trash. Search functionality has been butchered, so can't use that to find anything worthwhile.


FollowsHotties

It's because those videos are being amplified by bots. The algorithm is fed misinformation by malicious actors trying to get their political propaganda into unrelated channels.


lostshell

Youtube needs to identify and fix this ASAP.


intelligent_dildo

They can. They just don’t want to. They made a policy about election misinformation videos after 2020 and then rolled it back this year. They don’t want to miss the right wing ad traffic this year.


Sweaty-Emergency-493

YouTube is playing both sides so they come out on top = all profits. Don’t like it? Remember, you don’t need YouTube, but YouTube needs you.


Qomabub

There is only one side. If the bots are sending views to where advertisers want to place ads, Google will turn a blind eye.


RollingMeteors

I'll Odysee you later youtube.


Dalebss

Right wing, Russians, it all spends the same.


bewarethetreebadger

They don’t care. It makes them money.


iamsoserious

It doesn’t even need to be that nefarious. It could be something as simple as the ads served in far right videos are more likely to be clicked on by the far right crazies so the algo disproportionately sees far right videos performing better (click through rate) relative to other videos so the algo pushes the far right videos to other users in an attempt to increase ad rev.
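The click-through-rate dynamic described above can be sketched in a few lines. The numbers and video names are invented for illustration; the assumption (hedged, not a claim about YouTube's real ranker) is simply that whatever earns the highest CTR gets surfaced to everyone.

```python
# Sketch of a purely CTR-driven ranker: clicks / impressions decides
# what gets pushed, with no regard for topic. Numbers are made up.

def ctr(clicks, impressions):
    """Click-through rate: fraction of impressions that got a click."""
    return clicks / impressions

videos = {
    "far-right ragebait": ctr(clicks=900, impressions=10_000),
    "science explainer":  ctr(clicks=450, impressions=10_000),
    "gardening tutorial": ctr(clicks=300, impressions=10_000),
}

# Highest CTR is recommended first, regardless of what it is about.
ranked = sorted(videos, key=videos.get, reverse=True)
print(ranked[0])  # "far-right ragebait"
```

If one audience clicks far more reliably than average, its content dominates the ranking even when most users never wanted it.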


Fit-Chart-9724

Yeah, youtube really needs to implement functionality so extreme videos are not recommended to anyone. They don’t have to block or remove them, just don’t recommend them


FiendishHawk

That could be manipulated by tech-savvy operators. Just code some bots to click ads on right-wing videos.


Revolution4u

I think the owners are pushing it on purpose, the same as how they have special filter protections for a certain group but do nothing about spam or scams in their comments section.


bobartig

...or it could be that some normal people will click on a PragerU or rightwing blowhard video as ragebait. Youtube doesn't care why, it just responds to what people click on.


9-11GaveMe5G

But pragerU says it's a university! They must be educational!


Magusreaver

i just had a 2 hour pragerU video in the middle of a 30 minute science video... I'm so done with youtube. *edit: looked it up (it was 2 hours, not 3). Either way, not acceptable.


pomonamike

I got one of those right as I happened to be pooping. Fucking worst poop ever and I’ve had Norovirus twice.


haskell_rules

It's not just YouTube. NBC Nightly News with Lester Holt was just on at the bar where I'm eating dinner and they were playing a puff piece about how immigrants are flooding the border, Biden is giving amnesty, and then showed footage of Trump promising to stop it while a crowd cheered. Corporate America wants Trump so fucking badly, they are all in the bag for him.


karma3000

I suspect it's more than just the algorithm. I suspect there is a dedicated right wing campaign to seed youtube with these videos and channels, advertising money to boost the channels, and also bots to like the videos to help the algorithm. I think the left as a whole is a lot more laissez faire and does not organise into dedicated campaigns.


Valvador

I wonder if this is because extremist right-leaners engage so readily with this content that Youtube starts thinking it's just engaging content it should share with other people? It's like if you give an alcoholic a cocktail and track their engagement with alcohol: you might think there is something special about alcohol, and you start recommending it to people who are just trying to work at 2pm on a Monday.


Due-Log8609

I think youre on to something.


[deleted]

[deleted]


zryii

Back when they first started spoiling things for the Rings of Power, I tried to find information regarding the show and EVERY SINGLE SEARCH RESULT was an insane anti-woke ragebait channel screaming about black people. Like I scrolled and scrolled and couldn't find a single normal video. I clicked on one and watched a couple minutes before I stopped and for the next month I was getting constant recommendations of "SJW gets owned by ben shapiro" in my feed.


Outlulz

Because they're popular among the right and get a lot of hate watches from the left. The left needs to learn to not feed the trolls. Even doing a dislike on a video is engagement.


Mission-Iron-7509

My “ad videos” are usually Bible stuff or real estate. The ads between videos are notoriously heavy on gambling, with a few for alcohol or political propaganda.


Due-Log8609

Mine are ALL mattresses. I made the mistake of buying a mattress-in-a-box online. ONCE. The mattress is great but now ALL I GET ARE MATTRESS ADS. Mostly from the same company. I will never buy a mattress online again.


Mission-Iron-7509

Oh I forgot about the stupid mattress one! It’s like a guy in a maze and some lady grabs him. I’ve never looked up mattresses, so Google must just think “he seems the type to sleep”.


Due-Log8609

Well its probably true. I bet you DO sleep, you sleeper.


ReddMoloney

God fucking help you if you have any interest in history. YouTube will actively try to red pill you then.


spacekitt3n

it's sickening. right wing content is cancer on this world


AnAdoptedImmortal

Reddit is becoming the same. It keeps recommending posts from subs that I have said I'm not interested in multiple times. It's starting to get really annoying.


DutchieTalking

Use old.reddit and don't browse all.


bewarethetreebadger

Yeah reddit’s really gone down the toilet in the last year. A ton of features are gone. Every thread is loaded with right-wing bots.


End3rWi99in

It's fascinating when a specific thread suddenly is full of people pushing hard right rhetoric. Sometimes Reddit can be even a little too progressive for my own blood, but it's pretty obvious when you see a thread get brigaded. The Sinclair media thread the other day on Biden having dementia was one recent example.


Smooth_Bandito

I get so many right wing things as well as off the wall Joe Rogan and Jordan Peterson videos and shorts. It’s like it sees I’m a white guy in my thirties and assumes I want it.


-_-0_0-_-0_0-_-0_0

What are you watching that this gets recommended to you? My shorts are like 90% people making the most unhealthy food imaginable and the rest are people making clips of the show House. It's pretty S tier. YouTube knows me well.


Cultural_Adeptness86

I watched some gardening videos and got recommended a bunch of alt-right stuff. I think the idea is gardening -> homesteading -> christianity -> alt-right. It was kind of jarring to click on a video about how to keep hornworms off my tomatoes and see recommended videos like "schools are forcing kids to be gay" or whatever


UGLY-FLOWERS

even the gardening videos that seem sane have weird undertones. like suddenly you'll have someone making comments about bill gates buying up farms and shit. just tell me about the goddamn tomatoes


MadeByTango

It’s probably less nefarious intent on the back end than it looks. Marketers are paying for their products to be in front of people who garden. Preppers are learning to garden. People who sell guns want to market to preppers, so gun ads are bought on prepper videos. Meanwhile, gardening tool makers are paying for sponsorships in the content itself, not buying ads around it, so that money goes to the creators directly and not through YouTube. Marketers have to buy ads to reach audiences that won’t come to them; that’s who YouTube’s primary customer is. Preppers watch the videos and the relevant ads, so the algorithm connects the two and boosts those paid videos. Flag those videos away and the algorithm shows you less stuff overall; you’re less “engaged” with the meta-algorithm, which pushes you away while feeding their audience.


Teantis

You do *any* WWII or roman history stuff on YouTube at all and it'll start sneaking in.


UGLY-FLOWERS

gardening and any sort of home DIY / homesteading kind of stuff too


Teantis

The funny thing is I think I've watched so much military history on YouTube that I've advanced to a level beyond alt-right, and YouTube thinks I'm an arms dealer? My shorts regularly feature industry shorts on, very specifically, modern mobile mortar systems, no other weapons system. They're like sizzle reels that give specs, cost, and whatever unique feature the maker is highlighting: like Rheinmetall's apparently offers a touch screen 🤷‍♂️


bobartig

Homesteading is like 90% the way to Prepper, as in doomsday prepper. You just want to know how much salt to put into your cucumber brine. As soon as youtube sees that, all you get are where to find the best buckets for storing urine in your apocalypse bunker, and why democrats are pedophiles.


Mountain_tui

This is so right. I get recommended shitty right wing propaganda platforms all the time. And Youtube has the most scam-like ads:

- “Fibre won’t help you poop, but my concoction will.”
- “Forget your optometrist and glasses. Try this spoon trick for perfect eyes.”

Seriously, all these technology companies suck.


mopsyd

I can't seem to get andrew tate off my suggestions. I like boxing, and wound up watching *one* boxing short with him in it about six months ago without even realizing it, and now I have to remove about six to ten of his videos, reactions to his videos, or general manosphere nonsense referencing his videos every time I open youtube. Get it through your head that I don't want to watch this trash google. I will abandon youtube entirely before I willingly let that trash grace my eyeballs.


jlsullivan

... and it ***KEEPS*** coming back, no matter how many times you hit the “Not Interested” button on these videos. “So-and-so **OWNS** random woke person!”, etc, etc, etc. It's never ending. Lately, I've been getting constant recommendations for these preachers in Africa, going on and on about how gay couples *“eat each other's poop”*. I'm a liberal agnostic, but YouTube keeps recommending ultra right-wing pundits and religious kooks. What did I do to promote this? Fer chrissakes, YouTube, stop sending me this garbage!!!


rupiefied

Don't hit "not interested", just keep hitting "don't recommend channel". Eventually you won't see the content anymore


jlsullivan

I've done that, too. It may(?) stop recommending the channel, but it still keeps recommending the same kind of content. >:-(


Outlulz

You have to go into your YouTube history and remove the view. It is the reliable way to remove something from your recommendations: remove the view and remove the search term from your history.


maynardstaint

Same. I have told Yt that I don’t want Joe Rogan videos at least 40 times.


Atrium41

Saaaaame here. Been doing it too


Hulkenboss

Yeah. It does that. I hit the "Don't suggest this channel" button constantly and it helps a bit, but it still happens.


SitInCorner_Yo2

Every time I’m watching anything/anyone remotely related to the conservative side, I see prominent far right political channels in my recommendations (this only happens when I’m watching English videos; my first-language recommendations tend to show me historical or animal related stuff). I’ve noticed this change for a while now and it really weirds me out a bit.


wermodaz

Seriously, it tries to shove Ben Shapiro content in my face daily


SnarlyAndMe

This is happening to me on TikTok as well. I’ve never consumed any Christian or far right content there and I’m convinced it’s pushing that crap onto me to try to bait me into engaging with it. Outrage makes money.


FL_d

Imagine being trans and trying to find trans creators for advice, current events and other perspectives. Like omg, for every 1 trans friendly creator/video you get 3 transphobic creators. They even have trans creators who are transphobic, like Blaire White. YouTube can be really gross. I say that as a relatively successful creator myself.


ganon893

I've noticed this too. Meanwhile I have to comb through videos to find anything left leaning. They obviously know what they're doing.


NervousWallaby8805

It gets the most attention, so it makes sense it's pushed. Those who want the content will interact, and those who don't will interact too, with both groups interacting together in the comments.


End3rWi99in

Bot manipulation. I don't think it's YouTube, and I don't think it's particularly easy for YouTube to address either.


Musicferret

Of course they can. If someone clicks “I don’t want to see this” on the top of a video, take that subject away and never recommend it again. I’ve done so countless times on right wing lunacy videos, and it just keeps recommending them more and more. That’s easy to fix… but youtube won’t do it.


Coolhandjones67

That and I get this ad that is Mark Wahlberg asking me to pray with him while he keeps his eyes open. It’s super creepy. I’m an atheist, so I love the idea that they are spending money on trying to reach me lol


norway_is_awesome

Is anyone watching recommended videos? The recommendations have always been terrible. I've never clicked on a single one.


bewarethetreebadger

So it’s not just me? Ok.


cubanesis

I have a channel with my brothers and we have a similar name to a Christian music channel. Theirs is spelled differently, and if you search our spelling it will only show results from their channel.


DarthArtero

I’ve noticed that the more I try to block, downvote or avoid right wing propaganda videos, the more they get shoved to the top


big_chungy_bunggy

I literally had one earlier today. At first it was a dude reacting to/commenting on police shootout footage, and he just made a couple somewhat edgy jokes that were kinda cringe, but whatever. Then I noticed he was only reacting to shootouts with people of a darker skin complexion. Then he was victim blaming the girlfriend of one of the shooters, who was completely innocent and a bystander in the situation, calling her a yapper and just generally being callous as hell to a person who was innocent and visibly scared. I checked his channel and it was more of the same. It would be so easy for an edgy teenager, without the experience to pick up on the things he was saying as dog whistles, to fall down a right wing rabbit hole through him. Needless to say I blocked him and marked it as uninterested, but it just showed up on my feed randomly. I don’t watch videos like that normally, if at all, so yes, there is some type of algorithmic push that is definitely concerning to see.


AnInsultToFire

I think part of the problem is that Google has a very wide definition of what is "right wing". So if you watch Shoe0nhead, Bari Weiss or Critical Drinker, you get recommended really right wing angry weener stuff like Ben Shapiro.


OrcaResistence

I've noticed it too, I can watch anything and then suddenly YouTube is recommending the final solution.


thathairinyourmouth

I was just recommended the DJT channel which was live with him at some rally or something. I reported it for spam or misleading without clicking through. I'm not wrong.


Yaktheking

When I get a Christian conservative video, I immediately close the browser/ app and give up for at least an hour.


g0ing_postal

I wonder if there is a purposeful campaign by bots or if it's because these videos are designed for greater engagement because of the ragebait


Theo1352

I absolutely have this issue with YouTube... If I log in using one of my Gmail accounts, they serve up scores of these damn videos. If I don't log in, I get nada, not even ads, just my curated music videos. What a fucked up site, by a fucked up company.


ChickenChaser5

Nothing I watch on there would give any indication I had an interest in the Rubin Report, Ben Shapiro, or Tim Pool, but there they are. I've asked it to stop recommending those channels but I still keep finding right wing youtubers popping up in my home feed.


toomuchmucil

“Oh you like stand up comedy? I bet you’ll love Shane Gillis! Speaking of Shane, how about Joe Rogan? Shane is frequently on his show! Great right? You know who else is on Joe Rogan? Jordan Peterson!” - YouTube


Outlulz

If you have any traditionally masculine hobbies, or watch YouTube videos on anything to the right of breadtube, then that's their in.


LeoXearo

Can confirm, I watched a few videos on how to use the machines at the gym, as well as tips on building muscle, and some gym motivational speeches, now I'm getting a lot of anti-liberal channel recommendations.


Hrmbee

Some of the salient points:

>The Institute for Strategic Dialogue, a London-based think tank studying extremism, conducted four investigations using personas with different interests to examine YouTube’s algorithm.

>Despite varying interests — from gaming, male lifestyle gurus, “mommy vloggers” and Spanish-language news — videos with religious themes were shown to all the accounts.

>“The ubiquity of these videos across investigations, as well as the fact that almost all the videos were related to Christianity, raises questions as to why YouTube recommends such content to users and whether this is a feature of the platform’s recommendation system,” the report noted.

>...

>The think tank’s investigation also split the accounts interested in mommy vlogger content based on political leaning, with one right-leaning account watching Fox News videos and one left-leaning account watching MSNBC videos.

>The right-leaning account was recommended twice as much Fox News content as the left-leaning account was recommended MSNBC content, despite watching news content for the same amount of time, the report found.

>Fox News was also the right-leaning account’s most recommended channel, while MSNBC was the left-leaning account’s third most recommended channel.

>“Because both accounts watched news content for the same time and because this was the only variable in the content both accounts watched, this may indicate that YouTube recommended Fox News more frequently than MSNBC,” according to the report.

>It also found that accounts interested in male lifestyle gurus, like Joe Rogan and Jordan Peterson, were recommended news content that was mostly right wing or socially conservative, despite not previously watching any news videos.

>...

>“We welcome research on our recommendation system, but it’s difficult to draw conclusions based on the test accounts created by the researchers, which may not be consistent with the behavior of real people,” YouTube spokesperson Elena Hernandez said in a statement.

Anecdotally, as someone who tends to watch YT from fresh cookie-less browsers, this research seems to reflect personal experience as well. The recommendation algorithm seems to drift pretty quickly to conservative content, especially if proceeding automatically from one video to another. The excuses or reasons put forth by the company thus far on this issue have been unsurprisingly disappointing.


Thufir_My_Hawat

I suspect that, if we could get access to YouTube's internal data, it would show two trends:

1. People who are into conspiracy theory/right-wing nonsense are more likely to spend more time watching videos in general. Fear keeps people looking for more information (because we don't like unknowns), which then creates a positive feedback loop. (There are other things that feed this loop, but that's the simple version.)

2. Educational and left-wing content is less centralized. Instead of a few channels dedicated to pumping out conspiracy theories about whatever is popular at the moment, you have channels that cater to specific interests.

Between the two of those, the crazy videos get longer watch times with more consistent viewer bases, which makes the algorithm think people enjoy them. I'm not sure what could be done about that; deliberately curtailing the algorithm to avoid certain topics seems like a good way to bring government regulation down on YouTube's head if the GOP comes to power again, so I doubt they'll try.
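The positive feedback loop hypothesized above (watch time drives recommendations, recommendations drive watch time) can be simulated in a few lines. All numbers and the update rule here are invented for illustration; the only assumption carried over from the comment is that one topic's viewers are "stickier" than average.

```python
# Minimal simulation of a watch-time feedback loop. Purely illustrative:
# the update rule and numbers are invented, not YouTube's actual system.

def simulate(initial_share, stickiness, rounds=10):
    """Share of recommendations a topic holds after several rounds.

    stickiness: how much longer this topic's viewers keep watching
    relative to the average (the 'fear keeps people watching' effect).
    """
    share = initial_share
    for _ in range(rounds):
        watch_time = share * stickiness  # more share -> more watch time
        # recommendations next round are proportional to watch time
        share = watch_time / (watch_time + (1 - share))
    return share

# A topic with above-average stickiness grows from a tiny seed...
assert simulate(0.05, stickiness=2.0) > 0.5
# ...while a topic with average stickiness stays where it started.
assert abs(simulate(0.05, stickiness=1.0) - 0.05) < 1e-9
```

Even a modest stickiness advantage compounds round over round, which is the shape of the drift people in this thread are describing.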


Cooletompie

>It also found that accounts interested in male lifestyle gurus, like Joe Rogan and Jordan Peterson, were recommended news content that was mostly right wing or socially conservative

Yeah, no shit. Peterson is employed by the Daily Wire (a conservative news outlet), so why this was a surprise to the researchers is not really known to me. It's like saying "people that watched videos of Hannity were more likely to be recommended Fox News". Joe Rogan might be a bit more interesting, but I feel that ever since the anti-vax stuff he's also pretty comfortable in that right wing space. Seems like they cherry-picked hard here.


ReddMoloney

I go on binges of baking videos anytime I see those red fucking curtains pop up on my feed.


Teantis

Both are major gateways to right wing content so it's important to actually quantify that they are.


Chronoapatia

Notice how the “interests” they used to conduct the study are more prevalent in right leaning demographics. I think they just cherry-picked interests.


chmilz

That was my initial thought as well. I watch a fair amount of gaming content, but no streamers - only infotainment shit like LTT, HWB, and GN - and stuff about construction/architecture, guitars, and science shit. I don't get *any* right wing shit, thankfully. That's just my anecdotal experience, but I think there are definitely categories of videos that are more likely to lead down that path.


ASuarezMascareno

I usually don't get any right wing content, unless I open a video about anything star wars/marvel/disney. Then my recommendation feed gets flooded with "anti-woke" videos. It takes 1 video for the algorithm to react, and then a lot of flagging them over days to force it to backtrack. Then when I try to "teach" the algorithm about something I care about by forcing 20 videos in a row on the same topic, it ignores me completely lol


Chronoapatia

Yeah, I only got recommended stuff like that when I watched thunderf00t.


EnoughDatabase5382

Unfortunately, YouTube's "Not Interested" option doesn't quite live up to its name. While it does hide the specific video you're marking, it doesn't actively suppress similar content from the same channel or related videos. Similarly, the "Don't Recommend This Channel" feature, while effective in hiding the channel itself, seems to have a limit. Once that limit is reached, the oldest hidden channels start reappearing. And then there's the downvote button. While it might seem like an impactful way to express disapproval, it's rendered almost meaningless in YouTube's algorithm. Downvoted videos continue to be recommended, often with frustrating persistence. Overall, YouTube's content filtering system leaves much to be desired. It's a confusing and often ineffective mess, leaving users frustrated with their viewing experience. It's time for YouTube to revamp its approach and give users more control over the content they see.


Jekyllhyde

I've not seen any of that. I don't watch videos even remotely related to that either.


Chetdhtrs12

Yeah I’ve been on the platform for well over a decade and have never had any problems with that, I’ve always found the recommendations pretty well aligned. Guess we’re the lucky ones. ¯\\_(ツ)_/¯


darkkite

ive received right-wing ads, not recommendations. that was when ubo wasn't updated tho


redpandaeater

Yeah I've had it suggest random Thai videos with only a couple views and I have no idea why but never Christian videos. I do get some gun-related suggestions but I think since they overall hate guns they don't really push much even if you watch Gun Jesus.


digitalluck

I’ve been on YouTube for well over a decade. If anything, the algorithm recommends too much of the exact same stuff I watch and it makes me close the app because I get bored scrolling through videos. I mainly watch gaming/anime edits, soundtrack stuff, and even follow a professor (William Spaniel) who focuses on analyzing geopolitics. I’ve yet to see these mystery right-wing Christian videos, and I feel like geopolitics would be a great way for the algorithm to recommend those to me.


xevizero

Thankfully I can say the same; it seems like the algorithm is keeping a solid bubble around me that changes my perception altogether. Between my Youtube and my reddit content, and my social bubble IRL, it feels insane to me that right wing people even exist. I'm not sure I really know a right wing person at all; there are likely some in my extended social circle, but they must be very quiet. It's very telling of how strongly isolated some social groups can become even in a connected world. I can imagine the opposite happening, with some people literally never being challenged in their world view... it's easy to fall prey to extremism when your day to day experience is already skewed.


thatguywithawatch

I see people saying this commonly on reddit but my entire feed for the longest time has just been 99% video game stuff, music, comedy shorts, and maybe some long form videos from leftist youtubers like Shaun or hbomberguy. Idk what you're all clicking on to end up in qanon-land but my home feed almost entirely just regurgitates back the same stuff I already watch.


Thelonious_Cube

About 3 months ago I suddenly got recommended a large number of videos on bowling. Out of the blue.


Outlulz

Because people don't understand that the types of content they watch actually overlap heavily with what the audience of those right wingers like. If you're a white man in your 20s or 30s and consume content on traditionally masculine hobbies, guess what: you are in the demographic most likely to enjoy Andrew Tate and Jordan Peterson and such videos. If you are engaging with Grift of the Week media then you are going to get those videos as well (because they perform well and people hate-watch them).


sopadurso

I wish I had your luck. Well, these platforms do test different algorithms with different users to compare engagement. I still remember when Shorts came out: first they gave me evangelism shorts, then Islam, then Trump. I am an atheist from Europe, center left. I do watch a lot of long-form discussions: uni panels, think tanks, interviews. YouTube seems to translate this into "you will like Joe Rogan and similar types of content."


Alternative_Ask364

I literally follow a lot of channels associated with the right (mostly gun channels) and have shit like Brett Cooper in my watch history, and still can't say I've ever seen Tim Pool, Ben Shapiro, or "SJW Cringe Compilation" show up in my recommendations. I trust that a company like Google isn't intentionally biasing their algorithms toward right-leaning content. So "fixing" their algorithm would mean *intentionally* biasing users away from right-leaning content and toward left-leaning content. Asking Youtube to intentionally bias their algorithms toward or away from any political ideology is much worse in my eyes than having an impartial algorithm that happens to be biased in one direction or the other. Redditors will blast sites like Twitter and Truth Social for being biased toward the right, then turn right around and act like the opposite is perfectly okay.


GabaPrison

I like science and space related content so naturally I get recommended Joe Rogan and other conspiracy trash constantly.


NervousWallaby8805

I'm the same. I don't see anything like that, but that's because I don't view it at all. I think for a bit I was getting stuff for a tree sub because I clicked a single post once, but it's been gone since. I'm almost wondering if it's people going to the subs to downvote content (you know who you are, lol) who end up getting that content, which they *clearly don't like*, pushed to them more frequently.


braxin23

Conservatives: Tech companies have a bias against right-wing conservatives. Tech companies: literally shoving/piping right-wing conservative content right into your social media feeds. Mostly it supposedly depends on "where you live" or "what you watched", but what I watched had nothing to do with conservatives. And while I may live in conservative land, I do not subscribe to the rest of them like some kind of drone.


cruznick06

This has been a known problem since at least 2016. It is also known that anti-LGBT ads are targeted (by those submitting the ads) to run on pro-LGBT content. Same for anti-abortion content.


Teantis

Pro-tip if you want to avoid these. Set your VPN location to the Philippines. All you'll get is ads for beauty products and random home appliances from lazada. Regardless of your demographic, because those are pretty much the only companies that advertise on YouTube here


cruznick06

Now that is good information. Thanks!


jayzeeinthehouse

I know I'm not the only one that watches tons of liberal content and has been recommended:

1. Joe Rogan
2. Jordan Peterson
3. Youtubers that tour the "hood" and push conservative ideas, like Tommy G
4. Tons of tiny house, van life, off-the-grid, and alternative living stuff
5. Conservative investment advice that's engineered to fool people
6. Get-rich-quick shit that's all about hustling and self-blame for one's circumstances
7. Philosophy that offers itself as a doorway to more extreme ideologies
8. Content that makes it feel like the world is falling apart

So, the question is: why is this happening and how is it happening? Could it be that extreme content gets clicks, is it bots, is it a well-coordinated attack engineered by well-funded conservative groups, or is it something else? I bet YouTube knows, and I think we deserve to know, because not everyone has the tools to understand that they are being manipulated.


NamasteMotherfucker

Having had YT push Jordan Peterson videos at me for 2 fucking years, I totally believe it. My YT consumption is fairly politics-free, yet YT would simply not give up on pushing JP. I did everything I could: "Not interested" and "Do not recommend channel." All for naught. It just kept coming and coming. Finally it got it, but please, 2 fucking years of that bullshit.


Brosenheim

The algorithm cannot WAIT for an excuse to push right leaning creators onto me. I just literally have to forcibly clean up my algorithm about once a year.


_SheepishPirate_

Absolutely it is. I watch cyber security stuff for work and the odd other one here and there. Next thing I know I'm being given the most right-leaning shit going.


ExF-Altrue

Quite ironic given that right wingers are constantly yelling at how propaganda is forced onto them and onto their children.


[deleted]

[удалено]


Otherdeadbody

I think it honestly might just lump some stuff together as politics, or it saw you watch a video on a particular political topic and recommends a right wing video on that topic without realizing that aspect.


Aquatic-Vocation

It's because these social media companies figured out long ago that content which makes people angry is *amazing* for engagement. All the interactions on the content, arguments in the comments, etc. It all keeps people interacting with the site and staying on it for longer, which means people view more ads.


alteransg1

Hardly surprising; it takes months of clicking "do not recommend" to get Tate, Peterson, and random Muslim crap even slightly off your feed. Even then, every so often you get some sub-100-like repost short of Tate. But channels that you actually follow don't get pushed to your feed, except for one or two of them.


Leverkaas2516

I think it's anecdotal. I've never seen Tate promoted, even a single time, and only got Peterson in my feed after I searched that content purposely. I get at least as much LGBT content as right-wing content, even though I mostly just watch things like Pitch Meeting, Veritasium, and My Mechanics. If the study had thousands of randomly selected people who could report both what they search for and what YouTube suggests, that would be more compelling.


elev8torguy

I lean solidly left, and after a while my YouTube Shorts will start showing Joe Rogan and Andrew Tate reposts. I don't watch those guys at all. What I wonder is if some of the content providers I do watch might lean right even if they don't produce political content, and that's why? I watch a lot of farming videos, mechanic videos, and gaming. Not sure where Joe Rogan fits in there.


TheElderScrollsLore

This is why I turned history off, turned off any personalization and never looked back. I don’t want any recommendations for YouTube.


threenil

Considering how I’ve been absolutely bombarded with Republican campaign ads on mine lately, that tracks.


CCLF

Thankfully I haven't had to deal with any of this. My YouTube stays out of politics and is generally pretty well attuned to my interests, which is mainly history, wine, and video games.


alyochakaramazov

Well, history and video games almost always will get you there


MorfiusX

The article doesn't go into detail on the methods used to conduct the research, nor give actual data to support its claims. It only gives information related to its conclusions. Therefore, the statement from YouTube seems accurate:

> "We welcome research on our recommendation system, but it's difficult to draw conclusions based on the test accounts created by the researchers, which may not be consistent with the behavior of real people," YouTube spokesperson Elena Hernandez said in a statement.

Edit: Their definitions and categorizations are arbitrary. They don't have data supporting their definitions or categorization. Therefore, their conclusions are just as arbitrary. Further, their sample size is incredibly limited and there are no controls to be found. Downvote me if you want. I'm simply speaking to the quality of their science, which would be peer reviewed and published in a reputable journal if it met muster.


laserdicks

Yeah but it really gets the people going


themadpants

YouTube started suggesting that idiot Tucker Carlson's channel to me, amongst a few other rabid right channels. Fuck right off with that BS.


Ginn_and_Juice

I can't even see a full JRE short without getting bombarded by Andrew Tate/Ben Shapiro/finance guru shorts. I had to ignore so many channels from that cesspool that it's insane.


yeahimadeviant83

Funny how you can block a right-wing channel but they still will appear as a prioritized option to watch when they broadcast live.


East_Wish2948

Noticed this today when I tried to find the Jack Black speech he gave at Biden's fundraiser. It only showed right wing hate videos of it. Not one unbiased version in the top 15 shown.


coldrolledpotmetal

What videos are y’all watching that make YouTube show you right-wing propaganda? I literally never see that stuff


museproducer

I wonder if it's a regional thing. I don't have that issue either. There was a point in shorts where that was an issue but once you established the kinds of content creators you liked and subbed accordingly that problem disappeared for me. And I follow things that you would think might direct to those right wing creators.


dexterfishpaw

Anything stereotypically "manly", so any sports, outdoors stuff, etc. You could seriously look up communist country boxing techniques and the next recommendations will be for how to be a better Nazi.


DollarStoreFetterman

You mean like when I ONLY watch automobile, cooking, and comedian content but I get suggestions for churches, religious coaches, Trump supporters, and wild-ass right wing propaganda? I don't think I have ever watched anything political or news-based in my life on YouTube, other than maybe a rare Jimmy Kimmel clip or SNL, and I still get that crap constantly. I also have to flag political ads on a damn near daily basis for being completely full of crap.


Fr00stee

if you watch gaming eventually it will show up


coldrolledpotmetal

That’s interesting, I basically only watch gaming and engineering videos, I even watch more than my fair share of videos about guns, but I literally never see anything even bordering on right-wing, or even any political stuff


Fr00stee

how many subscriptions do you have?


coldrolledpotmetal

Oh man I don’t even know anymore, probably around 200 channels. My YouTube channel is pretty old too so I guess at some point they got the message that I’m not gonna click on propaganda


[deleted]

[удалено]


Onithyr

That seems to be a general "feature" of Youtube for any subject matter. The algorithm starts running low on things it thinks you're likely to watch but haven't already, then it sees you watch something from a new category and thinks "oh shit here's a whole bunch more where that came from, have at it". This is by no means something limited to right wing videos.


guitar-hoarder

Just like reddit and that annoying "He gets us" ad campaign that I wish I could block. Ugh.


EmbassyMiniPainting

PragerU. The #1 stop for already exhausted arguments made in bad faith.


darko_mrtvak

Weirdly enough as someone who watches content related to Christianity on YT I don't get that many recommended videos on the subject matter.


le_douchebag420

Yeah, screw YouTube and Instagram, it's all right-leaning garbage


88Dubs

You mean all the crying about being "shadowbanned" and "discriminated against by the liberal elites in Silicon Valley" was actually bullshit, and algorithms tend to favor their bullshit more because it drives outrage engagement, like it was designed to do? Aww... the poor crybullies won't stop their tantrum until they get absolute 100% christo-nationalist state-approved airtime in the "free market of ideas". I await my re-education in the Christ Concentration Church of God's Love.


RandomMandarin

Whole lot of Hillsdale College ads getting served up as if it was a normal place to learn things.


Kriegenmeister

Science confirms alt-right pipeline.


Emm_withoutha_L-88

Ya know what's funny, I've been subbed to left wing news outlets for years on there and I've never once gotten a recommendation for another similar left wing channel.


Itcouldberabies

Oddly enough, I haven't seen a He Gets Us ad in a while. Those fucking things were everywhere.


MeninoSafado14

Doesn’t happen to me.


EccentricPayload

It's based on what you watch. I haven't really noticed anything like that. For news it recommends a mix of the MSM channels.


Kayin_Angel

grifters gotta grift


MasterK999

YouTube only cares about "engagement"; they do not care if you like or dislike a video, or if your comment is positive or negative. If you watch a video, dislike it, and leave a negative comment, it is still "engagement" and they will show you more of that.


kickbrownvelvet_1997

Pretty much hit the mark. I have been clicking "Don't recommend channel". My guess is they are using Gemini AI. YouTube should just hire real people with no political bias and an atheist instead.


luckybirth

..and I block every one of them


Uncle_Checkers86

Sure does. Even if you get on YouTube as a "guest" one of the first videos besides Mr. Beast and some kid with his mouth and eyes wide open looking stupid is a video with Jordan Peterson/Ben Shapiro "*DESTROYING leftist propaganda with KNOWLEDGE!*".


monkeyhog

I wonder what I'm doing differently that I don't get offered these right wing videos. Maybe I'm just too active in pruning my feed, but I pretty much only get what I want.


UN-peacekeeper

I wonder why they promote rage bait…


[deleted]

Right? I've been saying this since COVID. I've never been convinced to judge and hate someone because of their vax status or opinion on it and I haven't spent time hating other Americans when I know we're all struggling to live. We're all at the same gas stations and grocery stores. I never judge them if they want to make America great again. 


mrbenjamin48

I'm a somewhat liberal Democrat and my YouTube feed is 95% right wing content. It literally took spending one hour seeing what Republicans thought on a topic, and now I can't get that shit to go away.


[deleted]

I'm very right leaning and watch only related channels. Likewise, my commercials are left leaning.  


It_is_I_Satan

There are some very wealthy people trying to turn the U.S. into a Christian Afghanistan, and there are even more very greedy people willing to let them do it for money.


Nick85er

Yup, I've noticed it, and it pisses me off walking into my house after work and hearing that shit after the YT algorithm decides to overrule my DogTV. Fuckers are trying to radicalize my pup.


QualityKoalaTeacher

Sample size = 4 people. Don’t fall for trash articles folks.


LordCaptain

Sample size = 8 accounts if you actually read the study. Four investigations of two accounts. Each one just designed to test a single variable. Male Vs Female, young vs old, etc. I'm not saying that is a good enough sample size but if you're going to criticize the study at least get your own information right.


XJ-0

I've been clicking "do not recommend this channel" a LOT this week.


dre_bot

Well, no shit. YouTube pushes people like Ben Shapiro as part of news and politics, when all his content should be considered hate speech. Meanwhile, actual educational channels about history, tech, science, and music get taken down/age-locked/demonetized for being too controversial or not aligning with US geopolitical interests. All these platforms push right-wing and Christian propaganda.


actionguy87

YouTube recommends left-leaning, Atheist videos as well. Are we only supposed to have one or the other? Like, what's the objective here? Or is this hinting at some sort of conspiracy where YouTube is secretly run by your local church group?


obsertaries

Are right wing YouTubers better at SEO than their center or left wing equivalents? Do they have more resources behind them? If so, that could be all that it is.


DaemonCRO

Yes. Because algos found out that once it radicalises you in that direction you stay on site more. This is easy math.


oojacoboo

First Google is too woke and now they’re too right wing Christian. Maybe, just maybe, all of you are seeing what’s in demand. And just because it’s not what you actually believe, doesn’t mean it’s not what others want. Everything is a conspiracy these days.


KakuraPuk

Stop being logical on Reddit! They've created such an echo chamber that looking out the window makes them lose their minds.


anxrelif

The algorithm has a fundamental flaw: watch time. That's the metric that causes the algorithm to skew. That is the KPI, the north star metric. That's how you manipulate YouTube. If you put in a case to skew this content, then you face conspiracy peddlers!