YT recommendations are always something I'm at least modestly interested in - even if I wouldn't have known to search for them independently. I think YT is following the model clearly demanded by current American ethics, which suggests that the only thing worse than talking about what a nice guy Hitler was is stopping someone else from talking about what a nice guy Hitler was.
I do think that at some point YT, FB and everyone else (Google even!) will have to reckon with radicalization - but I still think that YT's recommendations are quite a bit more valuable than FB's.
It's easy to tell that it's happening because you will see obscure, random news/opinion channels with high view counts - something a channel can only get if it has at some point been promoted by Youtube.
I watched a Youtube video about how birds aren't real, that the US government killed them all and replaced them with surveillance drones.
There is no doubt I watched that. Nor is there any doubt that it impacted my recommendations.
The question is, what is the likelihood of it having radicalized me into believing this? Is Youtube successfully recruiting people to genuinely believe birds aren't real/aliens/etc. on a wide scale? I do not believe that is the case. I'm not claiming it has zero impact on this. I'm saying I'm skeptical that the scale is meaningful, especially compared to legacy media.
What I know for certain is that traditional/legacy/corporate media is constantly and successfully recruiting people into believing conspiracy theories on a broad scale, like "Iraq had WMDs" and "Trump was a Russian asset." And yet there is no serious talk about how the corporate media is "infamous for getting people radicalized."
Which legacy media outlet is not currently making a streaming migration effort, with enough sway to manipulate the narrative? I'm not aware of any major players not moving to SVOD or AVOD in the US.
All you need to know is Youtube, one of the biggest and most popular streaming services with young people, is literally turning your children into Nazis.
I've had a Google account for... 15+ years? and have never really interacted on YouTube, in large part because I don't want to give Google my data. Just recently though I've been getting more into hobbyist boardgaming and wanted to help out some of the little guys in this niche by giving them some likes and subscribes.
So, now it's feeding me incel videos. Maybe this is just giving me more info than I want about people who play board games. But wow, Google.
I believe you. The question is not whether you were fed conspiracy theories; it's how effective they were, and "compared to what?" Because that's what's implied by these articles/studies. They claim "people are radicalized," which is very different from "people watched some UFO videos."
How many people watched Youtube videos and were radicalized into believing in flat earth, aliens, etc?
Compare that to how many people watched legacy media and were radicalized into believing that Iraq had nukes and Trump was a Russian agent?
YT continues to push Jordan Peterson videos on me even though I've downvoted them many times. Other friends report the same thing.
I think the algorithm has figured out that if it can get people into JP, there's a chance they go down into more extreme rabbit holes and thus become super-engaged. So it's worth it for the algorithm to keep trying to push JP even on people who don't seem interested at first.
I used to get a lot of "Jordan Peterson" recommendations too. A bit off topic, but I really don't like his take on many subjects. I can see that he's truly articulate, and it seems he mostly wins arguments by carefully crafting his phrases and forcing his view on less skilled communicators rather than by following a reasonable line of thought. I'm glad I don't receive recommendations of his videos anymore.
> he mostly wins arguments by carefully crafting his phrases and forcing his view to less skilled communicators
I think it's the other way round. When people interview Jordan Peterson, they use pre-prepared phrases and ascribe to him opinions and views he doesn't hold, e.g. the infamous Cathy Newman interview: "So what you're saying is..."
This doesn't always happen, e.g. the debate between Russell Brand and Jordan Peterson.
I don't think there would have been nearly such a controversy around his work, if he had not touched on the pronoun issue, which was about compelled speech, not even particularly about pronouns.
However, that issue is a bit of a hornet's nest, and Peterson is by no means alone in being targeted, as also seen in the UK with Professor Kathleen Stock.
Youtube's algorithm for me is a perfect example of what a poor ranking algorithm looks like.
I subscribe to a number of mainstream, local news feeds as well as our government's daily COVID updates. And occasionally I will do a Google search for random terms. And yet somehow, at least a few times a week, I will be recommended some obscure news source, often with a conspiracy or ultra-right-wing edge.
I very much sympathise with the challenge that Facebook's data scientists have to deal with. Incredibly hard problem to solve.