
You can say that explicit curation and intention are the way things "should be" as much as you like, but all the evidence suggests that most people strongly prefer automated curation.


Not all automated curation is created equal, and boiling it all down to one thing muddies the discussion. I absolutely adore YouTube's automated curation, since the primary goal there is just to steer me to things I'll find interesting: the ads are present on all content, so YT just wants to keep me on the platform for as long as possible.

When it comes to Facebook, it always feels like I'm being steered towards topics that yield monetizable verbiage. If a friend likes an upcoming concert, I'll definitely hear about it loud and clear, whereas an upcoming picnic or a personal project being planned is less likely to float to the top.


YT's curation keeps trying to shunt me off into neoconservative conspiracy-theory videos, which I never click on but which occasionally auto-play if I've left the window in the background. About all I can figure is that I've watched some videos about military-themed multiplayer video games. I don't actually play these games, but sometimes the commentary is funny.

Facebook kept trying to sell me an Oculus Quest weeks after I had already bought one ("look at the metrics! 95% of people who saw the ad also bought the headset", "you're reading the graph backwards").

To me, it lays bare the myth of advertising analytics. All this data, all this tracking, and none of it is actually all that useful for the stated purpose. Makes one wonder if it's really all for fleecing advertisers or if it's to keep totalitarian regimes happy.
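
A toy illustration of the "reading the graph backwards" point, with entirely hypothetical numbers (the counts and variable names below are made up): when an ad is retargeted at recent buyers, the purchase causes the ad impression rather than the other way around, so "X% of people who saw the ad also bought" measures nothing about the ad's effectiveness.

    # All numbers hypothetical: retargeting shows the ad mostly to
    # people who already bought, so the purchase precedes (and causes)
    # the ad view, not the reverse.
    ad_views_from_buyers = 950     # recent buyers get retargeted
    ad_views_from_non_buyers = 50

    saw_ad = ad_views_from_buyers + ad_views_from_non_buyers
    conversion = ad_views_from_buyers / saw_ad  # 0.95

    print(f"'95% of people who saw the ad also bought it': {conversion:.0%}")
    # Accurate as stated, yet the ad caused at most a handful of purchases.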


You can go into your YouTube watch history and delete videos that you don't want influencing your recommendations. I do this all the time because if I watch something out of the ordinary, my recommendations will be bombarded with similar stuff that I don't want to see. I just wanted to see that one video with the dog doing funny things, damn it!


YT's recommendations are infamous for radicalizing people, pulling them into conspiracy theories, and feeding them misinformation.


YT recommendations are always something I'm at least modestly interested in, even if I wouldn't have known to independently search for them. I think YT is following the model clearly demanded by current American ethics, which suggests that the only thing worse than talking about what a nice guy Hitler was is stopping someone else from talking about what a nice guy Hitler was.

I do think that at some point YT, FB and everyone else (Google even!) will have to reckon with radicalization, but I still think that YT's recommendations are quite a bit more valuable than FB's.


I'm highly skeptical of those claims. They seem mostly like wild accusations from legacy media that doesn't want you to watch streaming video.

Yes, I'm familiar with the relevant studies.


It literally happens to me every week.

It's easy to tell that it's happening because you will see obscure, random news/opinion channels with high view counts, something you can only get when you have at some point been promoted by Youtube.


I watched a Youtube video about how birds aren't real: that the US government killed them all and replaced them with surveillance drones.

There is no doubt I watched that. Nor is there any doubt that it impacted my recommendations.

The question is, what is the likelihood of it having radicalized me into believing this? Is Youtube successfully recruiting people to genuinely believe birds aren't real/aliens/etc. on a wide scale? I do not believe that is the case. I'm not claiming it has zero impact on this. I'm saying I'm skeptical that the scale is meaningful, especially compared to legacy media.

What I know for certain is that traditional/legacy/corporate media is constantly and successfully recruiting people into believing conspiracy theories on a broad scale, like Iraq having WMDs and Trump being a Russian asset. And yet there is no serious talk about how the corporate media is "infamous for getting people radicalized."


Which legacy media outlet is not currently making a streaming migration effort with enough sway to manipulate the narrative? I'm not aware of any major players not moving to SVOD or AVOD in the US.


All you need to know is Youtube, one of the biggest and most popular streaming services with young people, is literally turning your children into Nazis.


I've had a Google account for... 15+ years? and have never really interacted on YouTube, in large part because I don't want to give Google my data. Just recently though I've been getting more into hobbyist boardgaming and wanted to help out some of the little guys in this niche by giving them some likes and subscribes.

So, now it's feeding me incel videos. Maybe this just is giving me more info than I want about people who play board games. But wow, Google.


Which studies are funded by legacy media? There are undoubtedly way more that are not.

I’ve been fed all kinds of UFO and other strange videos by YT myself.


I believe you. The question is not whether you were fed conspiracy theories; it's how effective they were, and "compared to what?" Because that's what's implied by these articles/studies. They claim "people are radicalized," which is very different from "people watched some UFO videos."

How many people watched Youtube videos and were radicalized into believing in flat earth, aliens, etc?

Compare that to how many people watched legacy media and were radicalized into believing that Iraq had nukes and that Trump was a Russian agent.


YT continues to push Jordan Peterson videos on me even though I've downvoted them many times. Other friends report the same thing.

I think the algorithm has figured out that if it can get people into JP, there's a chance they go down into more extreme rabbit holes and thus become super-engaged. So it's worth it for the algorithm to keep trying to push JP even on people who don't seem interested at first.


Don't downvote. It's probably seen as "engagement".

Click the "..." on the recommendation and choose "Not Interested" - optionally you can specify you don't like the video.


I used to get a lot of "Jordan Peterson" recommendations too. A bit off topic, but I really don't like his take on many subjects. I can see that he's truly articulate, and it seems he mostly wins arguments by carefully crafting his phrases and forcing his view on less skilled communicators rather than by following a reasonable line of thought. I'm glad I don't receive recommendations of his videos anymore.


> he mostly wins arguments by carefully crafting his phrases and forcing his view on less skilled communicators

I think it's the other way round. When people interview Jordan Peterson, they use pre-prepared phrases and ascribe to him opinions and views he doesn't hold, e.g. the infamous Cathy Newman interview ("So what you're saying is...").

This doesn't always happen, e.g. in the debate between Russell Brand and Jordan Peterson.

I don't think there would have been nearly such a controversy around his work if he had not touched on the pronoun issue, which was about compelled speech, not even particularly about pronouns.

However, that issue is a bit of a hornet's nest, and Peterson is by no means alone in being targeted, as also seen in the UK with Professor Kathleen Stock.


Youtube's algorithm for me is a perfect example of what a poor ranking algorithm looks like.

I subscribe to a number of mainstream, local news feeds as well as our government's daily COVID updates, and occasionally I will do a Google search for random terms. And yet, somehow, at least a few times a week I will be recommended some obscure news source, often indeed with a conspiracy or ultra-right-wing edge.

I very much sympathise with the challenge that Facebook's data scientists have to deal with. It's an incredibly hard problem to solve.


I wonder how much wagging of the dog FB has ended up doing to everyday activities. Do people give more weight to doing things that have a better opportunity for more FB engagement? (Thinking more on this, I don't think FB is the sole culprit here.)


I think the trouble is trying to have it both ways: excusing the existence of a curation algorithm because "people prefer it" [0] while also refusing to take on all the responsibilities normally associated with a publisher who curates and delivers content.

[0] Never mind the more obvious problem that "people prefer it" is not a great excuse for intentionally making a product as addictive as possible.


Giving users the option to toggle the algorithm on/off would be nice.


It used to be that way. Then they started "forgetting" everyone's setting to turn it off (i.e., go back to the original reverse-chronological feed), and then they took the option away. After purchasing Instagram they did the exact same thing: FB engineering specifically removed the feature on their platforms.


The fiction of rational choice usually makes at least a performative nod towards informed consumers, actual choices, clear results, and consent. Which of those prerequisites does Facebook fulfill?


> all the evidence suggests that most people strongly prefer automated curation.

What evidence would that be?

I don't think having to actively force a feature on your audience is a good sign that they prefer it.


Platforms based on automated curation thrive while platforms based on manual curation wither. If Facebook gave people the opportunity to turn off the algorithmic feed, and they took it... they'd just spend all their time on TikTok instead.


It depends what you mean by “prefer.” I would prefer straight chronological even if I spent less time on it and clicked “like” fewer times.


Smokers strongly prefer cigarettes.


I think the comparison is unfair to cigarettes, which at least provide some fleeting pleasure and look cool while you're smoking them.


True, also cigarettes provide an amusing bogeyman to compare things to, whereas Facebook is no laughing matter.



