> Platforms rely on algorithms to sort and target content. They have avoided investing in human editing, both to save cost and to avoid the perception that human editors would be biased. However, the nuances of journalism require editorial judgment, so platforms will need to reconsider their approach.
Suppose Facebook embraces the fact that editorial power is political power. Could they develop this without splitting into Blue Facebook and Red Facebook like the television news networks?
They don't. In fact, in response to the "fake news" hysteria, they've gone to a completely centralized "Trending News" bar, so it's identical across the country. [0] This is after they got in trouble last year for intentionally stopping the propagation of stories that painted conservatives positively. [1]
I don't think so. I had to purposefully subscribe to far-right news sources and block a lot of mainstream media outlets in order to see much right-wing news in my Facebook feed. The trending news feed is still biased left, though; I can't change that.
Yeah, it's a total fantasy that most people want to hear from the other side. They're loyal to their tribe and hate anything and everything having to do with the other one. Just seeing it makes them angry; it's like flashing a rival's gang sign or flying an enemy flag. It's a psychological identity thing.
This is important for community admins to understand. This is not a dialogue problem. People are given a tribe to belong to or oppose, and strategists and propagandists use those affiliations to stoke popular anger and resentment.
That means users don't want to see a balanced feed. They want to see things that make them feel good, not things that make them feel bad. And the other side is "bad things."