I interviewed there this summer because the position/compensation offered was just too good not to consider seriously. Like, top-10%-at-FB kind of money, which is a lot. During the interview process, I asked many FB employees about their thoughts on FB in the news, the headlines, the scandals, etc. All super smart and dedicated people, but anecdotally there was no hint that they thought they had done anything wrong. They explained, for example, how Cambridge Analytica was totally overblown in the media, specifically because users opted in, and the media was just bashing them. They did say they wouldn't follow the same route again, but they didn't seem to consider themselves ethically or morally responsible. I can see how one might debate whether having an opt-in absolves FB of responsibility, but it had a smell to it and didn't necessarily jibe with my personal ethical framework. I did get an offer, and (not necessarily for that reason) ended up declining. But it parallels what you said, except in May 2019. I would consider taking a job there in the future, and I got the sense that they're working to turn things around, but what high-level people at FB said to me largely seemed to mirror your experience many years later, so I thought I'd share.
>They explained for example how Cambridge Analytica was totally overblown in the media, specifically because *users opted in*, and the media was just bashing them. [Emphasis mine]
That is really troubling. Excusing nefarious behavior because "well the user opted in" is a downright horrible way to rationalize bad behavior. Users never read the TOS. They barely understand the privacy settings (though I've heard they've gotten better ... haven't had Facebook in years). I tend to think that, yeah, Cambridge Analytica was probably overblown a bit for a number of reasons, but wow. I know a couple people who have quit Facebook (one recently, one a couple years ago before the election) and a couple who have stayed. I have to say, the ones who've stayed have drifted away from the rest of our friend group. Sad to see.
I would also imagine that, especially during interviews, they recognize their duty is in part to sell the candidate on the firm. There's an implicit pressure to be positive about your company, because there's only downside to saying negative things, even honest ones. I don't see why an employee would put themselves on the line with a candidate, whom they may never work with or see again, just to express what they think is right.
If you're trying to hire people smart enough to command "top 10% pay at FB," hiding behind poorly constructed rhetorical defenses reflects worse on you than acknowledging a real problem would. It doesn't even have to be anyone's "fault" — it can just be a consequence of a system that was built with good intentions.
These people probably faced a moral conflict: take the good Facebook money but lose "their souls," or follow their conscience, quit, and lose that money and their work friendships/network. So, to be able to sleep at night, they've convinced themselves the company they're a part of wasn't the responsible party. Hence: "Hey, those users opted in, it's their own fault!"
Funny how it means the company is now probably full of either "ignorance is bliss" or morally bankrupt people, because the people with good conscience have left. Not that society at large is much different... what uncomfortable truths are we ignoring?
I know a few people who work there too. I can have an intelligent conversation/debate about almost anything with them, from tech stacks to software development methodology, but bring up privacy and Cambridge Analytica, and all of a sudden the wagons circle and it’s all “we’re so misunderstood,” and “the biased media is out to get us,” and “we do so much for user privacy and always just get shit on!” It’s as if they all went through the same training and got the same talking points. Spooky!
I don’t doubt that Facebook does a lot for user privacy, but it is probably the case that nobody there (except maybe Boz) really, thoroughly understands the enormous amount of power and influence the company wields, and they also haven’t realized that the thing they have worked hard to create is now an out-of-control monster.
Yes. A few hundred thousand people took the quiz, and Facebook's platform therefore allowed the app to gather information on tens of millions of people. Facebook had a policy at the time that this extraneous data could only be used to improve user experience, not for third-party data harvesting, but reportedly it was weakly enforced.
It doesn't pay to speak up. We glorify those who made a change, but the truth is that 99% of those who speak up, even about small things, quickly learn to shut up or leave for greener pastures. Remaining in the company are the yes-men and, outside of regular employment, those who speak just to breathe. Still, we encourage everyone to speak up so we can weed out the black sheep before they cause too much damage.