
> That little ISP who has no real use of it

Those little ISPs like Comcast, Verizon, Cox, AT&T, etc. I can't imagine those small companies wanting your data, or using it after bribing politicians for the right to do so.

Everyone knows Google tracks you, and the same goes for Facebook. They offer a service and people understand that: if you aren't paying, you are the product.

You are paying your ISP for access to the internet and for privacy, or at least that used to be a selling point. Now you are paying AND you are the product, and ISPs have competing services they want you to use, so they'll mess with competitors. Your entry point to the internet should be neutral and independent; what you use on top of that is up to you. You can still route around Google and Facebook, but you simply cannot route around your ISP.



>They offer a service and people understand that.

No they don't. Maybe in a superficial sense, but the vast majority of the public has no clue how data can be combined and mined to reveal far more than they thought they were bargaining for.

Otherwise there would be no sudden Facebook scandal.


The scandal is that third party companies like Cambridge Analytica harvested the whole social graph and used it to manipulate people.

People know that Google and Facebook track them, as they are ultimately ad companies; when you get a free service, people know your data is the product.

We pay for ISPs to protect privacy, not sell it off, that is the big difference.

Many people have seen Facebook's tracking from the marketing/business side, or even from running a small group or page with its analytics, and Facebook is known for its ads and sponsored content. Facebook is a marketing platform; everyone knows it uses your data. ISPs are what you use to get online; they aren't known as marketing companies with ad networks, though they want to be.


I'd like to expand on this because I don't see it written about much, and please correct me if I misunderstand:

The scandal is that FB sold, or at least allowed, third-party companies like Cambridge Analytica the ability to harvest the whole social graph, and then sold or allowed them the ability to use that same data on FB to target and manipulate people.


It is a bit nuanced, so some background: Cambridge Analytica did violate Facebook's terms of service, since pulling down the whole social graph was never truly allowed, but protections against pulling friends' data without their knowledge weren't in the Facebook OpenGraph APIs until v2, around 2013-2014.

I know this because we used to build lots of Facebook apps/games, and back then, once someone gave you access to their information, you could get all of their friends and all of their information, and recursively pull down most of the social graph's public information. Most games used it just for friend names, seeing which friends played, invites, and competing with friends, but there were bad-apple apps out there harvesting it all. The Facebook app revolution was partly driven by this data element, and it stayed open for many years.
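To make the mechanics concrete, here is a rough sketch of the crawl pattern those bad-apple apps could use. The graph shape and the `friends_of` callback are illustrative stand-ins (the real pre-v2 flow was paginated HTTP calls against the friends edge of the API); the point is just the breadth-first expansion from a single consenting user:

```python
from collections import deque

def crawl_social_graph(friends_of, seed_user):
    """Breadth-first walk of a friend graph, mimicking how a pre-v2
    Facebook app could expand outward from one consenting user.
    `friends_of` stands in for the era's paginated friends-list API
    call (hypothetical shape, not a real endpoint)."""
    seen = {seed_user}
    queue = deque([seed_user])
    while queue:
        user = queue.popleft()
        # In a real app this loop body would be an HTTP request; one
        # permission grant from `seed_user` was enough to get here.
        for friend in friends_of(user):
            if friend not in seen:
                seen.add(friend)
                queue.append(friend)
    return seen

# Tiny mocked graph: one opt-in exposes everyone reachable from it.
graph = {
    "alice": ["bob", "carol"],
    "bob": ["alice", "dave"],
    "carol": ["alice"],
    "dave": ["bob", "erin"],
    "erin": ["dave"],
}
harvested = crawl_social_graph(lambda u: graph.get(u, []), "alice")
print(sorted(harvested))  # every profile reachable from one opt-in
```

Run against this five-person mock, a single grant from "alice" yields the entire connected component, which is why one app install could leak a whole friend network.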

It always surprised me how much data could be pulled; it is part of the reason Zynga was so effective, and why it attracted some oligarch money. Part of the reason Facebook started locking it down was that games/apps were getting more adept at pulling all the data, and Facebook was scared someone would become a social-graph competitor, so it locked things down mainly for its own needs, not really for privacy.

Who knows whether Cambridge Analytica had extra access beyond that to reach profiles that weren't public, but most profiles were public by default back then, and people only put information online that they wanted to share publicly, with much less expectation of privacy. Over time, people for some reason came to trust that Facebook was protecting their data, but the friend-permissions access hole remained.

Back in the late 90s and early 00s, people were very much against sharing any real information on the web with sites like (and predating) Facebook. That slowly changed as the appearance of privacy was added, but in truth the data was still wide open if even one of your friends granted an app access, right up until v2 of the Facebook OpenGraph API. With the v2 friend-lockdown changes, you could only get a friend reference ID scoped to your one app, which wasn't the friend's actual Facebook ID and differed per app; you could send them an invite, but you couldn't pull their data until they agreed, which is how it should have been all along.
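One way to picture those per-app reference IDs is as a keyed hash over the real user ID and the app: the same user maps to a stable ID within one app, but to an unrelated ID in every other app, so two apps can't join their datasets on it. This scheme is purely illustrative; Facebook's actual derivation isn't public, and the secret key below is hypothetical:

```python
import hashlib
import hmac

def app_scoped_id(real_user_id: str, app_id: str, secret: bytes) -> str:
    """Derive a stable, per-app pseudonymous ID from a real user ID.
    Illustrative only: a keyed hash keeps the mapping stable within
    one app while making IDs from different apps unlinkable."""
    msg = f"{app_id}:{real_user_id}".encode()
    return hmac.new(secret, msg, hashlib.sha256).hexdigest()[:16]

SECRET = b"platform-secret"  # hypothetical platform-held key

a1 = app_scoped_id("1000123", "farm_game", SECRET)
a2 = app_scoped_id("1000123", "farm_game", SECRET)
b1 = app_scoped_id("1000123", "quiz_app", SECRET)

print(a1 == a2)  # True: stable within a single app
print(a1 == b1)  # False: unlinkable across apps
```

Because the platform holds the key, no app can invert the hash back to the real ID or correlate users with another app's IDs, which is exactly what made post-v2 cross-app graph harvesting so much harder.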

After that change came an era of endless invites on Facebook. Companies like Zynga threatened to leave and did try to build their own platform, and the change also shut down many Facebook game companies that could no longer get the numbers; many moved to mobile, which was still wide open. Zynga was given special privileges by Facebook for a while because of this, access others didn't have, though others may have had similar deals. Facebook's own transition to mobile took a long time, and some people even thought Facebook wouldn't be able to make the leap.

At the time, the app/game companies on Facebook saw it as Facebook killing the viral nature of some of those apps/games, which was ultimately a good thing. It was a huge mistake for Facebook not to separate apps/gaming from your personal info and friends, but that was the product then. It should have let people set up separate app/gaming profiles that other app/game users could friend, rather than polluting your main friends list and exposing your whole social graph. Games were a bit of a Trojan horse into your social graph, given the setup back then.

It is possible that Cambridge Analytica had other access to non-public data, but as I mentioned, most data was public by default then, and in a way CA was late to the game; many companies probably had people internally who could pull it down, possibly even from the data centers (Facebook eventually built its own). Then there is the whole side where the NSA had whatever access it wanted or needed to both public and private data; who knows whether that was exploited. Cambridge Analytica used the data for nefarious purposes against Facebook's ToS, but that was bound to happen, because those were the move-fast days and security was an afterthought. In theory you could still run a network of apps that combine to get people to grant access to their data, with friends approving as well, but most of that has moved from Facebook apps to mobile, since it is easier there now, and people moved there too, including Facebook itself.

Really, this whole adventure was spurred by the Web 2.0 era: people were being social and sharing more online, it was democracy online, more public, whereas before that it was very limited. Web 2.0 launched this site, reddit, Facebook, Google's social products, comment systems, etc. So I think there was a temporary period that was the Wild West of data mining, with people sharing more than they should while still expecting privacy, because Facebook was a walled garden and people thought that made their data safer. Turns out that was not the case unless they specifically marked it private.

Many of these issues still affect mobile, though that is getting better; if anything, the Facebook apps probably pull more data on mobile to build the social graph than they ever could on the web, including calls, audio, and other things mobile allows because it is native and not sandboxed like the web. Sandboxing via web browsers was huge back in the day because people were so worried about their private data and hacking; that concern went away for a while, Web 2.0 happened, mobile happened, data was misused, and now things are tightening back toward privacy and permissions, as they have been for the last few years. Ultimately, people learning that data they put online or generate in apps isn't private is probably the good that will come of all this. We might even get a right-to-your-own-data Bill of Rights amendment, or something similar, one day.

Ultimately, Facebook was not necessarily nefarious in this. Companies like Cambridge Analytica, which exploited Web 2.0/mobile and social networks to use that data against you rather than just serve up ads, are where things went too far, hence the backlash. Since v2 of OpenGraph, Facebook has been privacy/security conscious, both to protect the social-graph data and to build trust with users.

Now ISPs are getting into the game by removing privacy protections with their new law, and they don't care as much about consumer trust; that is the scary one.


Wow! Clearly nuanced; this is a fantastic answer. I did not expect a worthwhile reply, let alone such a great synopsis. Thank you for typing it out. This quality of dialogue is why I, and so many others, frequent this forum! It's definitely not an either/or situation, and I hate how our regulators have, for about a half century now, sold out to the ISP/telco mafia, which enables their continually shitty operations and service while they hold on to their anti-competitive market positions. AT&T figured out quite long ago that excelling at cronyism was its most effective long-term business model. I worry the tech giants will embrace regulation and skate down the same path.

https://www.fastcompany.com/40520529/big-tech-lobbying-spree...


Not entirely true: you can use a VPN, but then you're placing your trust in the VPN provider.


Why use paid privacy-by-policy systems when you can use free privacy-by-design systems like Tor and i2p?


Because they're slow as shit, difficult to configure correctly, and make you more conspicuous to most three-letter agencies.


> Because they're slow as shit,

Tor actually isn't that bad for browsing, while i2p needs more love on the speed front.

> difficult to configure correctly,

For Tor: You just download the Tor Browser. Already pre-configured.

> and make you more conspicuous to most three-letter agencies.

Good argument for actually using them.


> You just download the Tor Browser. Already pre-configured.

Isn't Tor Browser fairly bad due to it being a target? I'm not sure if that changed recently, but I recall seeing lots of "don't use the Tor Browser bundle".

Tor also relies on exit nodes to exist, yet it's considered very dangerous to run one.


> Isn't Tor Browser fairly bad due to it being a target? I'm not sure if that changed recently, but I recall seeing lots of "don't use the Tor Browser bundle"

Just FUD.

> Tor also relies on exit nodes to exist, yet it's considered very dangerous to run one.

It's dangerous in some places to run an exit, not everywhere.



