Hacker News

Here's a much simpler explanation: The Feds serve a FISA order for specific data collection. The companies' lawyers approve it. Then the NSA has a convenient user interface for accessing that data (perhaps in real time?) somehow from the companies' servers (possibly through an intermediary). How else is this data being sent to Ft. Meade? Thumb drives via FedEx?

The dates on the slides might be when a company has erected some convenient access point to grab the data "lawfully" obtained by a FISA order. Microsoft whipped something together quickly. Apple took years to get the UX just right.

Frankly, sucking in ALL of the Internet seems extremely difficult and useless. We're talking GOOG+AAPL+MS+YHOO+Skype+many more. And all for $20M/year? The gov't spends more on toilet paper.
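The cost argument can be sketched with back-of-envelope arithmetic. Every figure below is an illustrative assumption (the intake volume and disk pricing are placeholders, not sourced numbers); the only point is that full-firehose storage would dwarf a $20M/yr line item:

```python
# Back-of-envelope sketch: what would storing a full firehose of
# GOOG+AAPL+MSFT+YHOO+Skype traffic cost vs. a $20M/yr budget?
# All inputs are assumed placeholder values for illustration only.

PB = 10**15  # bytes per petabyte (decimal)
TB = 10**12  # bytes per terabyte (decimal)

daily_intake_pb = 1_000   # assumption: combined provider traffic, PB/day
disk_cost_per_tb = 50     # assumption: raw disk, $/TB (2013-era ballpark)

yearly_tb = daily_intake_pb * PB * 365 / TB
storage_cost = yearly_tb * disk_cost_per_tb

budget = 20_000_000  # the reported $20M/yr figure

print(f"~{yearly_tb:,.0f} TB/year -> ~${storage_cost:,.0f} in raw disk alone")
print(f"roughly {storage_cost / budget:,.0f}x the $20M budget")
```

Even before counting bandwidth, power, and staff, raw disk alone comes out hundreds of times over budget under these assumptions, which is why a targeted-records endpoint fits the price tag far better than full capture.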



These are my thoughts exactly. I cannot imagine any hugely sophisticated data-collection infrastructure costing a mere $20M a year.

More likely this is software written to take in structured data obtained by subpoena -- as it's generated by targeted users. This "ultimate user data liberation" API may have even been the system at Google that was attacked by the Chinese: http://www.washingtonpost.com/world/national-security/chines...


It's possible PRISM is a separate program that's accounted for differently. We've had pretty good evidence for a while now that the government does have firehose type capability in at least some locations.

http://en.wikipedia.org/wiki/Room_641A


Oh, that makes a lot of sense. Referring to this:

http://cdn.theatlantic.com/static/mt/assets/science/assets_c...

For only $20M/yr (which is nothing by government standards), I could definitely see that being a roadmap for building the user-friendly endpoint to obtain the relatively small number of legally obtained records from each provider.



