Hacker News

I’d imagine it’s the massive number of user delete requests of people wiping their accounts in protest.


Certainly an abnormal usage pattern - I've personally tried a handful of backup and delete scripts, and have been waking up a lot of cold storage today.

But how many non-cached requests does it take to overwhelm the servers? Obviously scraping my comments page for years-old data is going to hit the backend hard and isn't going to have a traffic pattern at all like the front page of my subscriptions, but I'd assume that they had far more capacity available than would be required to handle that. The number of people currently running a Greasemonkey or Python script against the archive is higher than it's ever been, but it has to be a minuscule, infinitesimal fraction of their normal traffic - right?


Many back-ends are a house of cards that depend on a cache to keep them from crashing. It would not surprise me if even a small number of simultaneous deleters could crash it.
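A toy model of why this can happen (hypothetical numbers, not any site's real architecture): backend load scales with cache *misses*, not with raw request volume, so cold-archive scraping that always misses the cache can generate more database traffic than the entire normal workload.

```python
# Toy model: compare backend (database) load for cache-friendly traffic
# vs. archive-scraping traffic that always misses the cache.
# All numbers below are illustrative assumptions, not measurements.

def backend_queries(requests: int, cache_hit_ratio: float) -> int:
    """Requests that fall through the cache and hit the database."""
    return round(requests * (1 - cache_hit_ratio))

# Normal traffic: mostly-hot content, assume a 99% cache hit ratio.
normal = backend_queries(1_000_000, cache_hit_ratio=0.99)

# Scrapers fetching cold, years-old pages: assume ~0% hit ratio.
scrapers = backend_queries(50_000, cache_hit_ratio=0.0)

# 5% of the request volume produces 5x the backend load.
print(normal, scrapers)  # → 10000 50000
```

Under these assumed ratios, a scraper population that looks negligible at the load balancer is five times the normal workload at the database, which is exactly the "house of cards" failure mode.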



