Hacker News

How hard would it be to build a distributed hosting system for all the torrents ever on The Pirate Bay?

Considering that one poster here says all of the torrents up to 2013 fit into 90 MB, it's quite feasible. It could tunnel over Tor to make tracking things a little more difficult.
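As a rough sanity check on that figure, the arithmetic works out even with generous per-entry sizes. The numbers below are assumptions for illustration, not the actual dump format:

```python
# Back-of-envelope: storage for a magnet-link index.
# Assumed per-entry size (hypothetical): 20-byte infohash
# plus ~60 bytes of name/metadata, so roughly 80 bytes per torrent.
PER_ENTRY_BYTES = 80
NUM_TORRENTS = 1_000_000  # order of magnitude for a large index

total_mb = PER_ENTRY_BYTES * NUM_TORRENTS / 1_000_000
print(f"~{total_mb:.0f} MB for {NUM_TORRENTS:,} entries")
```

At ~80 bytes per entry, a million torrents is on the order of 80 MB before compression, which makes a 90 MB dump entirely plausible.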



Sure. But how would one manage updates? I.e. accepting new torrents.



How about just falling back to Gnutella P2P for the torrent hosting layer?


Is that really a viable alternative? My impression is that people left the Gnutella network due to instability and floods of crap files.


Isn't it already roughly done like that? The entire BitTorrent magnet system:

https://en.wikipedia.org/wiki/Magnet_URI_scheme

which, if I'm not mistaken, will look for peers that have the relevant torrent file, then acquire and use it
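Right — a magnet link carries the infohash in its `xt` parameter, and clients use that to find peers via the DHT. A minimal sketch of pulling the infohash out of a magnet URI with the Python standard library (the example link is an arbitrary placeholder, not a real torrent):

```python
from urllib.parse import urlparse, parse_qs

def infohash_from_magnet(magnet: str) -> str:
    """Extract the BitTorrent infohash from a magnet URI's xt field."""
    params = parse_qs(urlparse(magnet).query)
    for xt in params.get("xt", []):
        if xt.startswith("urn:btih:"):
            return xt[len("urn:btih:"):]
    raise ValueError("no btih infohash in magnet URI")

# Placeholder magnet link for illustration.
magnet = ("magnet:?xt=urn:btih:c12fe1c06bba254a9dc9f519b335aa7c1367a88a"
          "&dn=example")
print(infohash_from_magnet(magnet))
# c12fe1c06bba254a9dc9f519b335aa7c1367a88a
```

Everything a client needs to join the swarm is in that 40-hex-character hash; the .torrent metadata itself can be fetched from peers (BEP 9).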

- - -

and the 90 MB is all the information /except/ the torrent files, once again, if I'm not mistaken


Currently you still need a website for searching the magnet links.
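Though if the index itself were distributed as a flat list of (name, magnet) pairs, search could happen entirely client-side. A toy sketch with hypothetical placeholder entries:

```python
# Hypothetical local index: (name, magnet link) pairs.
# Entries below are made-up placeholders, not real torrents.
index = [
    ("Some Linux ISO", "magnet:?xt=urn:btih:aaaa"),
    ("Another Linux ISO", "magnet:?xt=urn:btih:bbbb"),
    ("Public domain film", "magnet:?xt=urn:btih:cccc"),
]

def search(index, query):
    """Case-insensitive substring search over torrent names."""
    q = query.lower()
    return [(name, link) for name, link in index if q in name.lower()]

for name, link in search(index, "linux"):
    print(name)
```

A 90 MB index scans in well under a second on modern hardware, so no server-side search engine is strictly necessary once everyone has the data.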


I've always wondered if BitTorrent itself wasn't the right protocol for this. Let's say you have a tiny web page with embedded JavaScript that implements BitTorrent. A person could keep a copy of this page local to their system and have a bookmark for it. The page, when it loads, uses BitTorrent to grab a copy of The Pirate Bay and load it inline. It wouldn't even need a domain name at that point.

The swarm for this would be large enough that it would be self-sustaining; it'd be impossible to shut it down.

I just can't figure out how you'd make it updatable, so that new content could be added to it.

(Also, there's the issue of initial distribution of the HTML file that lets people get access to it...)
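The update problem is fundamental to plain BitTorrent: a torrent is identified by the SHA-1 hash of its bencoded `info` dictionary (BEP 3), so any change to the content yields a new infohash and a brand-new swarm. A sketch of how that identifier is derived, using a minimal bencoder; the `info` dict below uses placeholder values, not a real torrent:

```python
import hashlib

def bencode(obj) -> bytes:
    """Minimal bencoder for the structures found in .torrent files."""
    if isinstance(obj, int):
        return b"i%de" % obj
    if isinstance(obj, bytes):
        return b"%d:%s" % (len(obj), obj)
    if isinstance(obj, str):
        return bencode(obj.encode())
    if isinstance(obj, list):
        return b"l" + b"".join(bencode(x) for x in obj) + b"e"
    if isinstance(obj, dict):
        # Bencoded dict keys must be sorted as raw byte strings.
        items = sorted(
            (k.encode() if isinstance(k, str) else k, v)
            for k, v in obj.items()
        )
        return b"d" + b"".join(bencode(k) + bencode(v)
                               for k, v in items) + b"e"
    raise TypeError(f"cannot bencode {type(obj)}")

# Toy single-file "info" dict; all values are placeholders.
info = {"name": "index.html", "piece length": 16384,
        "pieces": b"\x00" * 20, "length": 1234}
infohash = hashlib.sha1(bencode(info)).hexdigest()
print(infohash)  # changes whenever any byte of the content changes
```

This is exactly what mutable-torrent proposals try to work around: BEP 46, for instance, stores a signed pointer in the DHT under a public key, so the key stays fixed while the infohash it points at can be updated by whoever holds the private key.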



