Hacker News

I don't think it's ignoring robots.txt; it's probably just serving a copy cached on another thread/machine/datacenter. It will pick up the new robots.txt eventually. This behavior is well-known.


But what's the point of downloading the robots.txt file if it isn't going to honor it?


Presumably, one cluster (or whatever Google calls them) is honoring it, while a different cluster isn't.
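For anyone curious what "honoring it" looks like mechanically: a well-behaved crawler parses its (possibly stale, cached) copy of robots.txt and consults it before each fetch. A minimal sketch with Python's stdlib `urllib.robotparser`, using a hypothetical robots.txt that blocks Googlebot from `/private/`:

```python
from urllib import robotparser

# Hypothetical robots.txt a site might serve (in practice the crawler
# fetches and caches this, which is where the staleness comes from).
rules = """
User-agent: Googlebot
Disallow: /private/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Each fetch decision is made against the parsed (cached) rules:
print(rp.can_fetch("Googlebot", "https://example.com/private/page"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/public/page"))   # True
```

If two clusters cache different versions of the file, they'll give different answers to the same `can_fetch` question until both caches refresh.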



