
I ran into a similar problem with my blog this year. After spending some time trying to resolve it, I just gave up.

I can understand that every now and then Google changes its rules and validation procedures, so that what used to work suddenly gets removed from the index, given their fight against spam and slop. But what I'm struggling to understand is how the Google crawler and Google Search Console can be so bad that:

* the Google crawler suddenly stops fetching the sitemap, even though Google claims it's an important signal for the search engine

* requesting a sitemap refresh via GSC fails with an "unknown" error, which is puzzling because, according to my web logs, nothing even tried to load the sitemap between my request and the error (the sketch below shows the kind of log check I mean)

* after fixing an error, the validation job gets stuck for weeks, only to fail with an unclear error

* random deindexing events, as explained in the post
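
(For anyone who wants to verify the same thing in their own logs: here's a minimal sketch of the check I mean, in Python. It assumes the standard nginx/Apache combined log format and a sitemap at /sitemap.xml; both the path and the format are assumptions, so adjust them for your setup.)

    import re
    import sys

    # Combined log format: IP - - [timestamp] "METHOD path proto" status ...
    # (an assumption -- adjust the pattern if your server logs differently)
    LINE_RE = re.compile(r'^(\S+) \S+ \S+ \[([^\]]+)\] "(\S+) (\S+)[^"]*" (\d{3})')

    def sitemap_fetches(log_path, sitemap_path="/sitemap.xml"):
        """Yield (timestamp, client IP, status) for every request to the sitemap."""
        with open(log_path, encoding="utf-8", errors="replace") as f:
            for line in f:
                m = LINE_RE.match(line)
                if m and m.group(4) == sitemap_path:
                    yield m.group(2), m.group(1), m.group(5)

    if __name__ == "__main__":
        # Usage: python check_sitemap.py /var/log/nginx/access.log
        for ts, ip, status in sitemap_fetches(sys.argv[1]):
            print(ts, ip, status)

If this prints nothing for the window between the refresh request and the error, then GSC reported a fetch failure without ever requesting the file.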

And I don't buy the argument that all this is necessary for Google to deal with spam, because Bing Webmaster Tools just works flawlessly, and they have to deal with spam as well.

I don't understand how a small business is supposed to deal with these kinds of issues.


