I received an email from Google saying "Coverage problems detected on linsenschuss.de". Do I still have to change any settings so that Google can index the site without problems? Do I need to edit the robots.txt myself?
So the question is: which URLs exactly are being blocked that should not be blocked? X3 does block some URLs which should NOT be accessed and indexed by Google, for example when Google tries to access "pages" within the /content/ dir.

Dane wrote:
No, it's not for dane-vetter.com, it's for linsenschuss.de. The message is: "Indexed, though blocked by robots.txt".
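If you want to verify which URLs the robots.txt actually blocks, you can test the rules locally. A minimal sketch using Python's standard-library `urllib.robotparser`; the rule shown here is a hypothetical example of the kind of `/content/` disallow discussed above, not the exact file X3 generates — check your site's own /robots.txt for the real rules:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content illustrating a /content/ disallow rule.
# The actual file served by your site may differ.
rules = """\
User-agent: *
Disallow: /content/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# URLs under /content/ are blocked for all crawlers; everything else is allowed.
print(rp.can_fetch("Googlebot", "https://linsenschuss.de/content/page"))  # False
print(rp.can_fetch("Googlebot", "https://linsenschuss.de/"))              # True
```

You can also point `RobotFileParser` at the live file with `set_url()` and `read()` to check the rules your server actually serves.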
First, please figure out whether it is actually a problem, which I am almost 100% sure it is not. All URLs blocked by X3's robots.txt are meant to be blocked; why else would I add them?

Dane wrote:
But it's good to know that I don't need to change any settings and that the robots.txt is correct. So I'll ask my hoster where the problem is.