ChrisW123 - Posted December 20, 2004

I'm confused. I just did a keyword search in Google to look at my website's listing, and it appears that Google has indexed pages I have disallowed in robots.txt: checkout_shipping.php, in the example below.

    ImageCritique Photography - Online Image Download of Cheap Stock ...
    Online image downloads of cheap stock photos. We have ... photography needs. ImageCritique Photography - Online Image Download, Cheap Stock Photos, ...
    https://st15.startlogic.com/~imagecri/checkout_shipping.php - 23k - Dec 19, 2004

What would cause this? In robots.txt (in my root (catalog) folder) I have:

    User-agent: *
    ....
    Disallow: /checkout_shipping.php
    ....

And why, of all pages, would they select checkout_shipping.php in the first place?! The result above was the 3rd item on the FIRST page of the results, so it appears Google has concluded that checkout_shipping.php best fits the search phrase I used. But why? Why not my home page? The content, title, and keywords on my home page match the search phrase better than the checkout_shipping.php page does. So I'm wondering why they would use it. Any ideas? Is it just a GoogleBot mystery? :)
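As a sanity check on the rule itself, the Disallow line quoted above can be tested with Python's standard-library robots.txt parser. This is only a sketch of how a standards-following crawler would interpret that rule; example.com is a placeholder domain, not the actual site.

```python
from urllib.robotparser import RobotFileParser

# Feed the parser the same rules quoted in the post
# (the "...." elided lines are omitted here).
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /checkout_shipping.php",
])

# A compliant crawler should refuse the disallowed path
# and allow everything else.
print(rp.can_fetch("*", "https://example.com/checkout_shipping.php"))  # False
print(rp.can_fetch("*", "https://example.com/index.php"))              # True
```

Note that Disallow paths are matched against the URL path as the crawler sees it, so where robots.txt lives relative to the indexed URL matters.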
This topic is now archived and is closed to further replies.