Guest Posted January 13, 2010

To the readers: Thanks for viewing this topic. I hope that my questions, if answered, help others as well as me. I pay close attention to my Google Webmaster Tools and have been noticing some interesting things.

1. Parameter handling - This seems somewhat new. How can I use it to my advantage? Has anyone experimented with a list of parameters to ignore / not ignore?

2. Sitemaps - I have one current sitemap submission. I deleted my old ones, but they are still available to look at if I click "show all sitemaps". It reports that they are being indexed, but I have deleted them from my webserver. Has anyone else had this issue? I'm concerned my old sitemaps (from before the URL rewrite) are interfering with my new sitemap.

3. Sitemaps - Indexed URLs continue to slip. The count has been decreasing by a few pages every couple of days. Please tell me someone knows a way to reverse this.

Thanks for any comments :)
mdtaylorlrim Posted January 13, 2010

> 1. Parameter handling - This seems somewhat new. How can I use it to my advantage? Has anyone experimented with a list of parameters to ignore / not ignore?

A parameter is the part of a URL that follows the ?. A general page like product_info.php goes nowhere on its own, but with ?item=123 it will go to the correct item, in this case item 123. You definitely do not want Google to ignore the parameter; otherwise it will only index product_info.php with nothing after it, and none of your products will be indexed. Install SEO URLs and you will not have to deal with parameter handling.

> 2. Sitemaps - ... I'm concerned my old sitemaps (from before the URL rewrite) are interfering with my new sitemap.
> 3. Sitemaps - Indexed URLs continue to slip. ... Please tell me someone knows a way to reverse this.

These are too closely related to give you two different answers. The bottom line is: have only one sitemap. You can have multiple sitemap indexes as long as they connect. You do not want an orphan sitemap on your site, because Google remembers them for quite some time, and subsequent visits to your site will continue to index them, possibly with wrong URLs. Keep only a good sitemap with valid links on your site, and keep your Google Webmaster Tools data up to date. If Google sees bad URLs it will delete them, causing the number of indexed pages to decrease. Get your sitemaps correct ASAP and matching what is on Google. You will have to live with a reduced number for a while, but don't be alarmed. Use the site: search to see what you really have indexed: in a Google web search, enter "site:your_site_here" and you will see all your indexed pages. Don't ever expect that to match the number of indexed URLs on the Webmaster Tools page.
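[Editor's note] For readers unfamiliar with how an SEO-URL rewrite works, here is a minimal sketch of the idea, assuming an Apache server. The URL pattern and rule are hypothetical and are not the actual rules installed by the SEO URLs contribution:

    # Minimal illustration (hypothetical pattern, not the contribution's real rules):
    # serve requests for /product/123 from product_info.php?item=123, so the
    # visitor and the spider see a clean URL while the parameter still
    # reaches the script.
    RewriteEngine On
    RewriteRule ^product/([0-9]+)$ product_info.php?item=$1 [L,QSA]

Because the parameter keeps working internally while only the clean URL is published, parameter handling in Webmaster Tools stops being something you need to tune by hand.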
Guest Posted January 26, 2010

mdtaylorlrim,

Thanks for answering my questions. I apologize for the delayed response; I read your reply almost immediately but have only now had the proper time to respond to it.

Thanks for the quick information on parameter handling; I feel confident I have it mastered. As for the sitemap questions, I am still utterly lost and confused. I am including some screenshots of my Webmaster Tools screen to give you and other readers a better grasp of what I'm trying to explain.

You can see below my submitted sitemap and the extremely low indexed ratio. This has continued to fall instead of rising as others reported it would over time. Notice in the middle right-hand corner you can see "All (3)". This link takes you to a page that shows previously submitted sitemaps. I have attached an image of this at the bottom (second image).

Again, the one below the text was reached by clicking "All (3)". Look carefully at the sitemap stats: you can see they are very inaccurate. Also notice that sitemaps.xml was the old sitemap and is returning a 404, while the https sitemaps.xml is not. Since both had been returning a 404 for many months, I looked on Google for other answers. One person suggested adding a 301 redirect in my .htaccess to the correct sitemap. I have done this, and as you can see, I now have two duplicate sitemaps. Please note I did not submit the https sitemaps.xml; Google tried to download the old one (previously a 404) again and was given back a 301 redirect to the correct sitemap by the .htaccess.

With all this explained, can anyone offer insight on how to remove the wrong ones? Thanks for any help!
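[Editor's note] The redirect described above would look something like the following in .htaccess. This is only a sketch with assumed filenames and domain (the thread never shows the actual rule), and the next replies argue against doing it at all:

    # Hypothetical version of the redirect described above: permanently
    # redirect the retired sitemap URL to the current one. The file names
    # and domain are assumptions.
    Redirect 301 /sitemaps.xml https://www.example.com/sitemap.xml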
Hotclutch Posted January 26, 2010

Your old sitemaps must be located somewhere on your server if Google is still fetching them; just delete them from that location. If they are automatically generated by a cron job, then you need to disable that.

As for the URL count in Webmaster Tools, I don't think it is 100% accurate at any given time. Compare the indexed URLs you have in site:www.mysite.com with what is being reported in Webmaster Tools.
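[Editor's note] If you want to rule out a cron job, a quick check on a typical Linux host looks like this. It is a sketch assuming shell access, and "sitemap" as a search term is only an assumption about how such a job would be named:

    # List the current user's cron jobs and look for anything that
    # regenerates a sitemap.
    crontab -l | grep -i sitemap
    # System-wide jobs can also live in /etc/crontab and /etc/cron.d/.
    grep -ri sitemap /etc/crontab /etc/cron.d/ 2>/dev/null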
Guest Posted January 27, 2010

> Your old sitemaps must be located somewhere on your server if Google is still fetching them; just delete them from that location. If they are automatically generated by a cron job, then you need to disable that.

Hotclutch,

Thank you for taking the time to reply and offer your insight. You brought up a point I had overlooked: the cron job aspect was something I hadn't checked. After checking the cron jobs, though, I quickly realized that was not the problem; there currently is no cron job for the old sitemaps.

As for the old sitemaps being located on the server, I can assure you or anyone else that is not the case. If you try to request the old sitemap it returns a 404 Not Found, and I have visually inspected the files on the server and have not been able to find it. To correct my previous statement: if you attempt to request the old sitemap, you will now be redirected by the .htaccess to the correct sitemap (my reasoning for this is in my previous post).

What other ideas can you think of? I appreciate your response; you pointed out something I had missed. If only that had been the solution. I welcome any more comments from you or from others. Thanks
mdtaylorlrim Posted January 28, 2010

> To correct my previous statement: if you attempt to request the old sitemap, you will now be redirected by the .htaccess to the correct sitemap (my reasoning for this is in my previous post). What other ideas can you think of?

How about this... two things, actually:

If Google does not get a 404, it marks the map as a good one, even though the eventual file is your current map; and the maps listed on the second page are the ones you deleted and are there for information only.

The bottom line is: leave only the good sitemap there. Delete all references to any others, and don't try to fool the spiders with redirect links.
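[Editor's note] In practice, following that advice means taking the earlier redirect back out of .htaccess so the retired sitemap URL returns a plain 404 again. A minimal sketch, using the same assumed filenames as before:

    # Remove (or comment out) the redirect added earlier so the old
    # sitemap URL simply 404s and can drop out of Webmaster Tools.
    # Redirect 301 /sitemaps.xml https://www.example.com/sitemap.xml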
Hotclutch Posted January 28, 2010

Also, you should not allow robots to fetch anything from your https side. I'm not sure about your server setup, but I have a separate robots.txt for the https side with a Disallow-all in it for the search engines. After you have that, then delete the sitemap from Google.
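[Editor's note] For reference, a Disallow-all robots.txt of the kind described is just two lines. This sketch assumes your server is set up to serve a different robots.txt on the https host than on the http host:

    # robots.txt served only on the https host: keep all crawlers off
    # the secure side of the store.
    User-agent: *
    Disallow: /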
Guest Posted February 2, 2010

> The bottom line is: leave only the good sitemap there. Delete all references to any others, and don't try to fool the spiders with redirect links.

mdtaylorlrim,

Roger that. Can you confirm your information with a Google source, by chance? You wrote: "The maps listed on the second page are the ones you deleted and are there for information only." Not that I doubt you; I would just like to confirm it, because if so, then I never really had a problem to begin with. Thanks

> Also, you should not allow robots to fetch anything from your https side. ... After you have that, then delete the sitemap from Google.

Hotclutch,

Can you please briefly tell me why it is not okay to allow robots to fetch https content? Do they even crawl https in the first place? Currently I do not have anything set up to prevent it, but I have no reason to suspect it's being indexed. Thanks
This topic is now archived and is closed to further replies.