Leeb2 Posted December 27, 2005

I have tried to read up on this on the forums, but it is confusing with all the different views. I want to allow spiders to index my site, but I want to keep the session IDs (sids) out of the URLs that they index. Here are my session settings:

Force Cookie Use: False
Check SSL Session ID: True
Check User Agent: True
Check IP Address: False
Prevent Spider Sessions: True
Recreate Session: True

What do I set here in order to allow spiders to index my site, but keep the sids out of the URLs that they index? I use shared SSL and the updated spiders.txt.
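For reference, "Prevent Spider Sessions: True" works together with spiders.txt: on each request the store compares the visitor's User-Agent against the substrings listed in that file and, on a match, skips starting a session, so no sid is ever appended to the links that spider follows. Below is a minimal sketch of that kind of check in plain PHP, not osCommerce's actual application_top.php code; the includes/spiders.txt path and variable names are only for illustration.

<?php
// Simplified illustration of a spiders.txt check -- not osCommerce's exact code.
// Assumes includes/spiders.txt holds one lowercase User-Agent substring per line.
$user_agent  = isset($_SERVER['HTTP_USER_AGENT']) ? strtolower($_SERVER['HTTP_USER_AGENT']) : '';
$spider_flag = false;

$spiders = @file('includes/spiders.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
if (is_array($spiders)) {
    foreach ($spiders as $spider) {
        $spider = trim($spider);
        if ($spider !== '' && strpos($user_agent, $spider) !== false) {
            $spider_flag = true;   // known spider: do not start a session at all
            break;
        }
    }
}

if ($spider_flag === false) {
    session_start();               // normal visitors still get a session
}
?>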
♥Vger Posted December 27, 2005

Go to Contributions and look for the spiders.txt contribution. The basic osCommerce spiders.txt file is two years out of date and doesn't list many of the newer spiders, including MSN Bot and Yahoo Inktomi. The spiders.txt contribution is kept updated: it has one file with all the major spiders and another LARGE file with most of the minor ones too.

Vger
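For context, spiders.txt is just a plain-text list of lowercase User-Agent substrings, one per line; any request whose User-Agent contains one of them is treated as a spider. The handful of entries below are only illustrative, not the contribution's actual contents:

googlebot
msnbot
slurp
crawl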
Leeb2 Posted December 27, 2005 (Author)

Yes, I have that spiders.txt installed. Will that prevent my site from being indexed with sids in the URLs?
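For reference, when a visitor or bot does not accept the session cookie, osCommerce falls back to appending the session ID to every link, which is how URLs like the first example below can end up in a search index instead of the clean second form (the domain and osCsid value here are made up):

http://www.example.com/product_info.php?products_id=42&osCsid=a1b2c3d4e5f6a7b8c9d0e1f2a3b4c5d6
http://www.example.com/product_info.php?products_id=42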