Google / robots.txt / default shopping cart


jhande


Posted

Google is driving me totally nuts! :wacko:

 

Within my Google Webmaster Tools, I just checked to see how things are going since I recently added /catalog/ to my robots.txt file (I no longer have a catalog directory).

 

I saw these three URL errors among the 138:

http://handeshobbies.com/1941-willys-street-pr-32.html?action=buy_now
http://handeshobbies.com/1966-chevrolet-chevelle-p-467.html?action=notify
http://handeshobbies.com/2005-ferrari-superamerica-pr-302.html?action=buy_now

 

Usually all I see are error messages pointing to pages in my old catalog folder, such as:

http://handeshobbies.com/catalog/1958-chevrolet-impala-2in1-p-479.html
http://handeshobbies.com/catalog/1960-chevrolet-impala-hardtop-2in1-p-481.html

Which, by the way, I'm still getting results for.

 

I visited my catalog and there, sitting in the shopping cart, were three items. :o

I deleted my temporary internet files, cookies, and history in my browser, refreshed the page, and they are still there. :huh:

 

So I guess you could say I have two problems...

How do I clear the shopping cart? (I wasn't even signed in yet.)

The only way I could figure out was to log in and empty the cart. Isn't there another way?

 

Here's my robots.txt file:

 

User-Agent: *
Disallow: /catalog/
Disallow: /includes/
Disallow: /cgi-bin
Disallow: /account.php
Disallow: /account_edit.php
Disallow: /account_history.php
Disallow: /account_history_info.php
Disallow: /account_newsletters.php
Disallow: /account_notifications.php
Disallow: /account_password.php
Disallow: /address_book.php
Disallow: /address_book_process.php
Disallow: /advanced_search.php
Disallow: /advanced_search_results.php
Disallow: /checkout_confirmation.php
Disallow: /checkout_payment.php
Disallow: /checkout_payment_address.php
Disallow: /checkout_process.php
Disallow: /checkout_shipping.php
Disallow: /checkout_shipping_address.php
Disallow: /checkout_success.php
Disallow: /conditions.php
Disallow: /contact_us.php
Disallow: /cookie_usage.php
Disallow: /create_account.php
Disallow: /create_account_success.php
Disallow: /info_shopping_cart.php
Disallow: /login.php
Disallow: /logoff.php
Disallow: /password_forgotten.php
Disallow: /popup_image.php
Disallow: /product_reviews_write.php
Disallow: /shopping_cart.php
Disallow: /ssl_check.php
Disallow: /tell_a_friend.php
Disallow: /UpdatePrice.php

 

Does anything look wrong with it?

Why, ever since I uploaded the robots.txt file, have I been getting strange goings-on at my site and more errors at Google?

 

My wife is getting bugged by this even more than I am. She says to just give up and open an eBay store, fewer headaches.

I'm starting to think she might be right for once.

Almost every time I update or upgrade something now, it seems to break.

I'm seriously debating whether to renew my hosting subscription come June.

- :: Jim :: -

- My Toolbox ~ Adobe Web Bundle, XAMPP & WinMerge | Install ~ osC v2.3.3.4 -

Posted

Create a Google sitemap account and upload sitemaps so that Google will get recent copies of your URLs. With the latest Header Tags installed, you shouldn't be getting the duplicate URLs.

 

The cart will empty once the session expires. If you are still in the session, then just delete them from the cart by using the update option on the shopping cart page.

 

Directories in a robots file should have a closing slash, so change this

Disallow: /cgi-bin

to this

Disallow: /cgi-bin/

I also wouldn't list

Disallow: /conditions.php
Disallow: /contact_us.php
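
Putting both suggestions together, the corrected portion of the robots.txt would look something like this (only the changed part is shown; the rest of the file stays as posted above):

User-Agent: *
Disallow: /catalog/
Disallow: /includes/
Disallow: /cgi-bin/
# /conditions.php and /contact_us.php are no longer listed, so they can be indexed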

 

Jack

Support Links:

For Hire: Contact me for anything you need help with for your shop: upgrading, hosting, repairs, code written, etc.

All of My Addons

Get the latest versions of my addons

Recommended SEO Addons

Posted

Thank you so much for helping me again Jack! ;)

 

Create a Google sitemap account and upload sitemaps so that Google will get recent copies of your URLs. With the latest Header Tags installed, you shouldn't be getting the duplicate URLs.
I dug a little deeper in Google Webmaster Tools and found that all the pages resulting in errors, the ones pointing to products in my non-existent catalog directory, show they are linked from other sites. Maybe that's why I am getting duplicate links, one of which includes the catalog directory??

 

I still have to set the CRON job for the sitemaps. Since my products don't change often/quickly, would submitting the maps to Google weekly be fine instead of daily?

 

The cart will empty once the session expires. If you are still in the session, then just delete them from the cart by using the update option on the shopping cart page.

What stumped me is that the items weren't placed in the cart by me. I assumed it must have been Google, since two of them were listed in the results. :huh:

 

Directories in a robots file should have a closing slash, so change this
Disallow: /cgi-bin

to this

Disallow: /cgi-bin/

I also wouldn't list

Disallow: /conditions.php
Disallow: /contact_us.php

 

Thank you, I will fix that right away. I pretty much used the robots.txt from the contributions. :blush:

 

Jack

 

Thanks again Jack.


Posted
Thank you so much for helping me again Jack! ;)

 

I dug a little deeper in Google Webmaster Tools and found that all the pages resulting in errors, the ones pointing to products in my non-existent catalog directory, show they are linked from other sites. Maybe that's why I am getting duplicate links, one of which includes the catalog directory??

You should only have a duplicate content problem if the pages actually exist. If you have removed your catalog directory, that shouldn't be the case. If they are there due to a sitemap, they will go away with the next upload. If they are there because there are links to them in the listings, then a 301 needs to be created for them.

 

I still have to set the CRON job for the sitemaps. Since my products don't change often/quickly, would submitting the maps to Google weekly be fine instead of daily?

Yes, that is fine.
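
For reference, a weekly schedule is a single crontab line. A minimal sketch, assuming the sitemap generator from your addon is reachable at a URL such as http://handeshobbies.com/googlesitemap/index.php (the actual script name and location depend on which sitemap contribution you installed):

# Hypothetical cron entry: regenerate the sitemaps every Sunday at 03:00 and discard the output
0 3 * * 0 wget -q -O /dev/null http://handeshobbies.com/googlesitemap/index.php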

 

What stumped me is that the items weren't placed in the cart by me. I assumed it must have been Google, since two of them were listed in the results. :huh:

Hmm, that is strange. The search engines shouldn't be able to place items in the cart. Be sure your spiders file is up-to-date and the Prevent Spiders option is on. If you have cache enabled and are using /tmp/ as the directory, that can happen.
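
For what it's worth, the spiders file (includes/spiders.txt) is just a plain list of user-agent substrings, one per line, that osCommerce compares against each visitor's user agent so it doesn't start a session (and therefore a cart) for a search-engine spider. A few typical entries, shown only as an illustration; your copy from the contributions will be much longer:

googlebot
msnbot
slurp
yahoo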

 

Thanks again Jack.

No problem. :)

 

Jack


Posted

Another plea for help... :blush:

 

You should only have a duplicate content problem if the pages actually exist. If you have removed your catalog directory, that shouldn't be the case. If they are there due to a sitemap, they will go away with the next upload. If they are there because there are links to them in the listings, then a 301 needs to be created for them.

The pages do not exist, at least not in a catalog folder, since they have been moved to the root.

The sitemap is somewhat up-to-date and does not include the catalog folder.

I need help with the proper syntax for the .htaccess file.

 

This is all I could find:

Implementing a 301 redirect for dynamic pages

RewriteEngine on
RewriteCond %{QUERY_STRING} ^id=13$
RewriteRule ^/page.php$ http://www.example.com/newname.htm? [L,R=301]

"In the example above the id=13 should be replaced with the query string of the page you wish to redirect and the page.php with the name of your file prior to the query string."

Advanced - domain name/folder change

"If you are changing folder names, then the following lines should be used. The RewriteRule section of the following statement should be on a single line and you'll need a RewriteRule line for each folder change."

RewriteEngine On
RewriteRule ^olddir/(.*)$ http://new.com/newdir/$1 [R=301,L]

"Save the file, upload it back into your web (old domain) in the root document directory. Test it out by typing in the old domain name. You should be instantly and seamlessly transported to the new domain."

 

How do I tell it that the old URL http://handeshobbies.com/catalog/ is now http://handeshobbies.com? :huh:

 

 

Hmm, that is strange. The search engines shouldn't be able to place items in the cart. Be sure your spiders file is up-to-date and the Prevent Spiders option is on. If you have cache enabled and are using /tmp/ as the directory, that can happen.

Spiders.txt is from contributions dated 12/28/2008.

Prevent Spiders option is true.

Cache is enabled and using the /tmp/ directory. Should this setting stay as is or be changed?

Again, thank you so much Jack for coming to my rescue! :)


Posted

You don't list the old URL, just the directory.

RewriteRule ^catalog/(.*)$ http://handeshobbies.com/$1 [R=301,L]
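
In context, the relevant portion of the .htaccess file in the site root would look something like this (a minimal sketch; any other directives already in the file stay as they are):

RewriteEngine On
# Permanently redirect anything still requested under /catalog/ to the same filename in the root
RewriteRule ^catalog/(.*)$ http://handeshobbies.com/$1 [R=301,L]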

 

With cache enabled and set to /tmp/, the shop is using the server's tmp directory to store and retrieve data. The code doesn't identify which site the data belongs to, only that it should retrieve data that fits that section of code, so it can load data from some other site on the server. To fix it, change the path to a tmp directory of your own. See admin->Modules for the path to the files. Be sure tmp actually exists in your root.
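
As an illustration only, on a typical shared host that directory setting ends up being an absolute path along these lines (the account name and document root below are placeholders, not your real ones):

# Hypothetical example; check your hosting control panel or phpinfo() for the real path
/home/youraccount/public_html/tmp/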

 

Jack


Posted
You don't list the old URL, just the directory.
RewriteRule ^catalog/(.*)$ http://handeshobbies.com/$1 [R=301,L]

Thank you Jack ;)

 

With cache enabled and set to /tmp/, the shop is using the server's tmp directory to store and retrieve data. The code doesn't identify which site the data belongs to, only that it should retrieve data that fits that section of code, so it can load data from some other site on the server. To fix it, change the path to a tmp directory of your own. See admin->Modules for the path to the files. Be sure tmp actually exists in your root.

 

Jack

 

Uhmm... :blink:

 

Admin -> Configuration -> Cache

Use Cache = true

Cache Directory = /tmp/ and the directory/folder is in my root.

Is that correct?


Posted
No, the path needs to point to your account. It is pointing to the server's account, unless your host has it set up differently, which it doesn't sound like.

 

Jack

 

One last stupid question, I PROMISE, I think...

 

So then the path I should type in = http://handeshobbies.com/tmp

Or should it be my Linux path?

 

:blush:


Archived

This topic is now archived and is closed to further replies.
