
Search Engine Optimization Guide


nalin


Introduction and Overview

This is planned as a series of posts describing the implementation of code and modules that integrate osCommerce with other open-source software projects (notably a LAMP server: Linux, Apache, MySQL, and an ambiguous P...in this case PHP) to deploy a site which is 100% indexable, and which features backend database modifications to make the integration of key phrases and/or keywords as automated and trivial as conceptually possible.

 

This first post deals exclusively with indexing and robust host setup. I discuss the server configuration and modifications necessary to make Apache's mod_rewrite compensate for the lacking code in the SE-safe modifications of osC 2.2 MS2. These modifications largely require a dedicated server, so I also cover minor changes which allow for mod_rewrite-based virtual hosting (which is beneficial over other methods in its ability to add hosts on the fly without restarting Apache). Many of the ideas here are extensible to software and platforms other than a LAMP server, and certainly to different distribution choices and software versions. As exploration is part of the open-source spirit, I invite and encourage the reader to adapt these solutions to whatever combination of software they feel best suited.

 

Though this would best be deployed on a non-production server, barring errors (and plan for errors!) it could be deployed in perhaps an hour, with a handful of Apache reloads. For the benefit of those who would prefer to try this out in a production environment, I have consolidated all apache2.conf modifications later in this document so that only a single Apache reload is necessary. I believe (but cannot guarantee) that none of the rewrite rules would adversely affect a default osC installation; thus, for a production environment, one could change the apache(2).conf file and then implement the changes to osCommerce at leisure.

 

Prerequisite software:

Linux server (*very* preferably one where you have root access; minimally, you will need to be able to modify .htaccess files and have mod_rewrite installed)

Apache compiled with the mod_rewrite module.

For the record, I am using Gentoo Linux 1.4 pr1 and Apache 2.0.47.

 

 

Implementation:

Step 1 - Setup Apache2 to work with mod_rewrite:

Edit /etc/apache2/conf/apache2.conf and append as necessary

#adjust as appropriate to your root dir
<Directory "/pub/www">
Options Indexes MultiViews
#defines the use of .htaccess, "None" is fastest, "All" most customizable
AllowOverride None
Order allow,deny
Allow from all
#these have security implications
Options FollowSymLinks ExecCGI
</Directory>

RewriteEngine On

#facilitates debugging
#set RewriteLogLevel below 3 when not debugging
RewriteLog "/var/log/apache2/rewrite.log"
RewriteLogLevel 3

RewriteMap  lowercase  int:tolower

#test rule
#change "administration" below to reflect your oscommerce administrator directory
#remap / to /catalog/
RewriteCond %{REQUEST_URI} !^/administration(.*)$
RewriteCond %{REQUEST_URI} !^/catalog(.*)$
RewriteCond %{REQUEST_URI} !^/phpmyadmin(.*)$
RewriteRule ^/(.*)$ /catalog/$1

 

Testing: The purpose of the last block of code is to remap / to /catalog/ (except when you are visiting some other directory), which can give our pages a marginally better rank because they appear closer to the root directory. To test it, go to http://domain.tld/ and it should appear that you are visiting the http://domain.tld/catalog/index.php page. If this is not the case, use the debugging file for more information:

tail -n0 -f /var/log/apache2/rewrite.log

Reload the browser and look at the output here; you should see something along the lines of:

66.214.41.75 - - [28/Oct/2003:05:03:24 +0000] [www.overnightheadsets.com/sid#80a6f98][rid#8357ab0/subreq] (3) applying pattern '^/(.*)$' to uri '/index.php'
66.214.41.75 - - [28/Oct/2003:05:03:24 +0000] [www.overnightheadsets.com/sid#80a6f98][rid#8357ab0/subreq] (2) rewrite /index.php -> /catalog/index.php
...similarly with stylesheet and images etc...

If you get nothing in the rewrite.log file, please consult apache.org or google.com, as both are capable of providing extensive information on what's broken...

 

Step 2 - Fix calls to basename($PHP_SELF):

With mod_rewrite it is entirely possible to kill some of the PHP environment variables, and the call to basename($PHP_SELF) demonstrates this. Calls to basename($PHP_SELF), which *should* return the name of the file, returned an empty string for me. We want to fix this in advance, in a manner that will not take effect until we explicitly use SE-safe URLs, and thus want to define a new function. To keep added code separate and modular, I have done this in the file catalog/includes/functions/custom.php, which needs to be included by catalog/includes/application_top.php.

catalog/includes/application_top.php

//other includes from the functions directory
...
//require custom definitions
include( DIR_WS_FUNCTIONS . 'custom.php' );

catalog/includes/functions/custom.php

function custom_se_safe_link( $link ) {
  if (SEARCH_ENGINE_FRIENDLY_URLS == 'true') {
    return ereg_replace( ".php(.*)$", ".php", ereg_replace( "^/", "", $link) );
  } else {
    return basename($link);
  }
}

Now comes the toughest part: replace the (seemingly never-ending) occurrences of "basename($PHP_SELF)" with calls to "custom_se_safe_link($PHP_SELF)".

I will leave doing it with "sed" (stream editor) as an exercise for the reader, as I think I've had enough fun with regular expressions thus far, and there's more to come.
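For the impatient, the sed exercise might look something like the following (a sketch only; the in-place variant rewrites files, so back up first):

```shell
# Rewrite basename($PHP_SELF) calls to custom_se_safe_link($PHP_SELF).
# The $ is escaped in the pattern so sed treats it literally.
# Demonstrated on a sample line of osC-style code:
echo 'if (basename($PHP_SELF) == FILENAME_DEFAULT) {' \
  | sed 's/basename(\$PHP_SELF)/custom_se_safe_link($PHP_SELF)/g'

# Across the whole catalog (back up first! sed -i edits in place):
# find catalog -name '*.php' -exec sed -i.bak \
#   's/basename(\$PHP_SELF)/custom_se_safe_link($PHP_SELF)/g' {} +
```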

Testing must be omitted here until completion, as results should not change until one sets SEARCH_ENGINE_FRIENDLY_URLS. Upon completion of all steps, you can test using an automated link checker such as linklint to spider your site and ensure the integrity of all URLs.

 

 

Step 3 - Convert osC SE-Safe URLs to "normal" URLs:

Set (administration)->configuration->sessions->force cookie use to True. Sessions are handled uniquely in URLs, and this will accomplish two goals: it will allow session-wary spiders to index our site (most or all spiders don't, because by starting different sessions a spider could generate infinite content from a single site), and it will allow us to ignore the unique handling of sessions within the URLs.

 

SE-safe URLs work as follows:

  • Assume variables are defined in the form VAR=VAL (i.e., there are no boolean or flag variables)
  • Change urls of the form page.php?var1=val1&var2=val2&var3=val3...&varn=valn to those of the form
    page.php/var1/val1/var2/val2/var3/val3/.../varn/valn

Given the pairs of variables, it is relatively trivial to use a series of regular expressions to parse the modified URLs back into unmodified URLs.

I have accomplished this in two rules, which are best placed as the first rules in the apache2.conf file (the first rule "loops", so any rules prior to it would execute multiple times). I have placed them just prior to the "#test rule" line above.

#change all occurrences of *.php*/varN/valN to &varN=valN
RewriteRule ^(.*)\.php(.*)/(.*)/(.*)$ $1.php$2&$3=$4 [N]

#change the *.php&* to *.php?*
RewriteRule ^(.*)\.php&(.*)$ $1.php?$2

Testing:

Navigate your site until you have at least two URL variables in the address. Then replace all occurrences of '?', '&', and '=' with the '/' character. This should result in the same page. In case of differing results, tail the logfile as above.
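As a sanity check outside of Apache, the two rules can be simulated with sed (a rough sketch; sed's branch-to-label loop stands in for mod_rewrite's [N] flag, which restarts the ruleset):

```shell
# Rule 1, looped: peel /var/val pairs off the end into &var=val
# Rule 2: turn the first '&' after .php into '?'
echo '/index.php/cPath/22/sort/2a' \
  | sed -E -e ':a' -e 's|^(.*)\.php(.*)/([^/]+)/([^/]+)$|\1.php\2\&\3=\4|' -e 'ta' \
  | sed -E 's|^(.*)\.php&(.*)$|\1.php?\2|'
# -> /index.php?cPath=22&sort=2a
```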

 

 

Step 4 - Give it a whirl:

Set (administration)->configuration->my store->Search-Engine Safe URLs to True. If all is as it should be, you can marvel as mod_rewrite allows you to ignore the "still in development" part, and your shiny new site joins the ranks of those sites which parse static URLs into beautiful dynamic content.

 

 

Appendix A - Virtual Servers and a more complete apache2.conf file:

If you've gone to the trouble of installing and configuring mod_rewrite, you might as well milk it...the following allows one to define new hosts on a virtual server by creating a directory, literally...I make a /pub/www/hosts/[www].domain.tld directory and Apache serves it up to those who ask for it. For interested parties, there is a very extensive post in the Gentoo Linux forums which details extending this to mail, FTP, DNS and the like (incidentally, I think this is one of the most awesome server extensions ever) here.

You can see my implementation of this concept below

 

The relevant sections of apache2.conf in their entirety:

RewriteEngine On

#set RewriteLogLevel below 3 when not debugging
RewriteLog "/var/log/apache2/rewrite.log"
RewriteLogLevel 3

RewriteMap  lowercase  int:tolower

#change all occurrences of *.php*/varName/X to &varName=X
RewriteRule ^(.*)\.php(.*)/(.*)/(.*)$ $1.php$2&$3=$4 [N]

#change the *.php&* to *.php?*
RewriteRule ^(.*)\.php&(.*)$ $1.php?$2

#remap / to /catalog/
RewriteCond %{REQUEST_URI} !^/administration(.*)$
RewriteCond %{REQUEST_URI} !^/catalog(.*)$
RewriteCond %{REQUEST_URI} !^/phpmyadmin(.*)$
RewriteRule ^/(.*)$ /catalog/$1

#remap non-existent hosts to /pub/www/default
RewriteCond  !/pub/www/hosts/${lowercase:%{SERVER_NAME}} -d [OR]
RewriteCond  !/pub/www/hosts/www.${lowercase:%{SERVER_NAME}} -d
RewriteRule  ^/(.*)$  /pub/www/default/$1 [L]

#remap www.domain.tld to /pub/www/host/domain.tld
RewriteCond  /pub/www/hosts/${lowercase:%{SERVER_NAME}} -d
RewriteRule  ^/(.*)$  /pub/www/hosts/${lowercase:%{SERVER_NAME}}/$1

#remap domain.tld to /pub/www/host/www.domain.tld
RewriteCond  /pub/www/hosts/www.${lowercase:%{SERVER_NAME}} -d
RewriteRule  ^/(.*)$  /pub/www/hosts/www.${lowercase:%{SERVER_NAME}}/$1

# this log format can be split per-virtual-host based on the first field
LogFormat "%V %h %l %u %t \"%r\" %s %b" vcommon
CustomLog logs/access_log vcommon
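With those rules in place, bringing a new host online really is just a directory creation; no Apache reload required. A quick sketch (www.example.com is a placeholder hostname, and the docroot is parameterized so the commands can be dry-run safely as any user):

```shell
# Create the directory the vhost rules look for, drop in an index page,
# and the next request for that hostname is served from it.
# On the live box DOCROOT would be /pub/www; a temp dir is used here.
DOCROOT=${DOCROOT:-$(mktemp -d)}
mkdir -p "$DOCROOT/hosts/www.example.com"
echo '<html><body>It works</body></html>' > "$DOCROOT/hosts/www.example.com/index.html"
ls "$DOCROOT/hosts"   # -> www.example.com
```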

 

 

Appendix B - Caveats:

1) This can and will break stuff; see basename above.

2) This may require additional configuration for preexisting server software; for example, phpMyAdmin can't figure out where it is, and one needs to explicitly define its URL.

 

 

Ideas in progress / Todo / Great exercises for the reader to post here :)

The foremost direction I want to take this is a set of SQL queries which generate a mod_rewrite map that maps IDs to product names and IDs to category names (my index, products, and categories are where SE indexing most concerns me). This would allow one to specify the URL as text rather than numerically, which would rank pages slightly higher. Conceptually it should be easy to implement, but my current project of incorporating database-driven phrases has put this one on the back burner.
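As a rough sketch of the idea (the table and column names are from a stock osC schema, but the map file and rules here are hypothetical and untested):

```
# Generate a text map of url-safe product names to ids, e.g. from:
#   SELECT products_name, products_id FROM products_description;
# written one "name id" pair per line into /etc/apache2/products.map:
#   wireless-usb-headset 22
RewriteMap prodid txt:/etc/apache2/products.map

# A rule could then accept the name in the URL and map it back to the id:
RewriteRule ^/catalog/product/([^/]+)$ /catalog/product_info.php?products_id=${prodid:$1}
```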

The other big one, which is more complicated, is modifying URLs so that pages remain close to the root directory. This is difficult in the sense that there aren't a lot of characters to do it with; perhaps " " (%22) would work. I thought about ~, but I think it is reserved and would conflict with home dirs and whatnot. Any good ideas on how to handle this in a generic manner would be appreciated.

 

 

Conclusion:

mod_rewrite is awesome! I think its implications for SEO and the like are very impressive, and though it cannot address PageRank, it can and does handle those minor (or, in the case of getting indexed, fundamental) changes that allow one to do every little thing to make their site as optimized as possible.

Link to comment
Share on other sites

OK, I'll admit I only skimmed through this document; however, I did have a couple of questions.

 

1.) How is this implementation of mod_rewrite different from the standard OSC SEF switch?

 

2.) Since Internet Explorer now comes standard with cookies NOT enabled, do you think that the pop-up warning all customers using the latest versions of IE will get when you force cookie use will adversely affect sales?

 

3.) Since all major search engines now index dynamic URLs with '?' and '=' just as well as URLs without them, why do you think mod_rewrite is necessary?

 

4.) Since there is already a contribution that inserts the product name into the URL, why would you need to replace the product id? (I guess just to shorten the URL a little)

 

Thanks.

-------------------------------------------------------------------------------------------------------------------------

NOTE: As of Oct 2006, I'm not as active in this forum as I used to be, but I still work with osC quite a bit.

If you have a question about any of my posts here, your best bet is to contact me through either Email or PM in my profile, and I'll be happy to help.



1.) It is able to compensate for that nasty "(still in development)" status of the osC SEF switch, is able to handle multiple hosts, allows for more robust URL changes, and can be used to alter URLs on a site-by-site or system-wide basis. It is also vastly more efficient to do rewriting within Apache via apache.conf than within the site's PHP files (though this is not necessarily true if using .htaccess to define rewriting rules). Finally, there are miscellaneous bonuses like the ability to block malicious robots from spidering your site to gather email addresses, and all sorts of other bells and whistles out there if you look for them.
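For instance, blocking e-mail harvesters by User-Agent takes only a couple of lines (a sketch; the agent names are examples of harvesters that were circulating, not an exhaustive list):

```
# return 403 Forbidden to known harvester user-agents
RewriteCond %{HTTP_USER_AGENT} ^EmailSiphon [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^EmailCollector [NC]
RewriteRule .* - [F]
```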

But, in the interest of fairness, it suffers from the necessity of root access and more configuration than most webmasters might be able to do effectively. It is also specific to Apache, though there are similar tools for almost any flavor of webserver.

2.) I was not aware of this. The little work I do on Windows computers tends to be done within Mozilla. If this is true, yes, that is a bit of an obstacle; however, given the proliferation of both IE and cookies, I don't necessarily think this is the case. Perhaps Windows Update preserved your IE settings?

3.) There are a lot of search engines out there, some rather old but still getting a fair amount of use. While you could ignore the 25%, 15%, or 5% of traffic that would not be able to reach you, I would prefer that all traffic can and does reach me.

4.) Again, I was not aware of this (and due to the external nature of maps, they don't necessarily get the speed gains described in 1.). I will look into that contribution and the feasibility of incorporating it into this aspect of the writeup.

 

One last comment, feel free not to use it; there are squabbles all over the osC forums about search engine indexing and the like, and really the last thing I want to inspire is another. I don't work for Google, nor do I know how they operate, nor can I tell you definitively whether a default osC installation is indexable. Rather, I am looking to provide a means that guarantees a site to be indexable by Google and every other engine out there, and which will eventually expand to include code that patterns pages in the same manner an SEO company would.


Thanks for taking the time. Your post and answers were very thoughtful, thought-provoking, and well organized, and it was a pleasure to read them.

 

Just to clarify for everyone: you are saying that this SEF URL creator (for lack of a simpler term) is more universal, more efficient, and has a few more perks than the stock osC one.

 

My comments on your answers...

 

1.) Can you elaborate on how it might block unwanted spiders such as email harvesters?

 

 

2.) The latest versions of IE that I've worked with do not have cookies enabled, for whatever reason. I've also read an article (which I'll attempt to dig up) suggesting that this is a permanent switch.

 

3.) Good answer! :-) Most people are foremost concerned with the top 10 or so search engines, which all index dynamic content URLs fine now, but you are right, there are still a few obscure engines out there that are not compatible.

 

4.) The mod I refer to adds a "fake" parameter to the URL, for the sole purpose of getting the product name into the URL for an index-placement boost. The mapping is actually a better idea.

 

Finally, I have one further question. If someone is already indexed on all of the major search engines with dynamic content URLs, would you recommend that they use this mod?



1). Here and here are two documents which cover this in more detail than I can.

4). No, not really. I don't think some aspects of this guide (such as eliminating catalog/) could hurt placement, but placing content from your root directory 3 or 4 directories deep isn't going to help ranking. This first step is meant to handle indexing, and if that goal has already been accomplished to one's satisfaction, then nothing here is very pertinent.

That said, there are other things you could do with this module which would benefit page ranking...


  • 2 months later...

Update:

 

The site incorporating these suggestions was submitted (to Google) ~11/03.

 

Googlebot has indexed it twice since, and currently has approximately 60 pages, including an assortment of categories and products (though I will omit a link, you can see the results by googling for "allinurl:(URL from my profile omitting the 'http://www')" and then clicking the 'repeat with omitted results...' link).

 

The pages include the entire gamut...homepage and relatively static stuff, categories, and products. FYI, there are more than 60 pages on the site, but I think our PageRank proved problematic for a more complete index. Nothing in my logfile analysis seems to indicate any problems for Google other than low PageRank and a lot of "same-ol'".

 

OK, so part two of the promised series of posts is forthcoming. I took it upon myself to ensure that part two did not get my site banned, and as such it has been some time in the making. You can see (the implementation of) part two by viewing source on the various pages indexed. The titles generally list some, but not necessarily all, of the phrases the page is optimized for. So far I am very pleased with the results...an SEO company that has been working on one of our other sites for approximately six months, and is getting the majority of its traffic through LookSmart, is not competitive with the hits I got two months into this process. Time permitting, I will start a contribution with the code for part two and a corresponding thread within the next couple of weeks.


I do see you have many of your pages in Google. However, none of your pages have a PageRank > 0. It doesn't look like you are in the top ten for many of your keywords. Are you spreading your PageRank too thin? Did Google penalize you to a PR0?

 

Scott



I do not believe there is any penalization going on; I think, rather, that a "link:" search reveals the culprit (a lack of incoming links; building link weight was put on the back burner until very recently).

 

As for the keyword placement, some of the terms we are getting traffic from are listed below (position is omitted; the terms are ordered by traffic, so the first few are the best placed/most searched).

Generally, the terms reflect pages which have been optimized with specific, less competitive terms (those which have non-generic titles when searching with allinurl:). As the "optimization" consists exclusively of entering desired search phrases into a table, the results below are very promising, specifically because they indicate that those pages with corresponding database entries are doing well.

 

wireless usb headset

plantronics parts

wireless pc headsets

pc headsets wireless

usb headset adaptor

usb pc headset

usb wireless headset

usb headphones

nortel headsets

usb telephone headsets

voice amplifier with headset

plantronic vista

nortel headset

plantronics dsp100

replacement parts plantronics

plantronic headset m10

usb telephone headset

computer headset dsp

plantronics equipment

wireless headset usb

plantronics usb adaptor

plantronic accessories

usb pc headsets wireless

plantronics replacement parts

meridian headset


Hello!

 

I have milestone MS2 installed with some SE-friendly and a few HTML pages. I submitted my site to Google a few weeks ago, and luckily my site was indexed within a few days. Now, 3 weeks later, the majority of the site's pages are indexed.

 

Great news so far...but a few days ago I noticed several things which I worry might hurt our ranking:

 

1. When I try to open the source code with the URL, the source code doesn't pop up. If I remove the sid portion from the URL and refresh the page, I can look at the source code. Sometimes the source viewer pops up, but it is without any text.

There is no such problem with Mozilla.

 

2. Somehow the code breaks...my URLs sometimes show like this

http://www.ativasativa.com/index.php/cPath/2

instead of

http://www.ativasativa.com/index.php/cPath=2

 

 

3. It looks like Google isn't able to read the content of various pages. I have checked with various spider simulators...most of them read only the title and the link...no content at all.

 

 

I feel totally stuck here and would appreciate any input. Here is the link to my site:

www.ativasativa.com

 

Thank you


Is it common for ten to fifteen terms that were top 5 to just vanish on an update? If so, will they return during the update or an upcoming update, or will they be lost forever?

WebmasterWorld has this topic exhausted quite thoroughly...you might want to google for it or look through WebmasterWorld's forums; most of the time, the results do not reappear.

 

Generally, when you look at Google for any semi-competitive term, you see "natural rankings" which are a result of hard work, purchase, or manipulation, and with each update Google blocks some of these while others percolate up.

 

EDIT: "blocks" is a bad word; "penalizes" would be more accurate.



1: I am not sure I understand your question...with Apache you should be able to define the PHP mimetype to execute in the backend and then serve, but the details of how this is done depend on the Linux distribution / operating system.

 

2: This depends on making sure that all calls go through the tep_href_link function. If there are URLs which do not use the function, they will not be rewritten. I would use a tool such as 'linklint' to spider your site and see which pages are generating invalid links/404s/anything else undesirable.

 

3: It seems Google is reading (and caching) your pages fine; if you try something like a Google search for "(Cashmere) scarf to help you brave the chilly weather or an exotic handmade" (arbitrarily copied from your index), it gets the index page. If there is a page listed in an allinurl: search for which this does not happen, then please give me an example of it (note that some of the boxes in osC change contents (specials, reviews, etc.), and as such it is best to avoid those areas for test phrases).


Nalin...thanks very much for the detailed response.

 

1. (I am not sure I understand your question...with Apache you should be able to define the PHP mimetype to execute in the backend and then serve, but the details of how this is done depend on the Linux distribution / operating system.)

 

What I basically meant was that when I browse the osCommerce portion of my site in Internet Explorer, I am not able to view the source code (View->Source). This doesn't happen with Mozilla, though.

 

2. The code breaks...your response: (This depends on making sure that all calls go through the tep_href_link function. If there are URLs which do not use the function, they will not be rewritten. I would use a tool such as 'linklint' to spider your site and see which pages are generating invalid links/404s/anything else undesirable.)

 

I realized that I had turned on the SE-friendly URL option...this apparently changes the code.

 

3. My biggest worry was that Google wasn't able to look at page content, as was suggested by the "confusing spider simulators". You showed me that Google is actually reading the page content fine. This was my biggest worry, and I am very grateful to you for helping me figure this out.

 

Google has already indexed about 80 pages of my site...so hopefully everything is going to be fine.

 

Thank you again

Dumb_Question


Google has been penalizing sites for over-optimization quite a bit lately (since the November update, I've noticed many of my top keywords being removed because I link to the pages with the exact terms).

 

Changing words and sentence structure will help fix this.

 

construction hard hat

 

change to:

 

hard hat for construction

 

the 'for' is dropped and beats the penalty.


  • 2 weeks later...

I've got a question no one seems to be able to answer....

 

I can't get my header tags controller to work, as I don't know where to look to make changes. I tried working with header_tags.php, and it doesn't have anything for a lot of my pages, just one for default.php... Can anyone give me a list of pages that should be in there, or a link to a file showing me where I'd be able to edit my page titles and descriptions so I can submit my site to Google? This is in desperate need; no one can answer this, or no one tries....


The defaults are here.

/catalog/includes/languages/english/header_tags.php

 

They will be overridden by the title/description/meta tags in your Products page.

 

For category page header tags, you may need a separate contribution.

ibandyop


I know where they are, as I specified.

 

How do I decipher which page is which in header_tags.php, is my question?

If someone might take their copy, or a bare copy, of header_tags.php and paste it in here, highlight certain parts, and explain what they are and what I am to do with them, that'd be great, as that's the kind of help I'm looking for. I am not a master of PHP, and it is unknown to me.


Olorin,

I don't know if you are asking specifically about the header tags on my site, but these are done via stream buffering and not with osC code...basically, the page(s) are generated as a giant string, which I then do find-and-replace operations on (it was done in this manner so that minimal changes to the underlying code were necessary, and also to allow for more extensive changes than the header variables alone could provide). It is slower than something more embedded, but also more expandable and robust.

 

I am in the process of better documenting, making a contribution for, and ensuring MS2 compatibility of my underlying code; if you would like, I can post what I have thus far towards this end...


  • 3 months later...

Hi, I saw the change you made to osCommerce using mod_rewrite. It's a good change. Personally, I'd like to see product pages look like static HTML files. Remove all the category= or product=; it just adds to the "noise" of the URL. I would also prefer to see category IDs before the product IDs. To me, it just reflects the navigation that buyers have to go through to visit your product. Forming your URL in the same way couldn't hurt.

 

Free marketing for your web store is such an important component of having an ecommerce store. I would have switched to osCommerce a long time ago, but this has to be one of the blocks that makes me reconsider every time I think about switching... :(


Sun,

 

There is a 'search engine friendly URL' contribution which changes the URL to something like domain.com/store/category/2/product/23... (not exactly that, but similar). It cleans it up a bit.

 

Btw, are you sunk818 from AnandTech...the one who used to sell the remotes?


Hello,

I have static-HTML-looking page URLs, with category names and product names in the URL, and with the page name being the page title (for example: http://www.freeriderstores.com/category/T-...ga_G_Print.html). I'm not sure if it works yet from a ranking-improvement perspective (I only implemented it recently), but if anyone is interested in the code, I can release it as a contrib. (It's a bit messy, but it does work.)

Sam


What I basically meant was that when I browse the osCommerce portion of my site in Internet Explorer, I am not able to view the source code (View->Source). This doesn't happen with Mozilla, though.

 

...

 

Thank you again

Dumb_Question

This happened to me at work a few months back. When I asked our support staff, they said it was a known problem with IE. The solution was to delete my temporary internet files. Cured it, no problems :D

 

Regards,

Brian.


  • 4 weeks later...

Hello Sam,

I would be very interested; for two weeks I have searched for a contribution like yours. Please tell us where we can download it.

Many thanks

volupp


Archived

This topic is now archived and is closed to further replies.
