
Biggest osCommerce site



I have a client who just moved from Yahoo Stores to osCommerce. The site is on a dedicated quad Xeon with a gig of RAM and ATA RAID, with a separate MySQL server that has nothing else on it; this store is the only database on that server.

 

The client's store opens every 6 weeks for 2 days. For the first 6 hours after opening they get an average of 100,000 pageviews per hour. I haven't looked at how many queries MySQL is handling during this time, but I will at the next opening. Apart from the lousy amount of RAM, is there any reason to believe that osCommerce is too heavy a codebase to handle this initial load? Right now the store is extremely slow. I realize the RAM is a major issue, and some mods have been added which may be adding queries to the database. But how much can an out-of-the-box osCommerce install handle?

 

thanks


It's not an osCommerce problem - it's really down to how fast the server is, how much RAM it has, and how many concurrent connections the MySQL server can handle. If he already has a very large database he may want to trim out deadwood, or at least 'optimise' it.

 

Vger


There is a contribution by Daemonj (Jim Daemon), plus a thread in Tips, on caching the MySQL queries - this should be a good start to help handle the load.
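
Daemonj's contribution is the thing to install, but the core idea looks roughly like this - a minimal sketch only (not the contribution's actual code) of memoizing repeated SELECTs for the life of one page build; tep_db_fetch_all_cached() is a made-up helper name:

 function tep_db_fetch_all_cached($query, $link = 'db_link') {
   global $$link;
   static $cache = array();

   if (!isset($cache[$query])) {
     // run the query once and keep the fetched rows; repeats of the same
     // SELECT on this page are then answered from the PHP array
     $rows = array();
     $result = mysql_query($query, $$link) or tep_db_error($query, mysql_errno(), mysql_error());
     while ($row = mysql_fetch_array($result, MYSQL_ASSOC)) {
       $rows[] = $row;
     }
     $cache[$query] = $rows;
   }

   return $cache[$query];
 }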

 

Our next release, MS3, has a new database class and other enhancements that will greatly improve performance :D

 

Matti


When will MS3 be out, and how hard is the conversion going to be?

 

They sell clothing samples: www.lulushoponline.com, and it's open for the next few hours.

 

"how many concurrent connections can be handled by the mysql"

Is this a function of how fast the server is? How many queries can MySQL (the program) handle per second/minute/hour? Any ideas on that? I've googled but haven't found a good answer.

 

One more question: since this store has some modifications, I would like to know how many queries are required by an out-of-the-box install to view a product page. Then I can compare with what I have now.

 

thanks for the fast replies.


One thing you didn't state was the backbone connection the server has to the internet. Also, if the customer has a separate system dedicated to the web site, why didn't you put osC and MySQL on the same system? Even though it is ever so slight, there is a time delay between the two systems; if each connection adds, say, a 1 microsecond delay, and you get 100,000 page views per hour with, say, 10,000 customers doing this, that delay between systems can add up.

Also, are you using the MySQL small, medium, or large .cnf file?


I am in the process of finding out all the server specs from the client - both for the main server, some of whose components I listed above, and for the DB server, which I won't know anything about until the morning. I specifically asked about the connection between the servers. Hopefully it's fiber between the two, but who knows - it could be a dusty Belkin 10/100. I don't know much about the host. I hope it wasn't the host that recommended a quad Xeon server with a measly gig of RAM; that's not inspiring.

 

MySQL was just moved to its own server because the previous tech/programmer suspected it was having to work too hard to display pages and make the queries.

 

27 page views a second is what I suspect their average is during the first 6 hours. If osCommerce queries MySQL 100 times per page (just a figure I read, I haven't tested it yet - am I way off?) that's 2,700 queries per second. Can MySQL/osCommerce handle that?
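
For reference, the arithmetic behind those figures works out to roughly:

 100,000 pageviews/hour / 3,600 seconds   ≈ 27-28 pages/second
 27-28 pages/second x ~100 queries/page   ≈ 2,700-2,800 queries/second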


I am not a MySQL expert, but from what I've been reading I suspect that MySQL is not the issue. Although I did read a few forums where they recommend Oracle 9 and Java for stores this busy - let's not get into that argument, though.


Rather than me explaining it, go to mysql.com and read up on the sample configuration files that come with the server - they are sized based on expected activity, etc.


Make sure you optimize the code for queries before going live. A vanilla install with the default specials and featured products will run an unbelievable ~60-75 queries per page! This will cripple your server at the traffic volume you're talking about.

 

Disable the non-essential features (like banners, category counts, etc.) and find your own way of optimizing the tep_get_tax_rate() function. The MS3 release does a great job of introducing a tax class that greatly reduces the query load, but the MS2 code base is a beast. I chose to put the tax rate into the session and save the queries. A more elegant method would be to emulate the tax class of MS3 - either way, it's your choice.
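
For what it's worth, the session approach can be as simple as something like this - a rough sketch only, not my exact code, assuming the stock MS2 tep_get_tax_rate() signature; tep_get_tax_rate_original() is a hypothetical name for the renamed stock function that still runs the queries:

 function tep_get_tax_rate($class_id, $country_id = -1, $zone_id = -1) {
   if (!isset($_SESSION['tax_rates'])) $_SESSION['tax_rates'] = array();

   $key = $class_id . ':' . $country_id . ':' . $zone_id;
   if (!isset($_SESSION['tax_rates'][$key])) {
     // cache miss: fall back to the original (renamed) lookup and remember the rate for the session
     $_SESSION['tax_rates'][$key] = tep_get_tax_rate_original($class_id, $country_id, $zone_id);
   }

   return $_SESSION['tax_rates'][$key];
 }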

 

I would suggest creating some simple output code so you can work more efficiently. For example, here is one of my client sites which is down to about 25 queries on the index page (fully loaded with specials, featured products, affiliate, vBulletin integration, articles, GV / Coupons, etc.) => CLIENT SITE.

 

Skip down to the bottom and it outputs the contents of the COOKIE and SESSION globals, plus a query count, and it also lists EACH QUERY executed.

 

When I first installed the output code, the number of redundant queries absolutely amazed me (yes, the query cache is enabled on his server).

 

BTW, to turn off the output simply use the URL request ?output=0

 

If you want my output code I'll send it over...

 

Good luck!

 

Bobby


I'll get the specs for his dedicated server later, but I know it's in the upper 2+ GHz CPU range, with 1 GB+ RAM and a 10 Mbps connection to a 1 Gbps uplink.

 

How did you get 36 queries? Did you add a few things to the shopping cart? A new customer on the index page should see a query count of about 25...

 

Also, keep in mind that those numbers are a little slow right now, since the data center where he colos the server is under a DDoS attack (I know because I also colo my server at the same data center!).

 

Bobby


Got 36 queries by clicking on a product from the index page. It was 25 or 26 on your index page.

 

Can you post that output code too, please?


In catalog/application_top.php place this code at the top:

  $qnum = 0;  // running count of queries executed on this page
  $qs = '';   // accumulating list of the queries themselves

 

In catalog/application_bottom.php place this code above the tep_session_close() call:

 if ($_REQUEST['output'] == '0') $_SESSION['output'] = '0';
 if ($_REQUEST['output'] == '1' || $_SESSION['output'] == '1') {
   $_SESSION['output'] = '1';
   echo '<pre><b>COOKIE INFO:</b><br>';
   print_r($_COOKIE);
   echo '</pre>';
   echo '<hr>';
   echo '<pre><b>SESSION INFO:</b><br>';
   print_r($_SESSION);
   echo '</pre>';
   echo '<hr>';
   echo '<p>' . $qs . '</p>';
 }

 

In catalog/includes/functions/database.php replace these functions with this code:

[ changing tep_db_query() and tep_db_perform() ]

  function tep_db_query($query, $link = 'db_link') {
   global $$link, $qnum, $qs;

   if (defined('STORE_DB_TRANSACTIONS') && (STORE_DB_TRANSACTIONS == 'true')) {
     error_log('QUERY ' . $query . "\n", 3, STORE_PAGE_PARSE_TIME_LOG);
   }

   $result = mysql_query($query, $$link) or tep_db_error($query, mysql_errno(), mysql_error());
   $qnum++;                                              // count this query
   $qs .= 'Query #' . $qnum . '<br>' . $query . '<br>';  // and record it for the page-bottom output
   if (defined('STORE_DB_TRANSACTIONS') && (STORE_DB_TRANSACTIONS == 'true')) {
      $result_error = mysql_error();
      error_log('RESULT ' . $result . ' ' . $result_error . "\n", 3, STORE_PAGE_PARSE_TIME_LOG);
   }

   return $result;
 }

 function tep_db_perform($table, $data, $action = 'insert', $parameters = '', $link = 'db_link') {
   reset($data);
   if ($action == 'insert') {
     $query = 'insert into ' . $table . ' (';
     while (list($columns, ) = each($data)) {
       $query .= $columns . ', ';
     }
     $query = substr($query, 0, -2) . ') values (';
     reset($data);
     while (list(, $value) = each($data)) {
       switch ((string)$value) {
         case 'now()':
           $query .= 'now(), ';
           break;
         case 'null':
           $query .= 'null, ';
           break;
         default:
           $query .= '\'' . tep_db_input($value) . '\', ';
           break;
       }
     }
     $query = substr($query, 0, -2) . ')';
   } elseif ($action == 'update') {
     $query = 'update ' . $table . ' set ';
     while (list($columns, $value) = each($data)) {
       switch ((string)$value) {
         case 'now()':
           $query .= $columns . ' = now(), ';
           break;
         case 'null':
            $query .= $columns . ' = null, ';
           break;
         default:
           $query .= $columns . ' = \'' . tep_db_input($value) . '\', ';
           break;
       }
     }
     $query = substr($query, 0, -2) . ' where ' . $parameters;
   }
   // no separate count needed here - the tep_db_query() call below already logs this query
   return tep_db_query($query, $link);
 }

 

Save the files, upload, and activate the output by adding ?output=1 to any URL. Once it is activated it is stored in the session, and every page will show the output at the bottom ONLY in your browser (customers will not see it). To deactivate, add ?output=0 to any URL.

 

Enjoy...

 

Bobby


Why did you not just enable logging, show parse time, and store queries?

 

One modification that you can make is to the catalog/includes/functions/database.php file for the tep_db_query function.

 

Store the parse time for each query; then, when looking through the log file, you can determine which queries are slowing your site down.
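
Something along these lines, for example - a sketch only, not the stock function, and it assumes PHP 5's microtime(true):

 function tep_db_query($query, $link = 'db_link') {
   global $$link;

   $start = microtime(true); // PHP 5+; on PHP 4 use explode(' ', microtime()) and add the parts

   $result = mysql_query($query, $$link) or tep_db_error($query, mysql_errno(), mysql_error());

   // write the elapsed time next to each query so the slow ones stand out in the log
   if (defined('STORE_PAGE_PARSE_TIME_LOG')) {
     error_log(sprintf("%.4fs %s\n", microtime(true) - $start, $query), 3, STORE_PAGE_PARSE_TIME_LOG);
   }

   return $result;
 }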

 

Also, you should definitely create indexes on all join and order-by fields if you are expecting that kind of traffic.
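
For example (illustrative only - the table and column names below are from a stock MS2 schema; pick the columns that actually show up in your own slow queries):

 // run these once, e.g. from a throwaway admin script or the mysql client, then remove
 tep_db_query("alter table products add index idx_products_date_added (products_date_added)");
 tep_db_query("alter table products_description add index idx_products_name (products_name)");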

 

Using persistent connections goes without saying, and using anything more than the medium .cnf MySQL configuration file on that server will cause excessive paging.
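
As a reminder, on a stock MS2-level install persistent connections should already be a one-line switch (assuming the usual configure.php layout):

 // catalog/includes/configure.php - with this set, tep_db_connect() uses mysql_pconnect() instead of mysql_connect()
 define('USE_PCONNECT', 'true');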

 

Your database server should have 2GB or more for that kind of traffic.

"Great spirits have always found violent opposition from mediocre minds. The latter cannot understand it when a man does not thoughtlessly submit to hereditary prejudices but honestly and courageously uses his intelligence." - A. Einstein


The reason why I chose that route, instead of storing it all in the database (and then creating a report to view the data), is that the output had to be on a per-page basis and in real time.

 

The purpose of creating the small output code was to figure out how each page could be optimized. This way I could output the query info on a per-page basis, look for redundant queries, and correct them. For instance, I found that a simple JOIN in one function would eliminate 12 queries. That kind of information would be hard to decipher from stored data.
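
To illustrate the kind of change (a made-up example, not the actual client code): instead of fetching each product's name with its own query inside a loop, the name can be joined into the listing query up front:

 // before (sketch): one extra query per product, inside the listing loop
 //   select products_name from products_description where products_id = '...' and language_id = '...'
 // after (sketch): the name is joined into the single listing query
 $listing_query = tep_db_query("select p.products_id, p.products_price, pd.products_name
                                  from products p
                                  left join products_description pd on pd.products_id = p.products_id
                                 where pd.language_id = '" . (int)$languages_id . "'");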

 

See my logic?


  • 3 weeks later...

I just released a page_cache contribution that should bring your per-page load time down to about .1 seconds and 10 queries. It is optimized for large-volume shops and offers the best of both worlds: dynamic info, but static-page server load.
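
The readme has the real details, but the core idea is roughly this - a bare-bones sketch (not the contribution's actual code) of buffering the generated page and serving the cached copy on later hits; the path and lifetime are made up, and unlike the real contribution this sketch makes no attempt to re-inject the dynamic bits such as the cart box:

 // very early in the request (e.g. near the top of application_top.php)
 $cache_file = DIR_FS_CATALOG . 'cache/' . md5($_SERVER['REQUEST_URI']) . '.html';
 $cache_life = 3600; // seconds

 if (file_exists($cache_file) && (time() - filemtime($cache_file)) < $cache_life) {
   readfile($cache_file); // serve the stored copy and skip almost all of the queries
   exit;
 }
 ob_start(); // otherwise capture the normally generated page...

 // ...and at the very end of the request (e.g. in application_bottom.php) save it:
 $fp = fopen($cache_file, 'w');
 fwrite($fp, ob_get_contents());
 fclose($fp);
 ob_end_flush();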

 

What more can you ask for with large stores?

 

Readme file with live demos, installation info, etc.

 

Contribution Announcement


Archived

This topic is now archived and is closed to further replies.
