thewesterndepot Posted March 31, 2015

We run an osCommerce-powered site on our own dedicated server. In April of last year we were forced to upgrade the server, and at around that point we started to notice seemingly random outages where the site could not be viewed from any computer; the only fix at the time was a hard restart of the server. That went on for about a month. That problem has been solved, but now we face a different challenge: the website slows down seemingly at random. Sometimes a page loads very quickly, other times it takes up to 2 minutes. We cannot find any pattern. The website is www.westerndepot.com.

We have looked at site health with YSlow and similar tools, but every check we run rates the site an "A". It appears to be the waiting (server response) times that fluctuate rather than the actual download times.

The server has 64GB of RAM, with current maximum RAM usage showing 10GB. We are running SME Server 9.0 on dual WD Caviar Black 1TB drives mirrored with software RAID, with 833GB free. The products_to_categories table has 18375 entries for 13923 products, of which 2427 products are inactive. There are 385 categories in total.

Our MySQL configuration file is as follows:

character-set-server=utf8
collation-server=utf8_general_ci
key_buffer_size=2048M
preload_buffer_size=512M
query_cache_limit=64M
query_cache_size=512M
query_cache_type=1
query_prealloc_size=512M
read_buffer_size=2M
read_rnd_buffer_size=4M
sort_buffer_size=2M
thread_cache_size=300
join_buffer_size=256K
table_open_cache=512K
tmp_table_size=256M
max_heap_table_size=256M
slow_query_log=1
max_connections=500
concurrent_insert=2
log_output=FILE

The following is an example from the slow query log.
# Query_time: 12.277704  Lock_time: 0.000193  Rows_sent: 1  Rows_examined: 39652497
SET timestamp=1427835801;
select count(distinct p.products_id) as total
from products p
left join manufacturers m using(manufacturers_id)
left join (select * from (select products_id, specials_new_products_price, expires_date, status
           from specials where status = 1
           order by products_id, specials_new_products_price, expires_date) as t
           group by products_id) as s on p.products_id = s.products_id
left join products_to_notes ptn on ptn.products_id = p.products_id
join products_description pd on p.products_id = pd.products_id
join (select products_id from products_to_categories
      where (categories_id not in (728))
      group by products_id) as p2c on p.products_id = p2c.products_id
where p.products_status = '1';

# Query_time: 11.654512  Lock_time: 0.000367  Rows_sent: 240  Rows_examined: 36425129
SET timestamp=1427835812;
select distinct m.manufacturers_name, p.products_model, p.products_image, p.image_folder, p.image_display,
       p.products_quantity, p.hide_qty_over, p.products_bundle,
       IF(s.status, s.expires_date, NULL) as expires_date,
       m.manufacturers_id, p.products_id, pd.products_name, p.products_msrp, p.products_price,
       p.products_tax_class_id,
       IF(s.status, s.specials_new_products_price, NULL) as specials_new_products_price,
       IF(s.status, s.specials_new_products_price, p.products_price) as final_price,
       p.sold_in_bundle_only, p.products_type, pd.extra_value_id6, pd.extra_value_id10
from products p
left join manufacturers m using(manufacturers_id)
left join (select * from (select products_id, specials_new_products_price, expires_date, status
           from specials where status = 1
           order by products_id, specials_new_products_price, expires_date) as t
           group by products_id) as s on p.products_id = s.products_id
left join products_to_notes ptn on ptn.products_id = p.products_id
join products_description pd on p.products_id = pd.products_id
join (select products_id from products_to_categories
      where (categories_id not in (728))
      group by products_id) as p2c on p.products_id = p2c.products_id
where p.products_status = '1'
order by products_name
limit 2160, 240;

All the slow queries come from the advanced search. The site uses the Multiple Sales per Product contribution. Any thoughts? Did we miss something simple? Any help, directions, thoughts, or contacts would be greatly appreciated.
MrPhil Posted March 31, 2015

Have you checked if there are any background tasks running, such as backups, tune-ups, etc.? That would be a common reason for irregular slowdowns, but would usually be at fixed times of day. Could any tasks be triggered by load or performance limits? Those could account for slowdowns at unexpected times. Unless you're getting hammered by visitors (including bots) at an irregular rate, I would think that osC itself would run at fairly constant performance. You might want to look at user load, especially robot visits, and consider blocking most robots to see if that helps.
BrockleyJohn Posted April 1, 2015

All that Phil says is true, but your queries look massively suboptimal to me. If they go through 40 million rows, they are likely to take a while.

I don't know if a single contribution is to blame - I suspect someone has implemented a combination of them (I can see evidence of others, including bundled products). It's not easy to figure out exactly what these queries are trying to achieve, but they clearly aren't going about it in the best way.

In both queries there's a sub-sub-select on the specials table, grouped by product id, and I can't see that it achieves anything over the usual join. Both queries also go off to find all the products that are in any category other than (or as well as) 728 - and do it in a very roundabout way. I don't have any experience of the multiple sales per product contribution, but I can't imagine why a shop customer would want a list of stuff outside a list of categories. And even if they do, I wouldn't get it like that.

There may be things you need to address on the server, but I'd start by trying to untangle the code in the advanced search.
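As an illustration of the more direct approach (a sketch only, not the contribution's actual code): the derived-table join on products_to_categories could probably become a plain join with the category filter in the join condition, since the existing count(distinct ...) already deduplicates products that sit in more than one category.

-- Hypothetical rewrite of the first logged query's category handling.
-- The left joins to manufacturers, specials and products_to_notes are dropped
-- because they cannot change a count of distinct product ids.
select count(distinct p.products_id) as total
from products p
join products_description pd
  on p.products_id = pd.products_id
join products_to_categories p2c
  on p2c.products_id = p.products_id
 and p2c.categories_id not in (728)
where p.products_status = '1';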
BrockleyJohn Posted April 1, 2015

Afterthought - the specials handling will cater for more than one active special per product; I guess that's what the contribution you mention does. There may be a better way to write it - would a join of a select from specials on products_id with a limit 1 be faster to execute than a select-of-a-select with a group by? If there are active specials for inactive products, you don't want to process them at all. And even if there aren't, the current query goes off, builds a list of all the active specials and groups them by product, and does so for each active product. Sorting this out will speed up your queries. However, I still don't understand the categories.
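One possible shape for that specials rewrite - a sketch only, and it assumes the page just needs the lowest active special price per product (the original order-by-then-group-by trick relies on "first row per group" behaviour that MySQL does not guarantee). Note the expiry date below is the latest one for the product, not necessarily the one belonging to the cheapest special, so check that against what the contribution actually needs.

-- Hypothetical replacement for the nested specials sub-select:
-- a single aggregated pass over active specials, one row per product.
select p.products_id,
       p.products_price,
       IF(s.status, s.specials_new_products_price, p.products_price) as final_price,
       s.expires_date
from products p
left join (
    select products_id,
           min(specials_new_products_price) as specials_new_products_price,
           max(expires_date) as expires_date,
           1 as status
    from specials
    where status = 1
    group by products_id
) as s on p.products_id = s.products_id
where p.products_status = '1';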
thewesterndepot Posted April 1, 2015

Thanks for the responses. I talked with our head IT guy and he sent me the following:

- Backups are scheduled to run at times when the server isn't being used much. I don't know of any sort of Linux tune-ups that would be scheduled.
- AWStats runs every 10 minutes and takes all of 3 seconds to do its thing.
- Sysmon runs every 5 minutes and takes even less time per run than AWStats. CPU usage shown by Sysmon peaks at about 20%, and total system load at about 2.3%.

With regard to BrockleyJohn's comments: we have been starting to think that the queries are the problem as well (if not the whole problem, then certainly a large proportion of it). I will certainly point our IT manager at the advanced search. Any further thoughts or resources would be greatly appreciated.

The main thing that confuses me at this point is the seeming randomness of the lags. Sometimes a page loads quickly and sometimes it has a 10+ second wait, and the extended waits happen on both searches and ordinary page loads without any real pattern. Also, does anyone know of any reputable firms or individuals who handle this sort of problem?
♥toyicebear Posted April 1, 2015

Have you checked the server's visitor logs against the "lag" times? Many visitors and/or an over-eager bot/spider can cause issues, especially when using MyISAM.
♥bruyndoncx Posted April 1, 2015

This is a concurrent-user problem at the database level: only a certain number of long-running queries can be handled at the same time. I have seen this issue on our system. Even a trivially short query can be blocked, because only so many queries can execute concurrently. If you look at the different user processes running on the database, you can see which one is blocking which, and brute-force kill the long-running process. I believe this kind of situation is sometimes referred to as a "race condition".

Solutions:
1) Sort out those queries listed above. If they are from advanced search, why isn't there any real search string listed?
2) Avoid having to do a database query at all - I have APC caching set up for the configuration data, as it rarely changes, and also for the different sections on the home page.
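A minimal sketch of doing that from the mysql command line; the id in the KILL statement is just a placeholder for whatever the process list actually shows.

-- List current connections and the statement each is running; long "Time"
-- values on SELECTs against the catalog tables are the ones to look at.
SHOW FULL PROCESSLIST;

-- If one runaway query is holding everything else up, it can be killed by id.
-- 12345 is an example id only - take the real one from the processlist output.
KILL 12345;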
♥bruyndoncx Posted April 1, 2015

Also run OPTIMIZE TABLE so the index statistics are correct, and consider adding further indexes. Running the query with EXPLAIN and looking for full table scans will point you in the right direction for where to optimize. Ideally, though, you need someone with good (My)SQL skills to rewrite the queries in the PHP file and possibly tune your database further.
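For illustration, roughly what those steps look like; the table list and the index at the end are examples of the kind of thing that might help, not a confirmed fix for this particular shop.

-- Rebuild the tables and refresh index statistics (MyISAM).
OPTIMIZE TABLE products, products_description, products_to_categories, specials;

-- Prefix the slow statement with EXPLAIN and look for joins of type ALL
-- (full table scans) with large "rows" estimates.
EXPLAIN
select count(distinct p.products_id) as total
from products p
join products_to_categories p2c on p2c.products_id = p.products_id
where p.products_status = '1'
  and p2c.categories_id not in (728);

-- Example only: an index that could let the category filter use an index lookup
-- instead of scanning products_to_categories.
ALTER TABLE products_to_categories
  ADD INDEX idx_p2c_categories (categories_id, products_id);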
♥bruyndoncx Posted April 1, 2015

Possibly also disallow robots from hitting the search pages.
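For example, a minimal robots.txt sketch along those lines; the paths assume the stock osCommerce advanced search pages sit at the catalog root, so adjust them to the shop's actual URL layout.

User-agent: *
Disallow: /advanced_search.php
Disallow: /advanced_search_result.php

Well-behaved crawlers will honour this, but as noted later in the thread, some bots ignore robots.txt entirely.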
BrockleyJohn Posted April 2, 2015

And if you're running an old osCommerce 2.2 store, make sure that the configuration setting 'Show Category Counts' is set to false (admin / Configuration / My Store).

Incidentally, the first query is taking all that time just to find out how many active products there are in a category other than 728. It can probably be avoided altogether: if you're about to go and get the products anyway, just count how many you found afterwards. If it really is needed, the left joins in it are not, as they make no difference to the outcome. This query will give exactly the same result:

select count(distinct p.products_id) as total
from products p
join products_description pd on p.products_id = pd.products_id
join (select products_id from products_to_categories
      where (categories_id not in (728))
      group by products_id) as p2c on p.products_id = p2c.products_id
where p.products_status = '1'
thewesterndepot Posted April 8, 2015

By working on the indexes used by the advanced search we were able to bring the rows examined down from ~40,000,000 to ~14,000. That is certainly helping a little, but we are still getting the hangups. Part of the problem was a Googlebot hitting the advanced search without any search terms (running through the 40,000,000 rows at random times). We are also thinking of changing the sessions table and the whos_online table to InnoDB so that they have auto-repair (crash recovery) capabilities, at least as I understand it.
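If you do go that route, the conversion itself is one statement per table - a sketch assuming the standard osCommerce table names, and worth backing the tables up first.

-- Move the two session-related tables to InnoDB so a crash no longer leaves
-- them flagged for a MyISAM repair.
ALTER TABLE sessions ENGINE=InnoDB;
ALTER TABLE whos_online ENGINE=InnoDB;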
MrPhil Posted April 8, 2015

A bot has no business running searches. Do whatever it takes to keep them out of the search pages. Google will certainly obey robots.txt, but other search engines may be ill-behaved and ignore it. It should also be possible to catch search requests with no terms and return some dummy page, rather than triggering a huge database operation. What does manually invoking an advanced search with no terms do on your site?
thewesterndepot Posted April 8, 2015

Previously, running a manual advanced search with no terms returned the whole inventory. That has been stopped; now it just returns to the search page without actually searching anything. Since that change we have had no huge lags. Hope it sticks.