profundiscom Posted September 8, 2005

I'll apologize first, because I'm sure this is addressed someplace very obvious (the man pages, whathaveyou), but can someone tell me why I have no trouble setting up TINY (1-2 MB) downloads, but run into HUGE problems the moment I try to serve a DivX or large WMV? (200+ MB = timeout, blank page.)

Potential fixes I have heard so far:
1. It's the php.ini settings (possibly)
2. It's the download.php file (probably some truth to this; die() not set right?)
3. Can be fixed by modifying an .htaccess file (doubt it)

Any quick help or pointers to the correct thread are indeed welcome. Nope, not a noob, but I feel like one after looking for 3 hours for a simple answer I can't find. I'll make an additional $10 donation to Louisiana flood relief if my post is answered today. Thanks all.
Vger Posted September 8, 2005

There is a limit on file downloads served through PHP, set in php.ini (usually 8 MB). There is also a PHP max execution time, also set in php.ini, usually 30 seconds. Even if you have access to a local php.ini file and raise those limits, a 200 MB download and the time it would take would:
1. Use up resources on a shared server, so that other sites would stop responding altogether, or at the very least slow right down.
2. Draw the attention of your hosting company!
You really need, at the very least, a Virtual Dedicated Server. I believe that's a $10-donation post!
Vger
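For reference, the limits Vger describes correspond to standard php.ini directives. A sketch of raised values (the directive names are the standard PHP ones; the numbers are examples only, not recommendations, and should be tuned to the server):

```ini
; php.ini - example values only; adjust for your own server
memory_limit = 256M        ; a script that reads a whole file into memory needs at least the file size
max_execution_time = 3600  ; seconds a script may run; a 200 MB download over a slow link takes a while
post_max_size = 20M        ; upload-side limit, often confused with download limits
upload_max_filesize = 20M  ; likewise only affects uploads
```

Note that if the download script sends the file in small chunks rather than reading it whole, memory_limit matters much less; max_execution_time is usually the one that kills large transfers.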
profundiscom Posted September 9, 2005

Not denying the $10 donation. I'll do it during the TV relief program. But back to the PHP. I'm aware of the PHP/Apache server memory difficulty, but I am entirely unconcerned about bandwidth or what the hosting company thinks. We're running the server ourselves, and the potential profit is of more concern than follow-up costs. The server slowdown would, however, be unacceptable. So the dedicated server is one solution. Someone said I should look for a more DRM-based package: serve the actual files out through the web server, but encrypted, and then just sell activation codes. BUT, hey, I really like osCommerce, as it's an easy setup, pretty simple to customize, and has a good support/developer network. Now, the worst part is, I would swear somebody is aware of this and has a fix or workaround. A script? A config setting? Dunno. But if you have the solution for using osC for DivX downloads, see the following message IDs for my suspected proof. I may wind up contacting these people eventually, if nobody has a solution.
http://www.oscommerce.com/forums/index.php?showtopic=169217&hl=
http://www.oscommerce.com/forums/index.php?showtopic=165980&hl=
Vger Posted September 9, 2005

If this is your own dedicated server, then increase the max execution time and file limit in php.ini and see what happens. If it's not your own dedicated server then, like I said before, I'd recommend a Virtual Dedicated Server (a small box plugged into a server which gives you your own chip and memory, isolated from the rest of the server). A VDS gives you full root access to the httpd.conf and php.ini files.
Vger
profundiscom Posted September 11, 2005

V - Thanks for the intelligent solutions. I'll do some preliminary tests with the setup I have (with the change in the php.ini file), and if/when it works, I'll likely consider a dedicated solution exactly like you mentioned. Now if I were only sitting on top of my own T3, it would be a non-issue. I wonder how much my provider will gouge me for? So, a little disheartened, but I will still look for a potential workaround for use with a single server. A friend of mine and I are also considering working up a PHP-based solution, to keep the resource requirements down. We'll post it if it works. I'll assume it doesn't already exist, or that others have already tried and failed. BTW, I appreciate your expertise. Can you point me to a "can't make .zip file downloads work correctly" thread? I seem to have no trouble sending out raw QuickTime and WMV files, but zips are a thorn in my nugs. :blink: Haven't tested .sit-based stuff, but who uses that? (Kidding!) Still love my Mac... and my XP... and my SUSE... etc. Maybe not always necessarily in that order, when we start talking software. Thanks for your help. Feel free to post a follow-up or let this one die. I'll still be looking out for my .zip solution, but any help is always appreciated.
Guest Posted September 11, 2005

I have no problems with large files; however, I am managing my own servers. I have both Mac and Windows files, as well as *nix files.
profundiscom Posted September 12, 2005

JO - I appreciate your comment, but you left out one little detail: was there something you had to do to make it work with the large files, or did it just work "out of the box" for you? Because it sure didn't for me. And while you should be rightfully proud, how about sending me the link to the site where people are able to download these large files you are selling? I do some 'adult' content at my end, so I'm not terribly concerned if that was why you didn't include a link. (Does the board have any rule about not posting links, adult or otherwise?) So did you modify the php.ini file? Did you set up an additional server like Vger mentioned? What type of files are you serving? Video? Applications? Any details are helpful. Thank you for contributing to my thread, but I NEED MORE INFO. Thanks in advance.
profundiscom Posted September 12, 2005

I GOT IT! Yes, during the test, changing the php.ini file in my /etc/ directory takes care of the problem. The server seems to be doing OK, too. I just multiplied the resource limits by x. I'm still working on a script that does a better job with the server load. Or I may still take some other advice and make the files smaller. Special thanks to Vger and Mibble for addressing my rants. (Will still likely be taking Vger's advice in the near future in terms of a dedicated server. Smart ideer.)
Sierrab Posted September 12, 2005

I had a similar problem with large files. By setting downloading by redirect, I believe you avoid all the limitations that PHP imposes. Secondly, I substituted the following for the regular download expression at the end of the file:

$chunksize = 32 * 1024; // send the file in 32 KB chunks, so PHP never holds the whole file in memory
$buffer = '';
$handle = fopen(DIR_FS_DOWNLOAD . $downloads['orders_products_filename'], 'rb');
if ($handle === false) {
    echo 'ERROR...';
    die;
}
while (!feof($handle)) {
    $buffer = fread($handle, $chunksize);
    echo $buffer;
}
fclose($handle);

Steve
profundiscom Posted September 13, 2005

SB - This is EXACTLY the type of thing I'm looking for. I'll test this out in the next 24 hours and let you know how I make out. If you could monitor my thread for a few days in case I have additional questions, it would sure help. (You're absolutely right that I'm not doing it by redirect. It gave me some trouble at first, but I'll iron it out now that you say this method is acceptable.) Much, much, much obliged. :thumbsup:
profundiscom Posted September 18, 2005

Argg! Download by redirect is not working with my LAMP system. I am getting: "Forbidden. You don't have permission to access /catalog/pub/xtempfilename/xfilename on this server. Additionally, a 403 Forbidden error was encountered while trying to use an ErrorDocument to handle the request." OK, I'll be on the forum searching for the answer to this. In the meantime, if anyone wants to volunteer ideas for this or the other problems, I am much obliged. They are all related to making the osC system work with large downloads, so I didn't feel compelled to begin a new thread. The system is working, but not in download-by-redirect mode, and it also seems to have a problem with Windows machines, which won't save the file properly. "Save target as" and the plain download link both yield a properly named, but ultimately empty, file.

To recap the problems:
1. Doesn't work with download by redirect, which is, apparently, key to making large downloads work. See the error above. Suggestions? I swear I have all my permissions opened up for testing, so I thought it wouldn't be a problem.
2. When it does work (not by redirect), the file will not save to disk properly using Windows XP. It seems to work on a 95 machine (I have one handy), but it saves an additional checkout_success.php file in the process and prompts for a file name. I don't have these problems with either Mac or Linux.

I'll be pretty researchy in the meantime. Thanks to all who help. Incidentally, SB: does that expression go into the download.php file? If so, can you be a little more specific in terms of placement? That would be very helpful.
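For anyone hitting the same 403: osCommerce's download-by-redirect works by creating a temporary symlink to the real file under catalog/pub/ and redirecting the browser to it, so this error is commonly Apache refusing to follow symlinks in that directory (or the pub directory not being readable by the web-server user). A sketch of the Apache side; the directives are standard Apache, but whether your host permits them via .htaccess depends on the server's AllowOverride setting:

```apache
# .htaccess in catalog/pub/ (or the matching <Directory> block in httpd.conf)
# Download-by-redirect serves the file through a temporary symlink,
# so Apache must be allowed to follow it in this directory.
Options +FollowSymLinks
```

It is also worth confirming that the download and pub directories themselves are readable and traversable by the Apache user (e.g. mode 755), since the symlink is only as accessible as the directories above it.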
Guest Posted September 2, 2008

If anyone doesn't have access to the php.ini file, they can set these values using the .htaccess file in the root folder. Add this to your existing .htaccess file (or create a blank .htaccess file with just this in it), for example:

php_value post_max_size 20M
php_value upload_max_filesize 20M
php_value max_execution_time 3600

Good luck!
Archived
This topic is now archived and is closed to further replies.