read_file and force_download, memory limits?
#1

[eluser]Daniel H[/eluser]
Is anybody aware of any problems with memory limits using read_file or force_download?

My (summarised) code is this:

Code:
// Read the file's contents into memory
$data = read_file("./static/brief/{$brief->zip_filename}.zip");
$name = "ycn-0809-{$brief->url_title}.zip";

// If the download failed go back to awards section
if (!force_download($name, $data))
     redirect('awards');

This works perfectly with all the zip files under 8 MB; however, one is 8.3 MB, and it downloads as just 4 KB, after which Safari reports 'decompression failed'.

I added this line at the top of the download function:

Code:
ini_set("memory_limit","18M");

...yet this doesn't seem to have any effect. Any ideas?
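
One thing I thought worth checking, sketched below, is whether the override is actually being applied, since some hosts lock memory_limit at runtime (the values in the comments are just for illustration):

Code:
// Check whether the host actually lets us raise the limit
echo ini_get('memory_limit');   // e.g. "8M" before the override
ini_set('memory_limit', '18M');
echo ini_get('memory_limit');   // still "8M" if the host disallows the change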

Thanks,

Dan.
#2

[eluser]darrenm[/eluser]
I've just finished a project that had exactly this problem.

There are two php.ini settings that are likely to give you trouble:

memory_limit and max_execution_time

If you use force_download() to serve the file in one go, you're likely to have trouble with memory_limit (you'll probably need a limit of roughly double the size of your file, since the whole contents are held in memory while being sent). The alternative is to use something like this:

Code:
// Open in binary mode so the zip data isn't mangled on some platforms
$handle = @fopen($path, 'rb');
if ($handle) {
   // Send the file to the browser a piece at a time
   while (!feof($handle)) {
      $buffer = fgets($handle);
      echo $buffer;
   }
   fclose($handle);
}

This serves the file one line at a time. The danger there is that you'll hit max_execution_time. Fortunately, for my project the host had it set to 50000, so it was never an issue. Had it been a problem, I was going to see whether I could reset the ini value regularly within the loop.
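
As a rough sketch of that idea (not the exact code from my project), the loop can send the download headers first, read in fixed-size chunks, flush each one to the browser, and reset the time limit on every pass; the chunk size here is a placeholder, and $path and $name are assumed to be set as in the earlier snippets:

Code:
// Hypothetical chunked download, assuming $path and $name are already set
header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename="' . $name . '"');
header('Content-Length: ' . filesize($path));

$handle = @fopen($path, 'rb');
if ($handle) {
   while (!feof($handle)) {
      echo fread($handle, 8192); // send an 8KB chunk
      flush();                   // push it to the browser instead of buffering it
      set_time_limit(30);        // restart the execution clock each pass
   }
   fclose($handle);
}
exit;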



