Memory Limit Help!

So I am upgrading a site that was once in ColdFusion and I've moved it to PHP via CodeIgniter.

I created an upgrade script as the old site had over 10 GB of images. I have converted the site to have an asset library so it doesn't rely on directory scanning for images (we've converted to image galleries and such).

So this upgrade script scans the directory for images (remember, 10 GB worth), picks each one individually, adds the image to the DB asset library, moves it to the new location, and uses image_lib to resize it into the different sizes the site needs (thumbnails, etc.)...

Going through this loop and creating several thumbnails per image, I end up hitting the memory_limit no matter how high I set it (512M so far). The only step beyond that is -1, which isn't ideal in my view... Obviously I'll change it back after the import.

I've checked the files, and there isn't an individual file larger than 8.9 MB, which leads me to believe that somehow image_lib isn't clearing its memory between iterations. After each one I run $this->image_lib->clear(), but that only resets the library's settings.
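For reference, a compressed JPEG expands to roughly width × height × channels bytes once GD decodes it, so an 8.9 MB file can need far more RAM than its size on disk. Here is a rough way to check, using PHP's getimagesize() (untested sketch; the 1.65 multiplier is a commonly quoted safety margin, not an exact figure):

```php
// Rough estimate of the RAM GD needs to decode one image.
function estimated_gd_memory($path)
{
    $info     = getimagesize($path);
    $width    = $info[0];
    $height   = $info[1];
    $bits     = isset($info['bits']) ? $info['bits'] : 8;
    $channels = isset($info['channels']) ? $info['channels'] : 3;

    // Decoded pixel buffer size, padded by a fudge factor for GD overhead.
    return (int) round($width * $height * ($bits / 8) * $channels * 1.65);
}

echo estimated_gd_memory('/path/to/photo.jpg') . " bytes\n";
```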

Is there a setting for image_lib/GD2 that tells it to free memory after each resize? It seems to be acting like Photoshop trying to open 10 GB worth of files at once, when I just want to open one at a time and close it after completion (much like a Photoshop batch script). Any help would be appreciated.

The scripts are working; I just need a way for GD2 to flush its memory after each image. (Right now my scripts are running against 10 GB of data, and I am just crossing my fingers.)

It could also be how you are reading the files. Are you loading all the filenames in that directory at once? You say "picks each one individually", so I'm not sure how you are doing that. Running scanning functions on large directories can be expensive.
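If the helper is pulling the whole tree into one array up front, you could iterate lazily instead. A sketch using PHP's built-in DirectoryIterator (the path is just an example):

```php
// Walk one directory without building the whole file list in memory first.
$dir = new DirectoryIterator('/var/www/import/images');

foreach ($dir as $file) {
    if ($file->isDot() || ! $file->isFile()) {
        continue;
    }

    $path = $file->getPathname();
    // ... process $path, then move on; only one entry is held at a time
}
```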

So I start with:
$import_images_folder = realpath(dirname(__FILE__) . '/../../../import/' . $old_user->user_directory . '/images/');
(ugly I know, and there are better ./ options, but I was having some issues and this worked)
$import_images = get_filenames($import_images_folder, true);

Then I loop through the array
foreach ($import_images as $image) {
    // Manipulate the image
}

The code is properly accessing images and using GD2 to resize/copy them to their new destination. After a successful copy/resize, it adds the image to the image library database table and then loops through to the next image.

The only thing I can conceive is that old image handles aren't being closed (the GD2 equivalent of fclose) and thus aren't freeing up memory. My code doesn't process the next image until the last one is complete (unless I am misunderstanding how the GD2 library works, which is highly possible).

Also, a new update. I execute my import scripts with memory_limit=-1 (infinite) and I get this error:
Quote: Fatal error: Maximum execution time of 300 seconds exceeded in /var/www/{{folder}}/system/libraries/Image_lib.php on line 522

This happens about 6 minutes after I start the script. Does that mean I am not finishing the first image manipulation before starting a bunch of others at the same time?

I had assumed $this->image_lib->resize() starts and completes (or errors on) the process before continuing with the rest of the import script (thus terminating the file handle and freeing up memory).

(side note: the version of CI I am using has this for line 522:
$copy($dst_img, $src_img, 0, 0, $this->x_axis, $this->y_axis, $this->width, $this->height, $this->orig_width, $this->orig_height);

$copy in this instance is 'imagecopyresampled' for the GD2 library.)

Looks like you are hitting PHP's max_execution_time setting. 300 seconds (your current limit) is 5 minutes, which might not be long enough to process all of the images.

Try setting a higher time limit for this script to run. At the top of this function/method, try something like:
set_time_limit(600); //10 minutes max

You can estimate how long it needs to run based on how many images were converted before it errored out. For example, if it processed 500 of 1,000 images, you should, at a minimum, double the time limit.
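Alternatively, since each call to set_time_limit() restarts the timer from zero, you could reset it inside the loop so the limit applies per image instead of to the whole run (sketch):

```php
foreach ($import_images as $image) {
    set_time_limit(60); // restart the clock: allow up to 60s for *this* image

    // ... resize/copy this image ...
}
```

That way the script can run for hours in total, but a single stuck image still gets caught.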

It sounds like this only needs to be run once to convert everything and won't be used again, correct?

Also, are you calling $this->image_lib->clear() in your loop after it converts each image?


//your individual image_lib config.
$config = array(
  'image_library' => 'GD2',
  //...other settings
);

foreach ($images as $image)
{
  $this->image_lib->initialize($config);  //initialize

  //... blah blah do your manipulation

  $this->image_lib->clear();  //destroy the instance, free the memory
}
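If clear() alone still doesn't release the memory, you could bypass image_lib for the resize step and free each GD resource explicitly with imagedestroy(). A minimal sketch (JPEG-only, error handling omitted):

```php
// Resize one JPEG with raw GD calls and explicitly free both pixel buffers.
function resize_jpeg($src_path, $dst_path, $new_w, $new_h)
{
    list($w, $h) = getimagesize($src_path);

    $src = imagecreatefromjpeg($src_path);
    $dst = imagecreatetruecolor($new_w, $new_h);

    imagecopyresampled($dst, $src, 0, 0, 0, 0, $new_w, $new_h, $w, $h);
    imagejpeg($dst, $dst_path, 90);

    // Free the buffers immediately; this is GD's equivalent of fclose().
    imagedestroy($src);
    imagedestroy($dst);
}
```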

Yeah, I am clearing after each:

if ( ! $this->image_lib->resize()) {
    $errors[] = $this->image_lib->display_errors();
}

For some reason I thought max_execution_time applied per single operation rather than to the whole script (oops), as in: attempt to execute fopen for 5 minutes before giving up. If that's the case, I'll probably have to set the execution time to 10 hours (I have 116,536 users and 10 GB of photos to go through). But yes, it's just a one-time thing.

My biggest concern now is how much memory it uses processing 10 GB of images. When I had memory_limit set to 512M, it would spew the memory error before timing out. Will the first resize() calls drop from memory after a while, or will the library hold all 10 GB of image manipulations until PHP stops processing the script?

I apologize for the poorly worded question and thank you 1000 times already for your help!

Would it make sense to unset the image_lib library after each series of images? Each folder contains between 0 and 100 images that I need to process. Would "resetting" the image_lib instance in CodeIgniter clear the memory used for the image manipulation? Then I would re-instantiate the library when I loop to the next user.

I'm not sure how that could or would work with their library. I don't use CI's image library as it caused too many problems for me or didn't do enough of what I need.

One thing you could do is first gather all of the filenames from the directory and insert them into the DB, with an extra "processed" column defaulting to 0. Then retrieve the image filenames from there, and when one is processed, mark it as processed. Maybe use the same table you are currently using, since the image filename would be the same pre/post conversion?

This way, if you run into an error, you can just run it again on the unprocessed items and continue where you left off, so you don't have to start over. I wouldn't spend much time trying to get it to run in a single go when you'll probably only use it once.
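A sketch of that idea using CodeIgniter's query builder (table and column names are just placeholders):

```php
// One-time setup (plain SQL):
//   ALTER TABLE assets ADD COLUMN processed TINYINT NOT NULL DEFAULT 0;

// Each run only touches unprocessed rows, so a crash just means "run it again".
$pending = $this->db->get_where('assets', array('processed' => 0))->result();

foreach ($pending as $row) {
    // ... resize/copy $row->filename here ...

    $this->db->where('id', $row->id);
    $this->db->update('assets', array('processed' => 1));
}
```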

Yeah that makes sense... I can try something like that.

I am having to rename the files, though, since they are user files and you never know how users have named them (some have been really bizarre). But I can save the old filename first, and after processing the image, update the DB entry with the new filename.
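In case it helps anyone else, the rename step can be as simple as generating a unique name and keeping the original alongside it (sketch; the naming scheme is just an example):

```php
// Generate a safe, unique filename while remembering the original.
function safe_filename($original)
{
    $ext = strtolower(pathinfo($original, PATHINFO_EXTENSION));
    return uniqid('img_', true) . '.' . $ext;
}

$old_name = 'Weird Name (1)!!.JPG';
$new_name = safe_filename($old_name);
// store $old_name and $new_name together in the same DB row
```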

Thanks for the suggestion.
