Datamapper memory consumption? 16MB exhausted on small query
#11

[eluser]junkwax[/eluser]
How could we reduce the size of these objects?

If we could get each added object down to, say, 1KB instead of 8KB, it would be a good improvement. Do a print_r() of one of these objects and you will see for yourself that there is a lot
of junk in them.

My 2 cents.
#12

[eluser]jacobc[/eluser]
They could be made smaller of course...

But the aim of something like Datamapper is to make managing relational databases easy.

Handling 2000 objects at once doesn't seem like a reasonable requirement for it.
#13

[eluser]junkwax[/eluser]
First, I say 2,000 because my php.ini memory_limit is 16MB.

If my memory_limit were 8MB, it would mean that fetching just 1,000 rows would make PHP crash.

If my memory_limit were 4MB, it would mean that fetching just 500 rows would make PHP crash.

You never fetch 500 rows from a DB at once? I do, every day.

If I take the original example from stensi (assigning books
to a user), it means that with a 4MB limit I cannot assign more than 500 books to the user, otherwise PHP will crash.

8KB per row is a major design flaw in my opinion.

Code:
// Let's create a user
$u = new User();

// Let's now get the first 500 books from our database
$b = new Book();
$b->limit(500)->get();

// Let's look at the first book
echo 'ID: ' . $b->id . '<br />';
echo 'Name: ' . $b->title . '<br />';
echo 'Description: ' . $b->description . '<br />';
echo 'Year: ' . $b->year . '<br />';

// Now let's loop through all of them
foreach ($b->all as $book)
{
    echo 'ID: ' . $book->id . '<br />';
    echo 'Name: ' . $book->title . '<br />';
    echo 'Description: ' . $book->description . '<br />';
    echo 'Year: ' . $book->year . '<br />';
    echo '<br />';
}

// Let's relate the user to these books
$u->save($b->all);
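
For the record, a rough way to check the per-row cost yourself is to sample memory_get_usage() around the get(). This is only a sketch; the 8KB figure above is my estimate, and the exact number will vary with the model, the DataMapper version and the PHP version:
Code:
// Rough check: sample memory use around the get() to estimate per-row cost
$before = memory_get_usage();

$b = new Book();
$b->limit(500)->get();

$after = memory_get_usage();
$count = count($b->all);

echo 'Rows fetched: ' . $count . '<br />';
echo 'Approx. bytes per row: ' . round(($after - $before) / max($count, 1)) . '<br />';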
#14

[eluser]jacobc[/eluser]
If you are going to display so many, surely you should paginate.
How often do you need to add 500 books to a user at once?

I'm just imagining a UI with horrible usability.
#15

[eluser]OverZealous[/eluser]
I have to agree. I have never seen a web application that sends that many items to the browser. It's horribly inefficient, especially because the user most likely doesn't want to parse that many items. Even if you are sending it to a heavyweight client, or an AJAX client, the results should be paginated.

For example, I'm using the dojo toolkit and its virtual Grid. I can "show" thousands of rows to the user. But I only have to send 25 or so at a time. The user can quickly scroll through the list, but only needs to see the handful that are there.
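
In DataMapper terms, paging would look roughly like this. A sketch only: Book is the model from the earlier example, $page is assumed to come from the request, and it uses the get($limit, $offset) form from the DataMapper docs:
Code:
// Sketch: fetch one page of 25 rows instead of the whole result set
$per_page = 25;
$page = 3;                              // e.g. taken from the URL
$offset = ($page - 1) * $per_page;

$b = new Book();
$b->order_by('title')->get($per_page, $offset);  // limit and offset applied on the get()

foreach ($b->all as $book)
{
    echo $book->title . '<br />';
}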

BTW: I do have a trick for handling large numbers of rows when you don't really need to instantiate the objects within DataMapper. I use it to find which row of a large query a specific ID lands on:
Code:
$myObj->select('id');
$myObj->where('...'); // set up the query as usual
$myObj->order_by('ordercolumn');

// Run the built query through CI's database layer directly,
// skipping DataMapper's object instantiation
$result = $myObj->db->get($myObj->table);
foreach ($result->result() as $row) {
    $id = $row->id;
    // process $id
}

This gives you the best of both worlds when you absolutely need it.
#16

[eluser]junkwax[/eluser]
jacobc, I think you're missing the point about DM optimization. I am not talking about the UI or about pagination, I'm talking about fetching rows from a DB. Thanks for the advice on usability, but I've already read many books and blogs on this subject.
#17

[eluser]junkwax[/eluser]
@overzealous: OK, so I will go with the standard fetching functions when I need to fetch a lot of rows.
#18

[eluser]Murodese[/eluser]
Yeah, it sounds like you have much, much bigger problems if you're trying to load 2000 ORM-managed rows into memory at once.
#19

[eluser]tdktank59[/eluser]
I have to agree about optimizing DM to reduce its overhead... Maybe there is a way to set up one master object holding the global config, so that when we load a new object it only has to hold its own values/config and not a copy of config that is used throughout the whole site...

Because there is A LOT of stuff that is repeated in a single DataMapper query, be it $d = new D(); $d->get(); or whatever else you feel like using...

Mind you, I don't need it optimized for fetching that many rows... it would just be nice to keep it below php.ini's memory_limit...
#20

[eluser]OverZealous[/eluser]
You do realize that 90% of that stuff is already shared, right? The configuration info is assigned by reference to each object; just look at the initialization code. This means that there is only one copy per model. var_dump/print_r is a poor view of what is actually in memory.
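
To make that concrete, here is a simplified sketch (not DataMapper's actual code): each instance only holds a reference to one shared config array, so print_r() shows the whole thing on every object even though only one copy exists in memory.
Code:
// Simplified sketch: one config array, referenced by every instance
class Book_Sketch {
    public $config;

    public function __construct(&$shared_config)
    {
        // Assign by reference: no per-object copy is made
        $this->config =& $shared_config;
    }
}

$book_config = array('table' => 'books', 'fields' => array('id', 'title', 'year'));

$a = new Book_Sketch($book_config);
$b = new Book_Sketch($book_config);
// print_r($a) and print_r($b) both display the full config,
// but both point at the same underlying array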

Of course, some of that could be replaced with public static references, which might help in those extreme cases, but it would also most likely break object inheritance completely.

PHP has horrible memory management. It is designed to start up, run something, and exit. The garbage collection isn't even able to cope with circular references - once two objects reference each other, even if nothing else references them, they will not be GC'd until PHP exits.
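
The cycle problem is easy to reproduce with a quick sketch like the one below: on PHP before 5.3 none of these objects are ever freed until the script exits, and even on 5.3+ they only go away when the cycle collector runs.
Code:
// Sketch: objects that reference each other never drop to a refcount of zero
class Node {
    public $other;
}

$before = memory_get_usage();

for ($i = 0; $i < 1000; $i++)
{
    $a = new Node();
    $b = new Node();
    $a->other = $b;   // $a holds $b...
    $b->other = $a;   // ...and $b holds $a, so the pair keeps itself alive
}

// On pre-5.3 PHP this reports a steadily growing number
echo 'Leaked roughly ' . (memory_get_usage() - $before) . ' bytes<br />';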

Coming from a Java world, I get really frustrated by that.