CI-based Web Sites Performance
#11

[eluser]DragonI[/eluser]
Thanks tonanbarbarian, I'm going to try it out :)
#12

[eluser]DragonI[/eluser]
Holy crap :(

My pages are in the 5 MB to 9+ MB range with debug set to ZERO.

I'm using APC, caching queries and elements. Here are some details:

1. default layout has 11 elements
2. page that lists a person's blog entries = 8.5 MB (uses 2 components and 7 helpers)
3. simple page using page controller = 5.1 MB
4. blog entry form = 8.0 MB
5. SimplePie - RSS Feed from Engadget, YouTube, Techcrunch - 9.5+ MB for Engadget and 9+ MB for the others
6. Sitemap - using native Cake stuff = 7.5 MB
7. memory_limit via php.ini = 128M (not strictly because of Cake, but for uploads)

Automagic = faster development, but I guess you pay a huge price. I've found performance is lacking as well: pages take approx. 0.5 seconds to over 1 second!

This is why I'm coming back to look at CI.
#13

[eluser]Référencement Google[/eluser]
DragonI, it would be really interesting to know what the benchmark values are on CI for the same kind of application. Let us know!
#14

[eluser]Nick Husher[/eluser]
Quote:Is your web site small, medium or big? How ‘heavy’ do you think it is?
Medium. We’re taking on 37signals’ rubbish. Death to rails, death to hype!

Is it rails you have a dim view of, or 37s? Both the company and the development platform are flawed, although in general I think neither qualifies as 'rubbish.'
#15

[eluser]DragonI[/eluser]
Hi elitemedia,

It may be a while! I'm just starting up with CI again, but I will post my results.
#16

[eluser]stanleyxu[/eluser]
I think a potential performance problem is the PHP language itself. (Is there a "copy-on-write" semantic behind PHP?) Developers have to write code very carefully, e.g. consider whether to pass a reference or a copy.

A very good example is the Zip class (read the thread). Zipping 1000 files (7 MB) with the default Zip class takes 60 seconds and 16 MB of memory; after tuning, it takes 2 seconds and 8 MB.
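
To show what I mean about references vs copies, here is a rough, untested sketch (my own example, not CI code):
Code:
<?php
// Rough sketch: PHP arrays are copy-on-write, so passing by value is cheap
// until the callee writes to the array and forces a full copy.
function pad_copy(array $files)
{
    foreach ($files as $i => $f)
    {
        $files[$i] = $f . "\n"; // first write separates the copy from the original
    }
    return $files; // caller now holds two large arrays at peak
}

function pad_ref(array &$files)
{
    foreach ($files as $i => $f)
    {
        $files[$i] = $f . "\n"; // modifies the caller's array in place
    }
}

$data = array();
for ($i = 0; $i < 1000; $i++)
{
    $data[] = str_repeat('x', 7000); // roughly 7MB of distinct strings
}
pad_ref($data); // no duplicate array is built
echo memory_get_peak_usage() . "\n";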
#17

[eluser]Lewis[/eluser]
I think CodeIgniter is well designed to code with, but what the hell were the developers thinking when they coded db->result()? Looping through the result twice just so you can use foreach! That's a big trade-off in performance and just not worth it at all. It works fine for small websites, but try developing a proper application on it and you'll soon see. I've seen other things like this where simple recoding can greatly increase performance. CodeIgniter is brilliantly designed in some places, but why let it down with things like this?
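
To illustrate what I mean, result() is roughly doing this internally (paraphrased from memory, not the actual source; $result_id stands for the driver's result resource):
Code:
// First loop: inside the driver, every row is read into an array.
$rows = array();
while ($row = mysql_fetch_object($result_id))
{
    $rows[] = $row;
}

// Second loop: your own code then walks that same array again.
foreach ($rows as $row)
{
    echo $row->title;
}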
#18

[eluser]Craig A Rodway[/eluser]
[quote author="Lewis" date="1202608280"]I think CodeIgniter is well designed to code with, but what the hell were the developers thinking when they coded db->result()? Looping through the result twice just so you can use foreach! That's a big trade-off in performance and just not worth it at all.[/quote]

Do you have any test results from benchmarks for this function?
#19

[eluser]tonanbarbarian[/eluser]
OK, I sort of agree with Lewis.

I'm thinking the reason the db->result type functions were created was for convenience. It makes it easier to use methods to move back and forward amongst the records.

While it does mean you have to parse through a loop twice, there are things you can do to improve performance.

For example
Code:
foreach ($query->result() as $row) ...
is fairly slow because it has to run the function on each iteration of the loop. Use
Code:
$rows = $query->result();
foreach ($rows as $row) ...
instead.

How much slower is it to have to process the loop twice, though? That depends on how many records you are processing. If you are using a paginated list you will always be limiting the number of records you retrieve. So if it is 5, 10, 20, 50 or even 100 records in the result set, it is going to be very fast because the loop size is small, and I would not worry about it as an issue.

If you are not using limits and you have to pass through 1000s of records, the extra processing may be noticeable, but this is not something that happens often, and you could consider breaking your process into batches anyway and refreshing the page in between.

Having said all of this, I have expressed a desire before to have something like
Code:
$this->query->fetch_object();
$this->query->fetch_array();
each of which just gets the next record but does not do any of the other processing that CI normally uses to "cache" the result data.
#20

[eluser]Lewis[/eluser]
[quote author="Craig Rodway" date="1202609518"]Do you have any test results from benchmarks for this function?[/quote]

It doesn't take a benchmark to realise that looping database results twice is unnecessary.


[quote author="tonanbarbarian" date="1202615705"]ok i sort of agree with lewis

i am thinking the reason that the db->result type functions were created were for convenience. it makes it easier to use methods to move back and forward amongst the records
[/quote]
I agree with you there, but why not have a function for simply doing fetch_array? Why make the current one private - why not at least advertise it for people who need to loop through large resultsets? Then when someone wants to use one of the functions to move forwards/backwards, perform result_array and cache the results. That way, there's always maximum performance.
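
Something like this untested sketch is what I mean (simplified, assuming the old mysql_* functions; class and method names are made up):
Code:
// Untested sketch of the lazy-caching idea: fetch_array() streams rows,
// and the full array is only built when a random-access method needs it.
class Lazy_result {

    protected $result_id;
    protected $cache = NULL;

    public function __construct($result_id)
    {
        $this->result_id = $result_id;
    }

    public function fetch_array()
    {
        return mysql_fetch_assoc($this->result_id); // one row, nothing cached
    }

    public function result_array()
    {
        if ($this->cache === NULL)
        {
            $this->cache = array();
            while ($row = mysql_fetch_assoc($this->result_id))
            {
                $this->cache[] = $row;
            }
        }
        return $this->cache;
    }
}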


[quote author="tonanbarbarian" date="1202615705"]while it does mean you have to parse through a loop twice there are things you can do to improve performance

for example
Code:
foreach ($query->result() as $row) ...
is fairly slow because it has to run the function on each iteration of the loop
use
Code:
$rows = $query->result();
foreach ($rows as $row) ...
instead

[/quote]

Not so. PHP evaluates $query->result() once and then loops through its result. I think you're getting mixed up with 'for', which re-evaluates its condition expression on each iteration.
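
You can check it with a quick test:
Code:
function result()
{
    echo "called\n";
    return array(1, 2, 3);
}

foreach (result() as $row) {} // prints "called" exactly once

for ($i = 0; $i < count(result()); $i++) {} // the condition is re-evaluated,
                                            // so result() runs on every pass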

[quote author="tonanbarbarian" date="1202615705"]
how much slower is it to have to process the loop twice though?
well that does depend on how many records you are processing.
if you are using a paginated list you wall always be limiting the number of records you retrieve.
So if it is 5, 10, 20, 50 or even 100 records in the result set then it is going to be very fast because the loop size is small and I would not worry about this as an issue.

If you are not using limits and you are having to pass 1000's of records then the extra processing may be noticable, but this is not something that happens often, and you could consider breaking your process into batches anyway and refreshing the pages in between.
[/quote]

Unfortunately the databases we work with require creating many different dropdowns with hundreds of options, or doing very complicated real-time processing where every drop of performance counts. For example, using CI, a result set of 500 rows is returned, which is looped through by the database class, then by my model to apply some formatting, and then by the form helper to create a dropdown. That's three loops, when it could all easily be done in one.
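
With direct fetching it could be a single pass, something like this (untested; form_dropdown() is the standard form helper, the table and columns are made up, and _fetch_object() is nominally private, so this relies on CI internals):
Code:
// Untested sketch: build the dropdown options in one pass instead of three.
$query = $this->db->query('SELECT id, name FROM people');

$options = array();
while ($row = $query->_fetch_object())
{
    $options[$row->id] = ucwords($row->name); // formatting applied inline
}

echo form_dropdown('person_id', $options);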

[quote author="tonanbarbarian" date="1202615705"]
having said all of this I have expressed a desire before to have something like
Code:
$this->query->fetch_object();
$this->query->fetch_array();
each of which just gets the next record but does not do any of the other processing that CI normally uses to "cache" the result data[/quote]

You can use $query->_fetch_object(), although technically it's supposed to be a private method. But remember it only returns one row per call, so you'll have to use 'while' instead.
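
For example (untested, and liable to break if a future CI version changes the private method):
Code:
$query = $this->db->query('SELECT id, title FROM entries');
while ($row = $query->_fetch_object())
{
    // one row at a time; no full result array is ever built
    echo $row->title . "\n";
}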



