Carabiner 1.4: Asset Management Library
[eluser]dmorin[/eluser]
Personally, I really dislike the idea of storing the combined/minified output in the database. I wouldn't use the library if it did that, because it adds massive overhead and is completely unnecessary when there are much better places to put the output, like S3, a CDN, or even memcached.

I think the best approach for you is, first, to standardize the last-modified times on the files across the cluster, as we mentioned previously, and second, to look earlier in this thread for the modifications that add S3 support. That way, the first request to come in would cause the first server to process the static content and upload it to S3. Subsequent requests would all go to S3 instead of back into the cluster, where they might be served by a server that hasn't yet computed the static content. Each of the other servers would still process the static content the first time it received a request, since it wouldn't have a local cache copy, though that could also be worked around.

The other approach would be to set your load balancer to serve all requests from the same user with the same backend server. In that case you wouldn't need S3, since any future static request would be sure to go to a server that has already processed the static content for the original request.
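A minimal sketch of that first-request flow, to make it concrete. Note that build_asset() and upload_to_s3() here are hypothetical helpers standing in for Carabiner's combine/minify step and whatever S3 client the modifications earlier in the thread use; the bucket and cache path are assumptions too.
[code]<?php
// Sketch only: first request on a given server builds the asset and
// pushes it to S3; every generated URL points at S3, so later requests
// never land on a backend that hasn't processed the content yet.
function asset_url($name)
{
    $local  = '/var/www/cache/'.$name;                          // assumed local cache path
    $s3_url = 'http://mybucket.s3.amazonaws.com/cache/'.$name;  // assumed bucket

    if ( ! file_exists($local))
    {
        build_asset($name, $local);   // hypothetical: combine/minify on this server
        upload_to_s3($local, $name);  // hypothetical: upload the result once
    }

    return $s3_url;
}
[/code]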
[eluser]tonydewan[/eluser]
I definitely agree that using the database is a less-than-optimal solution. I think S3 is the strongest choice discussed so far. Of course, if you don't like the third-party dependency (or the cost), you could also do something wacky like rolling your own S3-type interface to your cluster. I'm not totally sure what that would look like, but I'd guess the modifications to Carabiner would be similar to those already shown for S3 integration...
[eluser]bgoncalves[/eluser]
Update: as dmorin points out below, the proposal in this post is not the best solution.

This is a really helpful library. Congratulations on your work (good, clean code) and on keeping the support and documentation up to date.

I created a controller that adds the HTTP headers needed to leverage browser caching - feel free to use it if you find it useful. This controller also makes it possible to use the gzip flag built into CodeIgniter. It completes the changes necessary to get green lights on all the JS and CSS tests in Page Speed. There are three things to do: add one new config parameter, change the way the assets URL is constructed, and create the controller.

1. In file .../config/carabiner.php at line 78
Code: /*

2. In file .../libraries/carabiner.php at line 234
Code: public $use_controler = FALSE;
and at line 302
Code: // before: $this->cache_uri = $this->base_uri.$this->cache_dir;

3. The controller itself, in file .../controllers/resource.php
Code: <?php

...testing
To test this, watch the Apache access logs for the requests it receives. You'll notice that browsers already do plenty of caching with the help of the webserver, but there are times when a browser repeats requests under the 'no-controller' version.

The add-on in this post was successfully tested on Linux (Ubuntu+XAMPP) and on Windows (XP+WAMP).

...future
I have to disagree with Tony when he writes
Quote: At this point, I see Carabiner as feature complete
but of course the workload falls on whoever disagrees. Next: evolve the controller with a method for images, to leverage browser caching for those assets as well.
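(The code blocks above survive only as fragments, so the following is a hypothetical sketch of what such a resource controller could look like in CodeIgniter 1.x. The class layout, cache path, and header values are assumptions for illustration, not the original code.)
[code]<?php
// Hypothetical reconstruction: streams a cached asset through CI and
// adds the caching headers that satisfy Page Speed's JS/CSS checks.
class Resource extends Controller {

    function css($file) { $this->_serve($file, 'text/css'); }
    function js($file)  { $this->_serve($file, 'text/javascript'); }

    function _serve($file, $mime)
    {
        // basename() keeps the request inside the assumed cache dir
        $path = APPPATH.'../cache/'.basename($file);

        if ( ! file_exists($path)) show_404();

        header('Content-Type: '.$mime);
        header('Cache-Control: public, max-age=31536000');  // one year
        header('Expires: '.gmdate('D, d M Y H:i:s', time() + 31536000).' GMT');
        header('Last-Modified: '.gmdate('D, d M Y H:i:s', filemtime($path)).' GMT');

        readfile($path);
    }
}
[/code]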
[eluser]dmorin[/eluser]
This is an interesting solution, but I would be VERY careful about what you stream through PHP. If you're running a site with very little traffic, this is a fine solution (although I'm not sure what problem it solves). But on a traffic-heavy site, invoking the PHP interpreter for static content doesn't make much sense. On an average server, Apache can serve something like 4-5 times more requests for static content than CI with caching enabled, so letting Apache do what it's good at is the best approach. For the same reason, I've never understood why CI has a gzip option, since that's another thing that's more efficient to do at the webserver level, where the compressed resource is also cached.

In terms of browser caching, simply adding far-future Expires headers to content as it is served will cause browsers to cache the results. Since Carabiner changes the file name each time a file changes, browsers will request a fresh copy whenever there are changes. To add Expires headers, add this to your Apache config:
Code:
<Directory /dir/path>
    # completing the truncated snippet - assumes mod_expires is enabled
    ExpiresActive On
    ExpiresDefault "access plus 1 year"
</Directory>
[eluser]bgoncalves[/eluser]
@dmorin Thank you! Due to my ignorance of Apache, I was trying to solve browser caching with PHP/CI. I completely agree that the best-suited software should play its part. Now I have to figure out these settings on my provider - an ISS version... As Tony said, "At this point, I see Carabiner as feature complete."
[eluser]dmorin[/eluser]
I assume you mean IIS? I haven't used it in a few years, but I'm sure there's a way to add an Expires header! I also disagree that Carabiner is feature complete! I've personally added the S3 upload option to mine, plus some custom code that sets Expires headers on the uploaded content so it's all served correctly.

I also removed the code that pulls in external js/css files and adds them to the compiled/minified, locally hosted version. I'm not a big fan of that original feature, and here's why. Let's say jquery + jquery ui is 100K, just for easy math, plus I have a page-specific js file (~2K each) for each page on my site, and a user browses 10 pages. If I keep letting Google host jquery, the user's browser downloads it once when they reach the site and never again, so 120K total. If Carabiner adds jquery to each compiled version, it becomes 1020K total. And it's not only the size, but the duration of each of those requests, since js scripts can block the browser.

The code could also be heavily refactored. That said, I'm extremely grateful that Tony created Carabiner, as it makes my life much easier!
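Spelling out the arithmetic from that example:
[code]// Google-hosted: 100K (jquery, fetched once then cached) + 10 pages x 2K = 120K
// Compiled in:   10 pages x (100K jquery + 2K page js)                   = 1020K
[/code]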
[eluser]tonydewan[/eluser]
One thing I meant to do, and never got around to, was post the .htaccess settings I use in my production environment. This does most everything @dmorin's solution does, just differently. Don't use these in a development situation; it will make your life harder. Also, these are just the static asset optimizations - I've left out the mod_rewrite and other stuff that also goes here. (Also, credit where credit is due: I borrowed almost all of these settings from someone else. I can't find the site at the moment, but somebody else did most of the work on this one.)
[code]### BEGIN gzip
# Insert filter
AddOutputFilterByType DEFLATE text/html text/plain text/xml application/xml application/xhtml+xml text/javascript text/css application/x-javascript

# Netscape 4.x has some problems...
BrowserMatch ^Mozilla/4 gzip-only-text/html

# Netscape 4.06-4.08 have some more problems
BrowserMatch ^Mozilla/4\.0[678] no-gzip

# MSIE masquerades as Netscape, but it is fine
# BrowserMatch \bMSIE !no-gzip !gzip-only-text/html

# NOTE: Due to a bug in mod_setenvif up to Apache 2.0.48
# the above regex won't work. You can use the following
# workaround to get the desired effect:
BrowserMatch \bMSI[E] !no-gzip !gzip-only-text/html

# Don't compress images
SetEnvIfNoCase Request_URI \
\.(?:gif|jpe?g|png)$ no-gzip dont-vary

# Don't compress compressed files
SetEnvIfNoCase Request_URI \
\.(?:exe|t?gz|zip|bz2|sit|rar)$ \
no-gzip dont-vary

# Don't compress PDF
SetEnvIfNoCase Request_URI \.pdf$ no-gzip dont-vary

# Make sure proxies don't deliver the wrong content
Header append Vary User-Agent env=!dont-vary
### END gzip

### begin etag
FileETag none
### end etag

### begin expires header
<FilesMatch "\.(ico|flv|swf|pdf|jpg|jpeg|png|gif|js|css)$">
ExpiresActive On
ExpiresDefault "access plus 10 years"
</FilesMatch>[/code]
[eluser]tonydewan[/eluser]
@dmorin Your point about downloading remote files is a very good one. In fact, the issue you described is one of the reasons I like using the Google Ajax Libraries API. The only reason I left the feature as it is was to keep it consistent with the way Carabiner treats local files, since the Google API isn't the only reason people might use a URL reference. I will think about that more, for sure.

In terms of the whole "feature complete" thing: I only meant that I didn't (and still don't) have plans for additional major functionality. The only real feature I've considered adding is integration with a CDN, and I'm still figuring out how to abstract that appropriately. I certainly agree that much of the code could be refactored, and there are smaller features and improvements that could be made. In fact, many of those that have been requested are implemented in my current 1.45 test release. You can grab it from GitHub: http://github.com/tonydewan/Carabiner/tree/master

I'll be taking some time to look at CDN integration soon. I'm also interested in seeing how people might optimize Carabiner.
[eluser]cahva[/eluser]
"Do you use CSS or JavaScript? Carabiner makes your life easier. I promise. " WORD! ![]() |