(03-01-2015, 11:18 AM)spjonez Wrote: JS is required for just about everything in the modern web.
This is only the case for sites developed by people who believe this. There is a long tradition on the web of progressive enhancement, and an equally long tradition of people ignoring (or disputing) its benefits. None of the significant advancements in the "modern" web have done anything to change this.
Quote:Situation A:
- User clicks a link to the page, user waits for 1s to see the page fully rendered.
Situation B:
- User clicks a link to the page, sees page content after 0.5s, then sees results fill in the table. Since your AJAX call is triggered on DOM ready, the request is sent before the browser starts to paint the page. As the page is being drawn, your search request is sent and returned in what appears to be less than 0.5s, since it starts before painting begins.
Visually, it appears your site is almost twice as fast using a deferred load. It's the same psychological effect as sitting on hold on the phone. Waiting for 1m seems like 10m.
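For reference, the deferred load the quote describes looks roughly like this. This is only a sketch: the `/search.php` endpoint, the `#results` table id, and the row fields are made-up examples, not anything from the post.

```javascript
// Sketch of the deferred-load pattern (Situation B). The endpoint
// "/search.php" and the "#results" table are assumed names.

// Pure helper: turn result rows into <tr> markup for the table body.
function rowsToHtml(rows) {
  return rows
    .map(function (r) {
      return '<tr><td>' + r.title + '</td><td>' + r.date + '</td></tr>';
    })
    .join('');
}

// Fire the request on DOM ready, so the round trip overlaps with the
// browser's initial paint of the rest of the page.
function deferredSearch(query) {
  fetch('/search.php?q=' + encodeURIComponent(query))
    .then(function (res) { return res.json(); })
    .then(function (rows) {
      document.querySelector('#results tbody').innerHTML = rowsToHtml(rows);
    });
}

// Only wire up the browser side when a DOM actually exists.
if (typeof document !== 'undefined') {
  document.addEventListener('DOMContentLoaded', function () {
    deferredSearch('widgets'); // example query
  });
}
```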
Situation A isn't really the case, except in very extreme situations (in which case you should probably look at other areas for optimization). In my case, a static page usually spends more than half of its time in code that executes on every request, and 1/6 to 1/4 of the remaining time goes to template code or other code related to returning a page (which isn't executed for an AJAX response). So an AJAX response doesn't take as long as a full page, but it's rarely faster overall to add an AJAX call than to simply make the database call as part of the original response.
In other words, if it takes 0.5s to generate the page and 0.5s to generate the search results, generating the page with the results included might take 0.75-0.8s, but it shouldn't take 1s. Plus, the real cost of the AJAX approach is 0.5s (page) + 0.1-0.5s transfer + 0.5s (results) + 0.1-0.5s transfer + ~0.1s to modify the DOM and repaint, versus 0.75-0.8s + 0.1-0.5s transfer + ~0.1s to paint the page for the combined response. In other words, the time you save the user in displaying the initial page (without the data) shrinks, as a percentage of the total time they spend loading your page, as their connection quality and/or bandwidth to your server decreases.
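The arithmetic above can be put into a small back-of-envelope model. The figures are the example values from this post (all in seconds); the 0.3s "merge overhead" is just the 0.75-0.8s combined figure minus the 0.5s base page.

```javascript
// One response: page and results generated together, one transfer, one paint.
function fullPageTotal(pageGen, mergeOverhead, transfer, paint) {
  return pageGen + mergeOverhead + transfer + paint;
}

// Two responses: the initial page, then the AJAX call, each paying a
// transfer cost, plus the DOM update at the end.
function deferredTotal(pageGen, resultsGen, transfer, domUpdate) {
  return pageGen + transfer + resultsGen + transfer + domUpdate;
}

// With a fast 0.1s transfer, the combined page already wins:
fullPageTotal(0.5, 0.3, 0.1, 0.1); // 1.0s total
deferredTotal(0.5, 0.5, 0.1, 0.1); // 1.3s total

// On a slow 0.5s link the gap widens, because the deferred version
// pays the transfer cost twice:
fullPageTotal(0.5, 0.3, 0.5, 0.1); // 1.4s total
deferredTotal(0.5, 0.5, 0.5, 0.1); // 2.1s total
```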
If I have a very long-running query, the best thing to do is usually to page the table and request only the first page of results to be returned in the original response (assuming a subset of the data returns faster than the full dataset, though there are cases where this assumption will be wrong and I have to make other decisions). Then I can make the page links load the additional data (either as an AJAX response or on a completely new page), or even request the remaining data on DOM ready so the data can be waiting in the browser's memory when the page links are used.
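The "request the remaining data on DOM ready" idea might be sketched like this, with a tiny in-memory cache. Again, `/search.php?page=N` is an assumed endpoint, not anything specific to this discussion.

```javascript
// Prefetch later pages after the first page has rendered, so the data
// is already in memory when a page link is used.
const pageCache = new Map();

function cachePage(n, rows) { pageCache.set(n, rows); }

// Returns the cached rows, or null if that page hasn't arrived yet
// (in which case the caller falls back to a normal request).
function getCachedPage(n) {
  return pageCache.has(n) ? pageCache.get(n) : null;
}

// Fetch pages 2..totalPages in the background (page 1 came with the
// original response).
function prefetchPages(totalPages) {
  for (let n = 2; n <= totalPages; n++) {
    fetch('/search.php?page=' + n)
      .then(function (r) { return r.json(); })
      .then(function (rows) { cachePage(n, rows); });
  }
}
```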
In most cases, I'll set up the page to return one page of data with links that load a new page for each subsequent page of data. Then, if the script runs, it changes those links to call a JavaScript function instead of replacing the whole page, so the data can be retrieved and the table updated via AJAX (if supported). At that point there may be additional choices depending on the supported APIs, such as retrieving the full dataset.
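That enhancement step, server-rendered pagination links that a script upgrades to AJAX if it runs, could look something like this. The `a.page-link` selector and `loadPage` callback are stand-ins for whatever the site actually uses.

```javascript
// Pure helper: recover the page number from a server-rendered href
// such as "/search.php?page=3"; defaults to page 1.
function pageFromHref(href) {
  const m = /[?&]page=(\d+)/.exec(href);
  return m ? Number(m[1]) : 1;
}

// Progressive enhancement: the links work as normal navigation without
// JS; when this runs, clicks are intercepted and handled via AJAX.
function enhancePagination(container, loadPage) {
  container.querySelectorAll('a.page-link').forEach(function (link) {
    link.addEventListener('click', function (ev) {
      ev.preventDefault(); // skip the full page load
      loadPage(pageFromHref(link.getAttribute('href')));
    });
  });
}
```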