Deep Dive into Browser Performance

Comments

Comments are closed.

Anonymous at 16:44 on 28 Oct 2014

Great talk, glad to see so many links to tools and articles.

Fantastic info. Great tips and tricks!!

This was an incredibly informative presentation with a significant amount of data-driven analysis and reporting of real-world examples (I loved that you used the ZendCon site along with other developer favorites like slashdot & zend).

Another great highlight was the handful of specific examples of how to deal with unique identifiers for JS & CSS file assets to make setting far-future expiration headers easier. I'd heard of the timestamp approach, but the nginx idea of rewriting the file name using its built-in rewrite mechanisms was new to me (and I suppose the same would be possible with Apache, though perhaps not quite as fast).

Excellent presentation.

Anonymous at 17:05 on 28 Oct 2014

Great presentation. Very detailed and informative.

Excellent presentation. Plenty of insight into techniques to speed up browser performance.

Drew LeSueur at 23:25 on 28 Oct 2014

Lots of good tips. I took a lot of notes. Thanks.

Wow... who knew there was so much to mess around with under the hood? Very enlightening! I always wondered if loading massive CSS and JS libraries would affect performance. Cool ideas, workarounds and tools. Thanks!

I hate to be the dissenting voice here, but I felt that this session was more of a mixed bag than the other commenters did. I've been working on web applications for 15 years now, and some of the tips and tricks mentioned will cause more problems than they fix.

* Require.js: Using Require.js or another AMD JavaScript framework can definitely help -- in fact, I use it quite a bit. However, as the number of JS modules you use increases, the number of JavaScript files loaded increases, which reintroduces the file download bottleneck. That's why Require.js has its own compiler/minifier to create one built JavaScript file, which is what that project considers best practice for production.

* Using iframes: After the session, I tried to find information on where using iframes was considered a "best practice" and failed. While it is true that using iframes will allow you to use multiple CPUs, I would contend that writing an application that *needs* multiple CPUs to perform well is a sign that something is wrong, not that iframes should be used.

* Inline CSS: First, we should differentiate "inline CSS" from "embedded CSS". Inline CSS is the act of modifying an HTML element with a style attribute. Embedded CSS is the act of using a <style> tag in your HTML, preferably in your head element. Embedded CSS is useful, especially when the CSS modifications are small and/or critical. Google's PageSpeed Module for Apache actually helps you do that automatically. Inline CSS, however, is much more controversial. If performance is your *only* metric, then perhaps it would win out. However, from a maintainability stance, it makes things harder to read and alter. Additionally, pages served with a Content Security Policy will have problems -- by default, CSP does not allow inline CSS.

* Repaint/Reflow: This is a minor point, but frustrating nonetheless. Of the three reference points for this, one was from 2009 and another was from 2008 (and actually stated that it was most likely out of date!). That is a lifetime ago for browsers, and putting it into the session references seemed odd.

* CSS and browser-specific prefixes: There are a number of viewpoints on this, but a blanket "avoid" may not be the best idea. CSS animations are much, MUCH more performant than trying to animate with JavaScript, and some of those animations can only be written with browser-specific prefixes. I would suggest that developers know their userbase and code their CSS accordingly (if no one in your userbase uses IE, it would make sense to drop the '-ms-' prefix). I would also suggest using a CSS pre-processor such as Sass (http://www.sass-lang.com/) or Less.js (http://lesscss.org/) to help make your CSS more efficient. However, using a CSS pre-processor could be a session unto itself.
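The single-file production build mentioned in the Require.js point above is driven by an r.js build profile; a minimal sketch might look like this (the module name, paths, and output file are assumptions for illustration, not from the talk):

```javascript
// build.js -- minimal r.js optimizer profile (run with: node r.js -o build.js)
// Traces the dependency tree from the "main" entry module and concatenates
// and minifies everything it finds into one built file.
({
    baseUrl: "js",            // directory containing the AMD modules
    name: "main",             // entry module whose dependencies get traced
    out: "dist/main.built.js", // single combined, minified output file
    optimize: "uglify"        // minifier to apply to the combined output
})
```

In production the page then loads only `dist/main.built.js`, avoiding the per-module download bottleneck the comment describes.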
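One way to "know your userbase and code accordingly" at runtime, rather than shipping every prefix, is simple feature detection. A minimal sketch (the helper name and the property list are my own, purely illustrative):

```javascript
// Return the first CSS property name from `candidates` that the given
// style object supports, or null if none do. In a browser you would pass
// an element's style object, e.g. document.body.style.
function supportedProperty(style, candidates) {
    return candidates.find((prop) => prop in style) || null;
}

// Example with a stand-in style object (a real page would use a DOM element):
const fakeStyle = { msTransform: "" };
supportedProperty(fakeStyle, ["transform", "msTransform", "webkitTransform"]);
// → "msTransform" for this stand-in object
```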

Finally, I felt the answer to the question asked at the end regarding how to implement this was more difficult than it needed to be. The answer (as I heard it) was to have a deployment script rename or symlink the files that need cachebusting and then modify all HTML to add the cachebusting hash or number to the files found. This seems like a dangerous way of handling it. I would suggest having a configuration variable that can be read by the application (e.g. $cssNum). Then, if the configuration variable exists, the view that contains the filenames for the possibly-cached files can be modified like so:

<?php $cssFile = (null !== $cssNum) ? "app.{$cssNum}.css" : "app.css"; ?>

This allows the file to be given a cachebusting number only if the configuration is there (as in production). From there, it is easy to add a mod_rewrite rule to strip the numbers from CSS or JS requests (this example comes from http://html5boilerplate.com):

<IfModule mod_rewrite.c>
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.+)\.(\d+)\.(bmp|css|cur|gif|ico|jpe?g|js|png|svgz?|webp)$ $1.$3 [L]
</IfModule>

While it is difficult to explain the details during the Q&A portion of a session, it would still be something to use instead of modifying files as part of deployment.

Brett,

First of all, thanks for the detailed feedback ;-)

As you've pointed out, require.js is something that needs to be used carefully; if you have 100 JS files, you certainly would not want to build a dependency tree 100 deep. What would make sense is to combine JavaScript into logical packages of 10, maybe 15 files at most, separated into applicable concerns. Those concerns would then be defined as dependencies to be loaded via require.js or similar AMD mechanisms.

iFrames are a great way to leverage multiple CPUs on pages with multiple refreshable elements; dynamic dashboards come to mind as one possible application.

Repaint/reflow is still very much an issue, and the good old tricks work just as well as they did a while back. It is a fair point that browsers have improved lately, but most still suffer from many reflow issues, which can be minimized using the clone or hide/show tricks.
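The hide/show trick mentioned above can be sketched as follows (the function name is mine; in a real page `el` would be a DOM element):

```javascript
// Sketch of the hide/show reflow trick: pull the element out of the layout
// flow, perform all DOM mutations while it is hidden, then restore it, so
// the browser does two reflows in total instead of one per mutation.
function batchUpdate(el, mutate) {
    const previousDisplay = el.style.display;
    el.style.display = "none";          // removes el from layout (one reflow)
    mutate(el);                         // these changes trigger no layout work
    el.style.display = previousDisplay; // reinsert into layout (second reflow)
}

// Usage: batchUpdate(listEl, (e) => { /* append many rows, tweak styles */ });
```

The clone trick is analogous: mutate a detached `cloneNode(true)` copy and swap it in with `replaceChild`, again paying for a single reflow.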

As far as unique filenames go, you raise a very good point. I think I'll add one or two slides showing some specific examples of how to make it happen, since explaining it without a reference is rather tricky.

Anonymous at 13:11 on 29 Oct 2014

Good presentation, lots of information on how to look for bottlenecks and suggestions on how to fix/minimize them.

Quite good presentation. Some issues were overstated (like some CSS intricacies that might be of relevance only in some edge cases), but generally good to know about those things when you stumble on problems.

Anonymous at 16:18 on 29 Oct 2014

Really interesting, a lot of things that I hadn't thought of, some things that I knew about, but was great to hear what should be prioritized.

This talk was outstanding. He had important metrics about how browser performance overhead works, and how to minimize each of them.

This was a great talk and I came away with some ideas to try out. I was told the opposite on some of the suggestions years ago and took it as a rule, so it sounds like it is time to prove/disprove a few of my previous teachings.