For some reason I don’t think that Chrome users are spending that long loading the page!
If you want to speed up your page load times, run your webpage through Google's PageSpeed Insights, use CSS3 instead of images whenever practicable, serve gzipped content, and grind Freeway's output graphics through ImageOptim.
I'm in the red Google PageSpeed zone, even with Deflate code in my .htaccess file (taken from the .htaccess post on Caleb Grove's excellent blog).
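For anyone following along, the Deflate code in question is a handful of mod_deflate directives in .htaccess; the exact MIME-type list in Caleb Grove's post may differ, but a typical fragment looks roughly like this:

```apache
# Compress text-based responses before sending them (mod_deflate).
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/plain text/css
  AddOutputFilterByType DEFLATE text/javascript application/javascript application/x-javascript
  AddOutputFilterByType DEFLATE application/json image/svg+xml
</IfModule>
```

Images like PNG and JPEG are deliberately left off the list: they are already compressed, so gzipping them mostly wastes CPU.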
In 2015, with layouts trending toward fewer, much longer pages, and with internet access speeds rising, are rankings in the 45 to 65 range out of 100 worth the extra effort to compress?
If so, is the following measure still one to take?
Grind Freeway's output graphics through ImageOptim
If so, the following questions become relevant:
It’s done to the images Freeway outputs and uploads to the server, right?
Therefore one must “regrind” images every time one re-uploads a Freeway file, right?
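Right on both counts, as far as I know: Freeway regenerates its graphics on each publish, so any optimization has to be redone before uploading. A "regrind" step can be scripted; this is just a sketch, assuming the ImageOptim Mac app is installed and that `$HOME/Sites/mysite` is your Freeway output folder (both are placeholders, adjust for your setup):

```shell
#!/bin/sh
# Re-optimize every image Freeway exported, before uploading.
optimize_exported_images() {
  dir="$1"
  [ -d "$dir" ] || { echo "no such folder: $dir"; return 0; }
  # Freeway exports PNG/JPEG/GIF graphics alongside the HTML.
  find "$dir" -type f \( -name '*.png' -o -name '*.jpg' -o -name '*.gif' \) |
  while read -r img; do
    echo "optimizing: $img"
    # Hand each file to the ImageOptim app (uncomment on a Mac):
    # open -a ImageOptim "$img"
  done
}

optimize_exported_images "$HOME/Sites/mysite"
```

Running this after each publish and before each upload keeps the regrind from being forgotten.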
The scores are quite a bit better: for page speed I got an A, two B's, three C's, and one D (66%). For YSlow I got an A and the rest B's.
The three areas in which the page rankings get killed are “Serve Scaled Images”, “Specify Image Dimensions” and to a medium extent “Defer parsing of JavaScript”.
It looks like using retina-ready flexible images is inherently slowing. I don't know if there's anything I can do about deferring the parsing.
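For what it's worth, two of those three rankings have fairly mechanical fixes. "Specify Image Dimensions" just means giving each img explicit width and height attributes, and "Defer parsing of JavaScript" can often be addressed with the script defer attribute, though how much of this Freeway lets you control in its generated markup is another question. A markup sketch (filenames are made up):

```html
<!-- Explicit dimensions let the browser lay out the page before the image loads -->
<img src="banner.png" width="320" height="120" alt="Banner">

<!-- defer downloads the script in parallel and runs it only after HTML parsing -->
<script src="site.js" defer></script>
```

"Serve Scaled Images" is the awkward one with retina/flexible layouts, since serving a 2x image to be displayed at 1x size is exactly what that check penalizes.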
Thanks for pointing out this alternative to Google PageSpeed! The breakdown of what is wrong is much more helpful. I'm wondering if the rankings are not only more favorable but more accurate too.
I too find GTMetrix quite useful but don’t get too hung up on the numbers. Remember that these kinds of services are a good guideline but are not an absolute indicator of performance. It’s very easy to want to chase down every missing % point.