By Dave McAnally, Associate Director, Content Solutions
Originally Appeared on EatenByGiants.com
On Wednesday, Google unveiled a new Apache filter set, mod_pagespeed. It's a logical extension of Google's Page Speed plugin for Firefox/Firebug. Rather than simply pointing out all the ways your site drags down load times, mod_pagespeed fixes those issues automatically, eliminating the leg-work of going in and doing it yourself. The fundamental purpose of this tool is to make your website's pages load quicker (50% quicker, if the company's findings are to be believed). That's particularly important for large catalog sites commanding lots of scripts, database calls, CSS and images, all of which wreak havoc on a less-than-stellar connection's ability to render something on your screen. And it's been no secret that Google now factors load time into its ranking algorithm.
How it works:
mod_pagespeed is for Apache 2.2 (earlier versions aren't currently supported). To use this tool you'll need root access to your server, so shared hosting accounts are out of luck (however, GoDaddy is poised to launch the module across its entire network of sites, so its customers have that to look forward to).
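Once you have root access and the module installed, setup amounts to a few lines of Apache configuration. A minimal sketch, using directive names from Google's mod_pagespeed documentation (the module path, cache path, and the specific filters enabled here are illustrative choices, not requirements):

```apache
# Load the module (the .so path varies by distribution/install)
LoadModule pagespeed_module /usr/lib/apache2/modules/mod_pagespeed.so

<IfModule pagespeed_module>
    # Turn rewriting on and use the default "core" filter set
    ModPagespeed on
    ModPagespeedRewriteLevel CoreFilters

    # Opt in to extra filters beyond the core set (example choices)
    ModPagespeedEnableFilters collapse_whitespace,remove_comments

    # Where rewritten resources are cached -- must be writable by Apache
    ModPagespeedFileCachePath /var/mod_pagespeed/cache
</IfModule>
```

After reloading Apache, the module rewrites pages as they're served; no changes to the site's source files are required.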
Overall Benefits of Mod_Pagespeed:
This truly is a case of "what's good for Google is good for you." By reducing the noise-to-signal ratio spiders face when accessing content (eliminating extraneous scripts and CSS from the crawl), Google is also making a smoother ride for its own search engine spiders. It's a logical place for Google to develop web tools, given the Caffeine update and its overall interest in maintaining a fresher index: the easier it is for spiders to access content, the faster and more frequently they'll be able to do so.
The upshot for users is that these tactics actually WILL help pages load quicker. With all the ways of accessing web content (DSL, 3G, 4G, dial-up, Ethernet, etc.), accessibility cannot be ignored. Given where mod_pagespeed trims server load, it should be particularly popular with catalog and aggregator sites (which likely make up the bulk of that 50%-faster sample). Not only does this tool produce efficient pages, it does so on the fly, sparing an IT team the burden of hunting down those efficiencies by hand (though each install still needs to be tailored to its users' needs).
So this is truly a win/win situation. Users get a more efficient experience, and spiders can index content faster.