Server Admins need input on that CMS decision too!

Rand Fishkin had a great post over on SEOmoz on content management systems last week. The crux of the article is that regardless of your intentions, a decent CMS is going to be preferable to a static website at some point. He touches primarily on free CMS tools, which are great, but in reality, I'm hesitant to recommend them to high-profile brands/sites. Because they're open source and so widely deployed, their vulnerabilities are well documented and actively targeted. It's not that any of them are bad or unable to perform the tasks they say they can, but when there are entire communities devoted to coming up with ways to crack Joomla, I wouldn't recommend it to a high-profile client. There are many premium CMS tools that are geared towards large enterprises and built to suit their needs. Investing in a rock-solid proprietary CMS isn't as cheap as, say, purchasing MS Office for a workstation. But a good one replaces work that a full-time employee would have done less than 10 years ago while taking your site's functionality to all new levels, so it's more than worth the money.

Rand's article outlines a lot of core functions of a CMS, but the capabilities of modern CMS tools go way beyond that. For example, we have clients that use their CMS not only to serve content, but to determine where a visitor is from and tailor that content to their geographic locale (IP-based, so no initial login is required). From there, it's also common that an e-commerce layer is built in. This functionality is essential to a global business; however, as with the basic CMS functions, it poses hurdles for spiders that need to be addressed. And they aren't always obvious, either (especially at the scale of hosting we're usually looking at). I'm talking specifically about how these robust systems can eat up server/network resources when they aren't configured properly. The reason is simple: a CMS opens many more database connections per session than simple page requests would. Pile on the functionality many premium tools offer and those connections keep adding up. Throw in e-commerce (usually hitting a different database entirely) and a few million visits per day/week/month, and there's quite a bit going on! Don't get me wrong; companies build very elaborate server farms for this exact reason. Nevertheless, we still see times when some of the largest sites in the world aren't able to keep up with the request volume, and this is one of the reasons why. Basically, what I'm trying to say is that the more functionality you bake into your CMS, the more work it creates for your hardware.
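To put a toy example behind that claim, here's a minimal sketch in Python of how a single page view fans out into multiple database round trips once geo-targeting and e-commerce are layered in. The table names and lookup logic are hypothetical, and in-memory SQLite stands in for the real, networked database servers a production site would hit; this isn't any particular CMS's schema, just an illustration of the connection math:

```python
import sqlite3

# Hypothetical setup: two separate databases, as described above --
# one for CMS content, one for e-commerce. In-memory here so the
# sketch is runnable; a real deployment would hit dedicated DB servers.
cms_db = sqlite3.connect(":memory:")
cms_db.executescript("""
    CREATE TABLE geoip (ip_prefix TEXT, region TEXT);
    CREATE TABLE content (region TEXT, page TEXT, body TEXT);
    INSERT INTO geoip VALUES ('203.0.', 'APAC'), ('198.51.', 'NA');
    INSERT INTO content VALUES ('APAC', 'home', 'Welcome (APAC edition)'),
                               ('NA',   'home', 'Welcome (NA edition)');
""")

ecom_db = sqlite3.connect(":memory:")
ecom_db.executescript("""
    CREATE TABLE pricing (region TEXT, sku TEXT, price REAL);
    INSERT INTO pricing VALUES ('APAC', 'WIDGET-1', 11.50),
                               ('NA',   'WIDGET-1', 9.99);
""")

def handle_request(visitor_ip: str, page: str) -> str:
    """One page view = several database round trips."""
    # Round trip 1: IP-based geolocation (no login required).
    region = cms_db.execute(
        "SELECT region FROM geoip WHERE ? LIKE ip_prefix || '%'",
        (visitor_ip,),
    ).fetchone()[0]

    # Round trip 2: locale-tailored content.
    body = cms_db.execute(
        "SELECT body FROM content WHERE region = ? AND page = ?",
        (region, page),
    ).fetchone()[0]

    # Round trip 3: e-commerce data from an entirely different database.
    price = ecom_db.execute(
        "SELECT price FROM pricing WHERE region = ? AND sku = ?",
        (region, "WIDGET-1"),
    ).fetchone()[0]

    return f"{body} | WIDGET-1: {price}"

print(handle_request("203.0.113.7", "home"))
```

Three round trips, against two separate databases, for a single page view. Multiply that by every feature a premium CMS bolts on and a few million visits a month, and it's easy to see where the hardware strain comes from.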

Why does this matter for natural search? Well, aside from the obvious compromise in user experience (who wants to wait around for pages to load?), spiders don't like waiting around. They REALLY don't like 500-level errors either. Last year, SEOmoz's poll of various SEOs showed a consensus that server inaccessibility is one of the most damaging events for optimization.
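If you're curious whether spiders are hitting those errors on your own site, your access logs will tell you. Here's a rough sketch, assuming the standard Apache/NCSA combined log format and a hypothetical access.log path (swap in your own, and adjust the bot list to taste):

```python
import re
from collections import Counter

# Matches the request, status, and user-agent fields of the Apache/NCSA
# combined log format; adjust the pattern if your server logs differently.
LOG_LINE = re.compile(
    r'"[A-Z]+ (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

# Common crawler user-agent substrings; extend as needed.
BOTS = ("Googlebot", "Slurp", "msnbot")

errors = Counter()
with open("access.log") as log:  # hypothetical path -- use your own
    for line in log:
        m = LOG_LINE.search(line)
        if not m:
            continue
        if m.group("status").startswith("5") and any(
            bot in m.group("agent") for bot in BOTS
        ):
            errors[(m.group("status"), m.group("path"))] += 1

# The URLs your crawlers are failing on, worst first.
for (status, path), count in errors.most_common(10):
    print(f"{count:6d}  {status}  {path}")
```

Run that against a day or two of logs and you'll know exactly which URLs are serving errors to which crawlers.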

So listen to your server admins when they say they need more power! If you are upgrading to a professional CMS tool, your resource requirements are going to change. Understand the traffic volume your site gets and anticipate accordingly; even a rough estimate like the sketch below is a good start. Most premium CMS providers will be more than able to help you assess this. After all, the last thing anybody (bots, CMS account reps, or otherwise) wants is downtime!
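On the "anticipate accordingly" point, even a crude back-of-the-envelope estimate frames the conversation with your admins and your CMS vendor. All the numbers below are made up for illustration; plug in your own analytics figures:

```python
# Rough capacity estimate -- every number here is illustrative, not a benchmark.
monthly_visits  = 3_000_000  # from your analytics
pages_per_visit = 4
db_queries_page = 12         # geo lookup + content + personalization + e-commerce
peak_factor     = 5          # peak traffic vs. the flat average

avg_requests_per_sec = monthly_visits * pages_per_visit / (30 * 24 * 3600)
peak_requests_per_sec = avg_requests_per_sec * peak_factor
peak_db_queries_per_sec = peak_requests_per_sec * db_queries_page

print(f"Average page requests/sec: {avg_requests_per_sec:.1f}")
print(f"Peak page requests/sec:    {peak_requests_per_sec:.1f}")
print(f"Peak DB queries/sec:       {peak_db_queries_per_sec:.0f}")
```

If the peak query rate that falls out the bottom makes your DBA wince, that's a conversation to have before the migration, not after.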

Posted by: Dave McAnally, Product Specialist, Natural Search
