Multivariate Testing in SEO is Tricky, but a Must

By Nathan Janitz, Natural Search Supervisor, Content Solutions

As tricky as it is, natural search optimization needs to take some notes from paid search and get better at using web analytics to inform optimizations. The whys are simple to answer, but the hows are much harder.

At the beginning of the year, Lily Chiu over at Omniture blogged about her 2009 Optimization Wish List. The first wish on her list is “Agencies getting on board” with multivariate testing. Last year I blogged about Optimizing Paid Search with Web Analytics. While more basic than multivariate testing, it was a start toward making that wish a reality. Shortly after that post went live, Bryson commented that the approach could work with all forms of search (something Omniture has been preaching for years). A few months ago, I shifted my primary responsibility to focus more on the SEO and web analytics side of search.

While fairly easy in paid search, implementing web analytics-based optimizations, including multivariate testing, in SEO tends to be a little trickier. It comes back to two simple problems: 1) a user segmentation dilemma, and 2) slightly conflicting goals.

When optimizing a page for natural search, we optimizers have to worry about two users: a spider and a real person. In a perfect world, the needs of search spiders would be perfectly in line with those of the actual end user. In reality, spiders interact with content for a completely different reason than end users do, yet both are equally important.

A search spider’s job is to look at the content and data about a site and then determine its level of relevancy to a specific query. At the end of the day, it lives to help someone determine which site is the most qualified. It has a simple but important job. Because of that importance, we as optimizers try to do everything in our power to cater to the spider’s needs (e.g., turning Flash sites into HTML). But on our best day, we will never be able to convince the spider to buy anything on our site.

That leads us to the end user: the person we hope buys something. Once we get to that coveted first-position ranking, we then have to worry about making sure the page is still appealing to our potential customer. Again, in a perfect world those would be the same. And again, what “should” happen and what “does” happen are rarely the same.

So, how do we test for user experience (different calls to action, content presented in different ways, etc.) while still catering to the needs of a spider? The quick answer is progressive enhancement with a twist.

Here is the breakdown: you have to show the spider the content one way while being able to test the presentation of the SAME content (or something extremely close to it) without cloaking your site (no black-hat tactics here). Start by developing a page that is HTML-based and has plenty of optimized text and other features. The idea is that when a mobile browser, text browser, or spider hits the page, it can interact with the content perfectly.
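To make that concrete, here is a rough (and entirely hypothetical) example of what the base page might look like; the product name, ids, and copy are just placeholders:

    <!-- Hypothetical base page: all of the optimized copy lives in plain HTML,
         so spiders, text browsers, and mobile browsers can read it as-is. -->
    <div id="product-offer">
      <h1>Blue Widget 3000</h1>
      <p id="offer-copy">
        The Blue Widget 3000 ships free and installs in under five minutes.
      </p>
      <a id="call-to-action" href="/checkout">Buy the Blue Widget 3000</a>
    </div>

Everything a spider needs to judge relevancy is already sitting in that markup; the testing layer described next only changes how it is dressed up.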

Next, find a multivariate testing tool that uses JavaScript to overlay itself on the base content (not replace it). Again, this is the principle of progressive enhancement. With the proper tools, you can optimize the content (body text, titles, descriptions, etc.) separately from the way it’s presented to the potential customer (Do we need a video? Does the button belong on the left or the right? More color or less? Etc.).
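As a rough sketch of the principle (this is not any particular vendor’s tool, just an illustration built on the hypothetical base page above), the overlay might look something like this:

    // Hypothetical overlay sketch -- not any particular vendor's API.
    // The content itself stays in the HTML; the script only changes presentation.
    var variant = Math.floor(Math.random() * 2); // 0 = control, 1 = test
                                                 // (a real tool would persist this in a cookie)
    window.onload = function () {
      var cta = document.getElementById('call-to-action'); // id from the base page above
      if (variant === 1) {
        cta.style.cssFloat = 'right';        // test: move the button to the right
        cta.style.backgroundColor = '#c00';  // test: stronger color
      }
      // Either way, the text and links the spider indexed are untouched.
    };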

The downside is that this will require even more coding on the page, which could be a resource drain. Also, the JavaScript implementation isn’t spider-friendly (one of the reasons this method works). If you don’t take the time to minimize the multivariate program’s code on the website, you risk slowing the spider’s crawl.
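One common way to keep that code out of the spider’s way, sketched below with a made-up script URL, is to minify it and load it asynchronously so the crawlable HTML is never held up:

    <!-- Hypothetical async include for the testing script (the URL is made up). -->
    <script type="text/javascript">
      (function () {
        var s = document.createElement('script');
        s.src = '/js/mvt-overlay.min.js'; // minified testing code
        s.async = true;                   // don't block parsing of the crawlable HTML
        document.getElementsByTagName('head')[0].appendChild(s);
      })();
    </script>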

Once it is set up, the same KPIs matter. Optimize away from high bounce rates and toward high conversion rates. Try to segment users for further testing.
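For the sake of illustration (the field names below are placeholders, not a specific analytics package’s schema), the per-variant math is the same as it would be anywhere else:

    // Hypothetical per-variant rollup; the field names are placeholders.
    function scoreVariant(v) {
      return {
        name: v.name,
        bounceRate: v.singlePageVisits / v.entries, // optimize away from high values
        conversionRate: v.conversions / v.visits    // optimize toward high values
      };
    }

    // e.g. scoreVariant({ name: 'B', entries: 500, singlePageVisits: 210,
    //                     visits: 480, conversions: 19 });

Compare the variants on those numbers, keep the winner, and feed the segments back into the next round of tests.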

Here is a slight kicker. In paid search, you can take two keywords that are very similar and send them to completely different pages. While you can do this in SEO, it is very difficult to get two pages to rank for something like “page” vs. “pages.” So while paid search gives you the ability to keep keywords running longer and eventually cut the ones that don’t work, SEO is harder to manage at that level. Unfortunately, you will have to live with some keywords simply performing poorly.

Also, take a look at the testing strategies for each user (spider included). You don’t want to get hit with a cloaking penalty, so make sure to sync up the content for each user after each test. If you find that using a verb a certain way helps sell a product better, roll it out to the rest of your users. Again, the idea is NOT to generate two different forms of content, but to test how each user engages with HOW the content is presented. And if you push the envelope and become too liberal in separating the two users, you will get hammered by the engines, and your testing will be for naught. Like I said, this isn’t the easiest integration.

Integrating multivariate testing with natural search isn’t as easy as it is with direct traffic, paid search, or even display. But like everything, if you take the time to implement it right, you will be able to increase position within the engines while still improving site performance.

Natural search moves at a slower pace than any other online medium, so take the time to set up your plan correctly. Implementing it correctly will give you a big jump on your competitors; shortcutting your plan could cause you to drop in the rankings. Happy hunting!
