By Dave McAnally, Associate Director, Content Solutions
You’ve probably heard people tell you that ‘Google doesn’t weigh things like they used to’ and that these days it’s all about how social your digital footprint is (even though most professionals don’t actually think that). Well, I don’t dispute that new technologies may change how we need to think about basic SEO principles, but I will suggest that some of those principles are just as vital today as they ever were. This brings me to a quasi-case study to share. We recently implemented an XML sitemap on a site that offers about as close to a ceteris paribus scenario as one could reasonably expect, which makes it a great candidate for measuring how much impact fully indexing a site has on its ability to drive traffic.
The background: This is a site with both high-level product content and specific retail/service location pages with highly geo-specific content. The site has a few quirks that have likely made pertinent content difficult for engines to find. It has Flash elements with occasionally relevant text content, and it uses a controller-based CMS hierarchy, where ‘directories’ in the URL aren’t really directories but controller parameters telling the CMS how to display the content. The net result of this implementation is that there is technically only one static page on the site despite thousands of unique URLs existing. Even so, it appeared Google was able to parse through around 250 URLs (the site has unique content for multiple geo-specific locations).
What we did: In mid-October, using a series of tools to crawl the site and build an XML feed, we uploaded a basic sitemap (no priorities or anything fancy, just the full meal deal of around 3,800 URLs). We then pointed Google to the sitemap location via Webmaster Tools and added an auto-discovery line to the robots.txt file. Then we waited. (It bears mentioning that practically nothing else was implemented on the site during this period, nor is fall a seasonal time for this website.) Within two days, Google was reporting that it was indexing 2,519 URLs.
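For readers who haven’t set one up before, the two pieces involved are small. A sitemap entry of the kind we generated looks like this (the domain and paths here are placeholders, not the client’s actual URLs):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/locations/chicago-il</loc>
  </url>
  <!-- ...one <url> entry per location page, roughly 3,800 in total -->
</urlset>
```

The auto-discovery piece is a single line in robots.txt pointing engines at that file:

```
Sitemap: http://www.example.com/sitemap.xml
```

That’s the whole deployment; no code changes to the CMS itself were required.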
The bulk of these new URLs were pages for specific locations, which put us in the game for a whole new realm of geo-specific queries with relevant content. The Resolution Media Analytics Team developed the following chart tracking the effects on traffic (note: this is a truncated period of time for the sake of display):
This is a chart showing the visits from natural search engines over the period of a couple months.
The dotted lines kick in at the approximate date the XML sitemap was deployed. Based on a series of moving averages and regression, the red dotted line indicates how traffic fluctuations for this site should have continued unabated. The black lines are our 95% confidence interval (essentially, we are 95% sure anything that happens will fall within this space). The solid black line is our actual data from web analytics. As you can see, the actual data shoots right to the top of the confidence boundary and actually trends above it for periods of time. We interpret this as indicating that an effect outside of what we would normally expect has occurred (in this case, the deployment of the XML sitemap).
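The trend-versus-actual check above can be sketched in a few lines: fit a regression to pre-launch traffic, project it forward with a 95% band, and flag post-launch days that land above the band. This is a simplified illustration of the approach, not our production model, and the visit numbers below are made up for demonstration:

```python
# Fit a linear trend to pre-launch daily visits, project it forward
# with a 95% confidence band, and count post-launch days above the band.
# All traffic figures here are illustrative, not the client's data.

def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

def forecast_band(xs, ys, future_xs, z=1.96):
    """Project the pre-period trend; return (lo, mid, hi) per future day."""
    slope, intercept = linear_fit(xs, ys)
    resid = [y - (intercept + slope * x) for x, y in zip(xs, ys)]
    se = (sum(r * r for r in resid) / (len(xs) - 2)) ** 0.5
    return [(intercept + slope * x - z * se,
             intercept + slope * x,
             intercept + slope * x + z * se) for x in future_xs]

# Illustrative pre-launch visits: a mild trend plus alternating noise.
pre_days = list(range(30))
pre_visits = [200 + 0.5 * d + (-5 if d % 2 else 5) for d in pre_days]

# Post-launch actuals jump well above the projected trend.
post_days = list(range(30, 37))
post_visits = [260, 270, 275, 280, 278, 285, 290]

band = forecast_band(pre_days, pre_visits, post_days)
outside = [v > hi for (lo, mid, hi), v in zip(band, post_visits)]
print(sum(outside), "of", len(outside), "post-launch days exceed the 95% band")
```

When every post-period observation sits above the upper band, as in our chart, the natural reading is that something other than ordinary fluctuation moved the series.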
The following is how this has affected the moving average of traffic on the site:
By November 9th, this site had well over 3,500 URLs in both Google’s and Bing’s indexes. As you can see, we have a not-insignificant lift that falls well outside the norm of what we’ve been seeing for the past few months (again, I have this condensed for display purposes). Year over year, we do not see any seasonal spikes for this client at this time, so the change clearly falls outside the norm.
Is this the end-all-be-all of SEO for this client? Certainly not! We still have many things we need to do. However, ignoring a basic tactic like this simply leaves money on the table. The important takeaway is that things that may seem trivial now can still have significant impacts on traffic. We hear that ranking signals are changing and that certain things aren’t necessary to be competitive in natural search, but this definitely shows that basic indexing tactics are still effective in driving traffic.