Ruby on Rails and How Scaffolding Architecture Stacks Up to Spiders

One of the emerging technologies that has become something of a darling among open source programmers is Ruby on Rails. Originally released in 2004, it gained a cult-like following and has since begun to make its way into more mainstream development. Simply put, it’s an application framework written in Ruby that aims to simplify the development and deployment of web applications. RoR eliminates many of the time-consuming parts of development (there’s no compile-and-deploy cycle, for one) by favoring convention over configuration. The end result is that custom applications that might have taken a month to develop can now be launched, soup to nuts, in a matter of a couple of days. It's being embraced by a few popular websites and applications you may already use, such as Twitter, Basecamp and Yellowpages.com. Its ability to work with a wide range of hosting setups (multiple databases and cPanel support) is helping it spread, and with a hefty investment from Benchmark, it certainly seems poised to grow in popularity.
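To give a flavor of the "convention over configuration" idea, here is a minimal sketch (the Article model and controller are hypothetical names, not taken from any real site): Rails infers the database table, the URL and the view template from the class names alone, with no mapping files to maintain.

    # app/models/article.rb -- Rails assumes an "articles" table and builds
    # column accessors automatically from the class name.
    class Article < ActiveRecord::Base
      validates_presence_of :title
    end

    # app/controllers/articles_controller.rb -- by convention, the "show"
    # action renders app/views/articles/show.html.erb with no extra wiring.
    class ArticlesController < ApplicationController
      def show
        @article = Article.find(params[:id])
      end
    end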

As RoR continues to grow, I’ve noticed a few things that could create natural search and promotion issues:

- JavaScript and AJAX reliant – While very functional, many vital elements of what ends up being the interface are effectively invisible to spiders. Content meant to be crawled will require some additional development.

- Extremely Load Sensitive – While many hosts aim to provide robust hosting to support RoR, the fact remains that a Rails app runs as a long-lived application server process, and a fairly heavy one at that. Under heavy traffic, that can pose a problem for your visitors and for search engine spiders alike.

- Apps may not port cleanly off their native site (hosting support is far from universal). Ergo, developing widgets with RoR may not be very practical (as opposed to plain XHTML).

- Built-in URL mapper – This is a great feature of RoR, but you HAVE to use its URL mapper, with no ability to substitute a third-party tool or your server’s own (e.g. mod_rewrite); see the sketch just below this list. It seems fairly straightforward, but since it isn’t tied to the server technology, it could pose some potential cloaking issues.
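
For reference, here is roughly what that built-in URL mapper looks like (Rails 2.x syntax; the route and model names are made up for illustration). Used carefully, it produces the clean, keyword-friendly URLs spiders prefer without touching the web server's rewrite rules.

    # config/routes.rb -- a sketch of Rails' built-in routing.
    ActionController::Routing::Routes.draw do |map|
      # RESTful routes for a hypothetical articles resource:
      # /articles, /articles/42, and so on.
      map.resources :articles

      # A named, keyword-rich route: /reviews/ruby-on-rails is sent to
      # ArticlesController#show with params[:slug] = "ruby-on-rails".
      map.review 'reviews/:slug', :controller => 'articles', :action => 'show'
    end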

This certainly isn’t meant to deter anybody from embracing RoR (or any Model-View-Controller framework); however, there are considerations to be made. There are always options to ensure that your site can be seen by spiders (as well as by your entire user base), but as with any new technology, the pros and cons, along with the fallback options, need to be weighed.
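As one example of the kind of option I mean (a sketch only, reusing the hypothetical Article resource from above): a single Rails action can serve the full, crawlable HTML page to spiders and non-JavaScript visitors while answering AJAX calls with a lightweight fragment.

    # app/controllers/articles_controller.rb -- one way to keep AJAX-driven
    # content visible to spiders: respond to both formats from one action.
    class ArticlesController < ApplicationController
      def show
        @article = Article.find(params[:id])
        respond_to do |format|
          format.html                                       # full page; crawlable
          format.js { render :partial => 'article_body' }   # fragment for AJAX calls
        end
      end
    end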

Posted by: Dave McAnally, Product Specialist, Natural Search

1 comment:

Anonymous said...

Thanks for the post -- developers often don't fully consider SEO when choosing a platform, which is a problem. It's also great to see an SEO who considers nuts-and-bolts issues when making recommendations.

I would clarify a couple things, from a Rails developer's perspective:

1) AJAX can be problematic for SEO, absolutely. However, Rails doesn't demand that anyone use AJAX. It's a design choice, and the AJAX/SEO issue affects Rails no more than it affects stuff like .NET or CakePHP, which also encourage AJAXy design.

2) Rails' built-in "routes" system doesn't really conflict with mod_rewrite; many large sites use both at the same time.

3) "Load-sensitive"... ok, I can see why you'd say that. Deploying under heavy load is complicated, and it's definitely a weakness. I'd only want to clarify that it's still possible to deploy big, robust, fast sites in Rails.

Great topic for a post, and thanks again! I've got a response posted here as well.

 
Copyright © 2008 Resolution Media, Inc. All rights reserved.