Getting Ready for the Spiders

Outdoor-lifestyle retailer Cabela’s, Sidney, Neb., wanted the right content to show up on search results pages. But search-engine “spiders” were having difficulty reaching its Web site’s individual pages.

Thanks to a new search-engine optimization application, those spiders can now read and understand the site’s information more easily.

When online consumers use popular search engines such as Google and Yahoo, robotic crawlers go out and find, or “spider,” a Web site, retrieving its pages and classifying their content. The spiders determine what each page is about and decide which pages are most relevant to each search.
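The crawl step described above can be sketched roughly as follows. This is a minimal illustration of how a spider extracts the links it will visit next, not any search engine's actual code; the sample page content is made up.

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects href targets from anchor tags, as a crawler would."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def spider_page(html):
    """Return the outbound links a spider would queue from one page."""
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

page = '<a href="/fishing/rods">Rods</a> <a href="/hunting/optics">Optics</a>'
print(spider_page(page))  # ['/fishing/rods', '/hunting/optics']
```

A real crawler repeats this loop: fetch a page, index its text, queue its links. If the links are long, parameter-laden URLs, as Cabelas.com's were, the spider may not follow them reliably.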

“Cabelas.com’s URLs were very long and complex, making it difficult for spiders to move through our site,” noted Derek Fortna, Cabela’s Internet marketing program manager.

To overcome this, Cabela’s used a third-party solution that created a separate spiderable copy of Cabelas.com. This copy was based on a weekly data export that included all product information (pricing, images, descriptions, etc.) found on Cabelas.com.

“This secondary Web site, which replaced longer URLs with shortened versions throughout each re-created Web page, provided easy access for spiders from Google and other search engines to reach our content and index it within their databases. As a result, potential customers could find us in search results," Fortna said.
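The kind of rewrite Fortna describes, collapsing a long, parameter-heavy URL into a short spiderable path, can be sketched like this. The URL format and parameter names here are hypothetical, not Cabela's actual URL scheme.

```python
from urllib.parse import urlparse, parse_qs

def shorten_url(long_url):
    """Rewrite a parameter-heavy product URL into a clean, short path.

    Assumes the query string carries 'category' and 'item' parameters;
    this scheme is invented for illustration.
    """
    query = parse_qs(urlparse(long_url).query)
    category = query.get("category", ["misc"])[0]
    item = query.get("item", ["unknown"])[0]
    return f"/{category}/{item}.html"

long_url = "/browse.cmd?category=fly-fishing&item=12345&sessionid=ab12cd"
print(shorten_url(long_url))  # /fly-fishing/12345.html
```

The shortened path drops session IDs and tracking parameters, giving each product one stable address that a spider can follow and index.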

The solution presented a series of challenges, however.

“Transparency between our native Web site and the search-engine optimization [SEO] version was an issue,” noted Fortna. “Customers would move from the SEO version, which looked different than our native Web site, to our native site. The difference was confusing for our customers and not good for our brand.”

The SEO version of Cabelas.com was only updated weekly, causing it to be out of sync with the native Web site’s daily updates, which affected the accuracy of its inventory and pricing information, he said.

There was a bigger problem: When search engines spidered the SEO version’s item pages and customers then tried to access them, “They were often redirected to the page on our native site without seeing any of the SEO work that may have been added to the item pages,” Fortna said.

This mismatch, known as “cloaking,” occurs when a search engine spiders different content than what consumers see when they click through from a results page.
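In pseudocode terms, cloaking amounts to branching on who is asking. A hypothetical sketch of the pattern, shown only to illustrate the problem (search engines penalize sites that do this):

```python
# Illustrative user-agent tokens for well-known spiders.
SPIDER_AGENTS = ("Googlebot", "Slurp")

def serve_page(user_agent, seo_page, customer_page):
    """Cloaking: return different content depending on whether the
    requester looks like a search-engine spider."""
    if any(bot in user_agent for bot in SPIDER_AGENTS):
        return seo_page
    return customer_page

print(serve_page("Googlebot/2.1", "keyword-rich copy", "redirect to native site"))
# keyword-rich copy
```

In Cabela's case the divergence was unintentional: spiders indexed the SEO item pages while customers were redirected past them, so the two audiences saw different content.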

To improve the synchronization of its native Web site with the SEO Web-site copy, Cabela’s tapped long-time partner, Netconcepts, Madison, Wis. The retailer joined forces with Netconcepts in 2001, when the vendor began building e-commerce sites for subsidiary companies including CorporateOutfitterCabelas.com, VanDykes.com and VanDykes Taxidermy. Cabela’s implemented Netconcepts’ Gravity Stream solution in July 2006.

“The solution provides a spiderable site. It is essentially the native site changed automatically via a proxy,” Fortna said. “There is no physical second site to keep updated, and the spiders see the same content that customers do when they click through.”
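A proxy-based approach like the one Fortna describes rewrites pages on the fly rather than maintaining a second physical site. A minimal sketch, with a hypothetical long-to-short URL map; the actual Gravity Stream implementation is not public:

```python
# Hypothetical mapping from native long URLs to short spiderable ones,
# maintained by the proxy rather than by a duplicate site.
URL_MAP = {
    "/browse.cmd?category=optics&item=77": "/optics/77.html",
}

def proxy_rewrite(html):
    """Rewrite long native-site URLs to short ones as the page passes
    through the proxy. The content itself is unchanged, so spiders and
    customers see the same page."""
    for long_url, short_url in URL_MAP.items():
        html = html.replace(f'href="{long_url}"', f'href="{short_url}"')
    return html

native = '<a href="/browse.cmd?category=optics&item=77">Binoculars</a>'
print(proxy_rewrite(native))
# <a href="/optics/77.html">Binoculars</a>
```

Because the proxy transforms the live site on every request, there is no weekly export to fall out of sync, and no separate copy whose look and feel can diverge from the native site.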

However, the natural search program within Gravity Stream, which does not include traffic from the core brand terms, has the lowest conversion rate out of all its marketing programs, Fortna noted.

“But overall, the large increase in un-branded traffic we’ve seen in our Gravity Stream program has also driven sales substantially on the branded site,” he said.

Clear visibility into how natural-search click-throughs converted into sales made the results easy to measure. In 2007, Cabela’s saw a 140% increase in visits and a 70% jump in sales.

Fortna said boosting SEO and building a natural-search channel are critical to survival in today’s online marketplace.

“Retailers need solid reporting in place so that they know where they stand and can make educated decisions going forward,” Fortna said.
