Tech Guest Viewpoint: Improving Customer Experiences with Bots

2/5/2016

We often talk about how to avoid, deter and block bots. But much opportunity lies in figuring out how to distinguish between good and bad bots, and in understanding how that distinction changes across applications and environments, especially within the retail industry.



For many retailers, bots account for as much of their traffic as human users, if not more. In fact, on a typical day, Akamai sees over 10 billion requests from over 60 million bots. Some of these bots, such as search engine crawlers, are helpful, but many are harmful. Bots consume bandwidth and server resources and can degrade your site’s performance, which in turn hurts sales and conversion. It would therefore seem logical for retailers to simply block all bot traffic and conserve their resources for real human consumers. The solution, however, is not so simple, because there are many different kinds of bots, some of which are essential to your site’s success.



By definition, a bot (short for "robot") is a program that operates as an agent for a user or another program, and it often simulates human activity. On the Internet, the most ubiquitous bots are the programs, also called spiders or crawlers, that access websites and gather content for search engine indexes. In fact, there are good bots like Google’s Web crawler, indifferent bots such as commercial shopping engines, and bad bots like those that scrape and steal content. Because bots and their impact vary depending on industry and intent, a retailer’s solution has to be more nuanced than simply blocking all non-human traffic.
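To make that distinction concrete, here is a minimal Python sketch of the kind of first-pass classification a bot-management layer might apply. The bot names and scraper markers are hypothetical, and a production system would also rely on behavioral and network signals rather than the user agent alone.

```python
# Toy classifier: buckets a request by user-agent string into the three
# categories described above. Names and markers are illustrative only.
GOOD_BOTS = {"Googlebot", "Bingbot"}                 # search engine crawlers
INDIFFERENT_BOTS = {"ShopBot", "PriceCompareBot"}    # hypothetical shopping engines
BAD_BOT_HINTS = ("scraper", "harvest", "copier")     # hypothetical scraper markers

def classify_bot(user_agent: str) -> str:
    """Return 'good', 'indifferent', 'bad' or 'unknown' for a user agent."""
    ua = user_agent.lower()
    if any(name.lower() in ua for name in GOOD_BOTS):
        return "good"
    if any(name.lower() in ua for name in INDIFFERENT_BOTS):
        return "indifferent"
    if any(hint in ua for hint in BAD_BOT_HINTS):
        return "bad"
    return "unknown"

print(classify_bot("Mozilla/5.0 (compatible; Googlebot/2.1)"))  # -> "good"
```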



By implementing methods to manage bots – rather than just block them – retailers can improve customer experiences. For example:



Keep the good bots: Google and Bing operate the most obvious bots that retailers want accessing their content. If you want to pave the way for these crawlers, create rules that speed them through and ensure that bots like Googlebot crawl the same content your users see. This helps you avoid SEO cloaking issues that could negatively affect your search results.
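One practical step before whitelisting a "good" bot is confirming it is who it claims to be, since bad bots routinely spoof Googlebot's user agent. Google documents a reverse-then-forward DNS check for this; the Python sketch below is a minimal illustration, not an Akamai-specific feature.

```python
import socket

def is_verified_googlebot(ip_address: str) -> bool:
    """Verify a crawler claiming to be Googlebot via reverse/forward DNS.

    The reverse lookup of the client IP must resolve to googlebot.com or
    google.com, and the forward lookup of that hostname must map back to
    the same IP address.
    """
    try:
        hostname, _, _ = socket.gethostbyaddr(ip_address)
        if not hostname.endswith((".googlebot.com", ".google.com")):
            return False
        # Forward-confirm: the hostname must resolve back to the same IP.
        _, _, resolved_ips = socket.gethostbyname_ex(hostname)
        return ip_address in resolved_ips
    except (socket.herror, socket.gaierror):
        return False

# Example: only verified search-engine crawlers get the fast path,
# and they receive the same pages human visitors see (no cloaking).
```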



Improve performance: Many commercial shopping engines, online travel agents and partner bots may need to crawl your site, but they can also cause performance problems if they are not properly rate limited. One option is to slow or delay these bots while still allowing them to crawl, capping their requests at an acceptable threshold and giving retailers more control over the impact on their site’s performance.
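As an illustration of that kind of throttling, the sketch below implements a simple per-bot token bucket in Python. The rate, burst size and bot identifier are hypothetical, and a CDN or bot-management product would normally enforce this at the edge rather than in application code.

```python
import time
from collections import defaultdict

class BotRateLimiter:
    """Minimal token-bucket limiter keyed by bot identity (e.g. user agent).

    Known partner or shopping-engine bots are slowed to an acceptable
    request rate instead of being blocked outright.
    """

    def __init__(self, rate_per_sec: float = 2.0, burst: int = 10):
        self.rate = rate_per_sec                         # tokens refilled per second
        self.burst = burst                               # maximum bucket size
        self.tokens = defaultdict(lambda: float(burst))  # current tokens per bot
        self.last_seen = defaultdict(time.monotonic)     # last request time per bot

    def allow(self, bot_id: str) -> bool:
        now = time.monotonic()
        elapsed = now - self.last_seen[bot_id]
        self.last_seen[bot_id] = now
        # Refill tokens for the elapsed time, capped at the burst size.
        self.tokens[bot_id] = min(self.burst, self.tokens[bot_id] + elapsed * self.rate)
        if self.tokens[bot_id] >= 1.0:
            self.tokens[bot_id] -= 1.0
            return True
        return False  # over threshold: delay, queue or defer this request

limiter = BotRateLimiter(rate_per_sec=1.0, burst=5)
if not limiter.allow("ShoppingEngineBot/2.1"):
    pass  # e.g. respond with 429 or serve the request more slowly
```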



Serve alternate content: This is useful when you want a bot to be able to parse your site but want to reduce load on your real-time backend by returning cached content. For example, many travel retail sites are constantly scraped or scanned by online travel agents (OTAs) that aggregate this data for customer comparisons. That scraping has to happen for the retailer to show up in OTA searches, but multiple backend pricing requests in a short period of time do not; those requests can be answered from cache, saving load on the backend.
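A minimal sketch of that idea follows, assuming a hypothetical fetch_live_pricing() backend call and a five-minute tolerance for stale prices on aggregator requests; human shoppers still get real-time prices.

```python
import time

PRICE_CACHE = {}            # query key -> (timestamp, payload)
CACHE_TTL_SECONDS = 300     # aggregator bots can tolerate 5-minute-old prices

def get_pricing(query_key: str, is_known_aggregator_bot: bool) -> dict:
    """Return cached pricing to aggregator bots; go to origin for humans.

    Repeated OTA/aggregator requests within the TTL are answered from
    cache, saving load on the real-time pricing backend.
    """
    now = time.time()
    if is_known_aggregator_bot and query_key in PRICE_CACHE:
        cached_at, payload = PRICE_CACHE[query_key]
        if now - cached_at < CACHE_TTL_SECONDS:
            return payload
    payload = fetch_live_pricing(query_key)   # hypothetical backend call
    PRICE_CACHE[query_key] = (now, payload)
    return payload

def fetch_live_pricing(query_key: str) -> dict:
    # Placeholder for the real-time pricing backend.
    return {"query": query_key, "price": 199.00}
```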



Use an alternate origin: This is important for a few reasons. First, it allows you to send unknown bots to a different server to reduce load on your primary site, or to an API where you would prefer those requests go, which ultimately allows for better offload. Furthermore, much like the alternate-content example, the parties operating bots often will not ask you whether and how they can use your data. That makes it important to offer an optimized API: steering bot traffic directly to the API gives bots a sanctioned way to gather your site’s information and pushes them in the right direction.
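The sketch below shows that routing decision in miniature. The origin hostnames and classification labels are hypothetical, and in practice this logic would live in an edge or reverse-proxy configuration rather than application code.

```python
PRIMARY_ORIGIN = "https://www.example-retailer.com"      # human shoppers
API_ORIGIN     = "https://api.example-retailer.com/v1"   # structured data for partner bots
SANDBOX_ORIGIN = "https://bots.example-retailer.com"     # unknown/unverified bots

def choose_origin(client_class: str) -> str:
    """Pick the upstream origin for a request based on bot classification.

    Humans and verified search-engine bots go to the primary site, known
    partner bots are pushed toward the optimized API, and unclassified
    bots are offloaded to a sandbox origin so they cannot degrade the
    primary site's performance.
    """
    if client_class in ("human", "verified_search_bot"):
        return PRIMARY_ORIGIN
    if client_class == "known_partner_bot":
        return API_ORIGIN
    return SANDBOX_ORIGIN

# Example routing decisions:
# choose_origin("human")             -> primary site
# choose_origin("known_partner_bot") -> API origin
# choose_origin("unknown_bot")       -> sandbox origin
```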



Remember that not all bots are bad bots. Many of them are helping your customers find your products in brand searches or price comparisons. By understanding how and why bots are accessing your website and managing them to benefit your site and your visitors, you can effectively increase customer satisfaction and sales.







Jason Miller is chief commerce strategist at Akamai Technologies.

