Amazon aids blind customers with Alexa-enabled devices
Amazon is using computer vision and voice interaction to identify products for visually impaired customers.
A new feature of the Echo Show smart display, called Show and Tell, enables blind and low-vision customers to hold up an item to the device’s camera and ask, “Alexa, what am I holding?” The Alexa voice assistant then identifies the item using computer vision and machine learning technologies for object recognition.
“The whole idea for Show and Tell came about from feedback from blind and low vision customers,” said Sarah Caplener, head of Amazon’s Alexa for Everyone team. “We heard that product identification can be a challenge and something customers wanted Alexa’s help with. Whether a customer is sorting through a bag of groceries, or trying to determine what item was left out on the counter, we want to make those moments simpler by helping identify these items and giving customers the information they need in that moment.”
From early research through product development and testing, Caplener’s team collaborated with the Vista Center for the Blind and Visually Impaired in Santa Cruz, California. Blind and low-vision Amazon customers and employees participated in user studies, providing feedback to the Alexa for Everyone team.
Show and Tell is now available to Alexa customers in the U.S. on first- and second-generation Echo Show devices. Customers can say, “Alexa, what am I holding?” to get started.