
Edgecase Wants To Help Online Retailers Build A Shoppers’ Discovery Paradise

By Jennifer Zaino  /  September 29, 2014


Late this summer, adaptive experience company Compare Metrics (see our earlier coverage here) rebranded itself as Edgecase, carrying forward its original vision of creating inspiring online shopping experiences. Edgecase is working on white-label implementations with retail clients such as Crate & Barrel, Wasserstrom, Urban Decay, Golfsmith, Kate Somerville Cosmetics, and Rebecca Minkoff to build a better discovery experience for their customers, generating user-friendly taxonomies from the data they already have but haven’t been able to leverage to maximum shopper advantage.

“No one had thought about reinvigorating navigation or the search experience for 15 years,” says Garrett Eastham, cofounder and CEO. “The interactions driving these conversations today were driven by database engineers a decade ago, but now we are at the point in the evolution of ecommerce to make the web experience evolve to what it is like in the physical world.”

In a physical store, for example, shoppers can immerse themselves in the visual environment around them to really live the “I’ll know it when I see it” experience, while also being able to describe their needs to sales staff in their own words – not by listing data from the proprietary attribute tags used to describe an item. Online, however, their choices are generally limited to keyword search boxes whose results align with those attributes or to browse-path filtered navigation, with little in the way of personalization to help them get to what they really want.

In usability testing with consumers, Edgecase found that consumers rated top retailers’ discovery and navigation experiences just five out of ten, and that close to three-quarters of them said they feared they’d miss out on what they really wanted because their views of products were artificially limited by misinterpreted searches and applied filters that cut out product options they would have considered. They also complained about the endless clicking to get to the next navigation refinement and the long wait times for pages to reload in response. Little wonder that, according to Eastham, “online conversion rates are nowhere near where they need to be. You’re lucky if you get five percent conversion.”

As Eastham describes it, there are very few retailers in the industry who can take big steps on their own to change the status quo of online shopping – as WalMart is doing with its WalmartLabs work, for instance (see here and here for more insight on that). “A lot of others can’t afford that,” he says, and there’s also the risk of trying to personalize the consumer path the wrong way. “We realize people want to have fun, be inspired, feel empowered and take control – the biggest gains come from there. The trick is you need data to connect all that experience.”

The Human-Machine Connection

And, he says, you need a combination of humans and machines to best put that data to use. Retailers’ systems already carry proprietary tags defining the manufacturer attributes of their products, for example, while human curators can step in to review that structured data and drive a lingua franca of shopping – a shared vocabulary with the consumer that connects the existing merchandiser taxonomy with language that makes sense to a good percentage of a retailer’s customer base. In the case of Kate Somerville Cosmetics, for example, manufacturer data attributes such as ounce-size bottle measurements were translated by humans into natural language that shoppers would be more likely to use, such as cosmetics to take while traveling, or travel-size or mini cosmetics.

“So travel-size becomes a shoppable experience that you can go to, but now we also have a connection between travel, cosmetics and quantity, and we can build upon that,” Eastham says.
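
To make that translation concrete, here is a minimal Python sketch of how a curated vocabulary might map a raw manufacturer attribute, such as bottle size in ounces, to shopper-facing terms like “travel-size.” The field name, threshold, and terms are illustrative assumptions, not Edgecase’s actual schema.

```python
# Illustrative sketch only: translate proprietary manufacturer attribute tags
# into curated, shopper-friendly terms. Field names, the 3.4 oz threshold, and
# the vocabulary are assumptions for the example, not Edgecase's actual schema.

CURATED_VOCABULARY = {
    # attribute name -> rule that turns its raw value into shopper-facing terms
    "size_oz": lambda v: ["travel-size", "mini", "cosmetics for traveling"] if v <= 3.4 else [],
}

def shopper_terms(product: dict) -> list[str]:
    """Collect every curated term whose rule matches this product's raw attributes."""
    terms = []
    for attr, translate in CURATED_VOCABULARY.items():
        if attr in product:
            terms.extend(translate(product[attr]))
    return terms

# A 1 oz bottle becomes shoppable under "travel-size" and "mini".
print(shopper_terms({"name": "Hydrating Serum", "size_oz": 1.0}))
```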

Additionally, machines can bring further information on board that can be useful for improving a retailer’s discovery experience by scraping product feeds or content from external sources like reviews. A human curation strategist can review these unstructured data sources and use them to learn new ways of expanding the website taxonomy to suit how customers are talking about the products, indicating how the audience at large may be planning to look for them. “The best way to help structure and scale is to have humans looking at data points and making connections,” Eastham says. Today, the company has 20 part-time human curators creating that information at scale, but it’s aiming to grow that number as it expands its technology to more Internet retailers.
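
A rough illustration of that curator-in-the-loop idea follows, under the assumption that candidate terms are simply frequent words and phrases in review text that the taxonomy does not yet contain; the actual pipeline is not described in that detail.

```python
# Illustrative sketch only: surface candidate taxonomy terms from unstructured
# review text for a human curator to evaluate. The extraction rule (frequent
# words and adjacent word pairs not already in the taxonomy) is an assumption.
from collections import Counter
import re

EXISTING_TAXONOMY = {"travel-size", "mini", "serum"}          # hypothetical
STOPWORDS = {"the", "and", "for", "i", "in", "my", "a", "one", "more", "it", "great"}

def candidate_terms(reviews: list[str], top_n: int = 5) -> list[tuple[str, int]]:
    """Count remaining words and adjacent word pairs across reviews,
    dropping anything the taxonomy already covers."""
    counts = Counter()
    for text in reviews:
        words = [w for w in re.findall(r"[a-z'-]+", text.lower()) if w not in STOPWORDS]
        counts.update(words)
        counts.update(" ".join(pair) for pair in zip(words, words[1:]))
    return [(t, c) for t, c in counts.most_common() if t not in EXISTING_TAXONOMY][:top_n]

reviews = [
    "Perfect carry-on size, great for weekend trips.",
    "I keep one in my gym bag and one more for weekend trips.",
]
print(candidate_terms(reviews))  # a curator decides whether any term joins the taxonomy
```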

“The reality is that on their own, retailers barely have enough time to take manufacturing data basics, tag items and go on to the next one,” says Lisa Roberts, VP of product and marketing. “They don’t have time to watch how people do or don’t navigate the site by attributes tied to that product and optimize for that.”

The beauty, she also says, is that Edgecase gets explicit input from shoppers on how good a job it’s doing at capturing how people describe a product and what about it resonates with what they are looking for. “That’s where machine learning comes back in to naturally optimize what is shown on the page, but it also provides intelligence to the curation team to think about what new words we need to add to this vocabulary.” The discovery of new terms for addition to the taxonomy also can be powerful information to share with other retail business units, such as marketing and paid search management.

Soft-filtering plays into optimizing the experience, too, so that options just outside the edge of what shoppers say they are looking for – say, a scoop neck dress for a user who asked to see round neck styles – are included in the results as well, says Roberts. So does contextual imagery, which lets consumers select the picture that best fits what they mean when they say, for example, that they want a short dress.
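
A minimal sketch of soft-filtering along those lines, assuming a hand-built table of “near-neighbor” attribute values; the neckline similarity data here is invented for illustration.

```python
# Illustrative sketch only: soft filtering keeps near-neighbors of the requested
# attribute value in the result set instead of excluding everything that fails
# an exact match. The neighbor table below is invented for this example.

NECKLINE_NEIGHBORS = {
    "round neck": {"scoop neck", "crew neck"},
    "v-neck": {"wrap"},
}

def soft_filter(products: list[dict], requested_neckline: str) -> list[dict]:
    """Return exact matches first, then styles just outside the requested edge."""
    acceptable = {requested_neckline} | NECKLINE_NEIGHBORS.get(requested_neckline, set())
    matches = [p for p in products if p.get("neckline") in acceptable]
    return sorted(matches, key=lambda p: p["neckline"] != requested_neckline)

dresses = [
    {"name": "A-line dress", "neckline": "scoop neck"},
    {"name": "Shift dress", "neckline": "round neck"},
    {"name": "Halter dress", "neckline": "halter"},
]
print(soft_filter(dresses, "round neck"))  # exact round neck first, scoop neck included too
```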

Indeed, strong and speedy visual navigation becomes an even more important tool in the arsenal on mobile platforms. “You want to semantically express yourself through a selection of images,” she says. “Mobile is a smaller screen and people don’t want to type to search as much. They just want to tap.”

Edgecase touts successes such as Urban Decay seeing up to 16 percent better conversion with Edgecase integration. The company emphasizes that there is no ripping and replacing of the standardized languages retailers already have across products and portfolios to realize such results. “That is one of the secrets,” Eastham says. “We are very aware that having that information is so critical to product discovery.”

About the author

Jennifer Zaino is a New York-based freelance writer specializing in business and technology journalism. She has been an executive editor at leading technology publications, including InformationWeek, where she spearheaded an award-winning news section, and Network Computing, where she helped develop online content strategies including review exclusives and analyst reports. Her freelance credentials include being a regular contributor of original content to The Semantic Web Blog; acting as a contributing writer to RFID Journal; and serving as executive editor at the Smart Architect Smart Enterprise Exchange group. Her work also has appeared in publications and on web sites including EdTech (K-12 and Higher Ed), Ingram Micro Channel Advisor, The CMO Site, and Federal Computer Week.
