DEEP LEARNING
Typing text into a box remains the staple of how we search websites and browse online stores. Recent advances in image-recognition software, however, are starting to change that. The social network Pinterest and the online footwear retailer Shoes.com are testing deep learning, an artificial-intelligence technique that builds learning machines from five or more layers of artificial neural networks, with the aim of letting people search and browse with images rather than text.
Deep learning has recently enabled software to match humans on some benchmarks for image recognition. This technique is what powers Google’s image search and its photo organization service.
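To make the "five or more layers" idea concrete, here is a minimal sketch of a small deep image classifier in PyTorch. It is not Google's or Pinterest's actual model; the 64x64 input size and the ten output classes are arbitrary choices for illustration.

```python
# A toy "deep" network: several stacked convolutional and fully connected
# layers that map an image to class scores. Purely illustrative.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),   # layer 1
    nn.MaxPool2d(2),
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),  # layer 2
    nn.MaxPool2d(2),
    nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),  # layer 3
    nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(64 * 8 * 8, 128), nn.ReLU(),                   # layer 4
    nn.Linear(128, 10),                                       # layer 5: scores for 10 classes
)

scores = model(torch.randn(1, 3, 64, 64))  # one fake 64x64 RGB image
print(scores.shape)                        # torch.Size([1, 10])
```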
Pinterest’s new visual search tool lets you draw a box around part of an image on the service and then find visually similar items from an index of over a billion images.
For example, drawing around a light fixture seen in a photo of a room will turn up others like it, including close-up images of the same exact model. Some items returned by a visual search come with buy buttons attached — a feature Pinterest introduced this summer. The image search function was rolled out last week to all users of the company’s website and mobile apps. Pinterest’s system also learns to understand images by drawing on the text people attach to photos shared on the service.
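The retrieval step behind a search like this can be sketched roughly as follows: a deep network turns the cropped region into an embedding vector, and the closest vectors in a pre-built index are returned. This is not Pinterest's actual system; `embed_crop` is a hypothetical placeholder for the network, and the random vectors stand in for real image embeddings.

```python
# Illustrative nearest-neighbour retrieval over image embeddings.
import numpy as np

rng = np.random.default_rng(0)
index = rng.normal(size=(100_000, 128))                 # stand-in for a much larger image index
index /= np.linalg.norm(index, axis=1, keepdims=True)   # unit-normalize for cosine similarity

def embed_crop(cropped_image) -> np.ndarray:
    """Placeholder: a real system would run the crop through a deep network."""
    v = rng.normal(size=128)
    return v / np.linalg.norm(v)

def visually_similar(cropped_image, top_k: int = 5) -> np.ndarray:
    query = embed_crop(cropped_image)
    scores = index @ query                     # cosine similarity against every indexed image
    return np.argsort(scores)[::-1][:top_k]    # IDs of the most visually similar images

print(visually_similar(cropped_image=None))
```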
Pinterest Head of Visual Search Kevin Jing says that visual search now has a better chance of becoming indispensable. “Image representation coming from deep learning is much, much more accurate,” he told MIT Technology Review. “Even this year there has been so much improvement.”
IMAGE-PROCESSING TECHNOLOGY
Footwear retailer Shoes.com is taking a different approach to visual search, also powered by deep learning. The company is the first to make use of image-processing technology aimed at retailers developed by AI startup Sentient, which has raised over $143 million from investors.
Initially, Shoes.com is testing Sentient’s technology in the women’s boots section of its Canadian store. Click on the “visual filter” button and you are presented with a grid of 12 images that Sentient’s software thinks represent the most distinct set of styles from the company's catalogue of some 7,000 boots. Choose the one closest to what you’re looking for, and the software will use the visual characteristics of your choice to refresh the grid and show 11 others that are similar to it. Repeating the process makes it possible to home in on a selection of boots with very particular characteristics.
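The browse-by-example loop described above can be sketched in a few lines. This is not Sentient's actual algorithm: the embeddings are simulated with random vectors, and greedy farthest-point sampling is just one assumed heuristic for picking a visually diverse starting grid.

```python
# Illustrative visual-filter loop: seed a diverse grid of 12 boots, then
# rebuild the grid around whichever boot the shopper picks.
import numpy as np

rng = np.random.default_rng(1)
catalogue = rng.normal(size=(7_000, 64))      # one embedding per boot (simulated)

def diverse_seed_grid(embeddings: np.ndarray, k: int = 12) -> list:
    """Greedy farthest-point sampling: pick k items that are far apart visually."""
    chosen = [0]
    dists = np.linalg.norm(embeddings - embeddings[0], axis=1)
    while len(chosen) < k:
        nxt = int(np.argmax(dists))            # item farthest from everything chosen so far
        chosen.append(nxt)
        dists = np.minimum(dists, np.linalg.norm(embeddings - embeddings[nxt], axis=1))
    return chosen

def refine_grid(embeddings: np.ndarray, picked: int, k: int = 12) -> list:
    """Show the picked boot plus the 11 boots closest to it in embedding space."""
    dists = np.linalg.norm(embeddings - embeddings[picked], axis=1)
    return list(np.argsort(dists)[:k])         # the picked item is its own nearest neighbour

grid = diverse_seed_grid(catalogue)            # the initial 12-style grid
grid = refine_grid(catalogue, picked=grid[3])  # shopper clicks one boot; the grid refreshes
```

Repeating the `refine_grid` step is what lets a shopper home in on a very specific style without ever typing a description.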
Sentient’s Chief Technology Officer Nigel Duffy says the new feature shows how software that can understand images makes online shopping more efficient. “This is a category where it’s very hard to describe in words to a search engine what you’re looking for,” he explains. “We can get really granular preferences very quickly.”
Shoes.com CEO Roger Hardy says there is evidence that the visual filter feature increases sales and that he is considering introducing it to other categories of footwear. Sentient’s Duffy adds that he expects to see the underlying technology applied to other products such as jewelry, bags, and other accessories.
As of now, the new visual search tools are not perfect. Only when a lot of people have tried them will it become clear whether image-processing technology has improved enough to significantly change how people interact with online services.
Other companies have previously tried to use image-search technology to make shopping or discovering products more convenient. Last year, Amazon bundled an app that could look up products snapped in a photo with its unsuccessful Fire smartphone. In 2010, Google acquired Like.com, which had launched a shopping comparison site that could find products visually similar to one you picked and even let you highlight important details on an image to guide its selections.
Currently, Google shows visually similar products on its shopping site, but doesn’t allow you to highlight details.