
How Does Visual Search Work?

Visual search has been gaining a lot of momentum. Product images are of course provided on websites to assure purchasers that they have selected the right product, but visual search takes it to the next level. Visual search allows users to search with an image rather than with text. Users can select the camera in the search bar of their mobile device or computer, snap a picture, and hit “search.” Away it goes and brings back a list of results, as shown in the figure below. How does it work?

[Figure: a visual search query returning a list of matching results]

Two approaches to visual search

Two basic methods can drive visual search.

One is the use of image metadata. The image is tagged to indicate a category and selected attributes such as color, shape, and an array of other specifications. In this approach, the search function is still using text to return results, because it is matching words.
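To make the idea concrete, here is a minimal sketch of the metadata-driven approach, assuming a small in-memory catalog. The field names and the search_by_metadata helper are illustrative, not any particular vendor's API.

```python
# Sketch of metadata-driven visual search: images are tagged with text
# attributes, and the "visual" query is resolved as an ordinary text match.
# The catalog records and field names below are hypothetical examples.

CATALOG = [
    {"image": "sweater_001.jpg", "category": "sweaters", "color": "yellow", "sleeve": "long"},
    {"image": "sweater_002.jpg", "category": "sweaters", "color": "blue",   "sleeve": "short"},
    {"image": "jacket_014.jpg",  "category": "jackets",  "color": "yellow", "sleeve": "long"},
]

def search_by_metadata(tags: dict) -> list[str]:
    """Return images whose tags match every requested attribute."""
    return [
        item["image"]
        for item in CATALOG
        if all(item.get(field) == value for field, value in tags.items())
    ]

# The query image has already been tagged (by a person or a classifier);
# the search itself is still just matching words.
print(search_by_metadata({"category": "sweaters", "color": "yellow"}))
# -> ['sweater_001.jpg']
```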

The second type of visual search uses reverse image retrieval. Here the image itself is the query. An algorithm identifies similar images based on shape, color, texture, and other features. Some of the patterns an AI detects are characteristics the human eye can also pick up, but others go beyond human visual capability.
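One common way to implement reverse image retrieval is to embed every image with a pretrained network and rank catalog images by cosine similarity to the query embedding. The sketch below uses torchvision's ResNet-50 purely as an illustration of the idea; the catalog and query file paths are placeholders.

```python
# Sketch of reverse image retrieval: embed images with a pretrained CNN and
# rank catalog images by cosine similarity to the query embedding.
# ResNet-50 is just one possible feature extractor; file paths are placeholders.
import torch
from PIL import Image
from torchvision import models

weights = models.ResNet50_Weights.DEFAULT
model = models.resnet50(weights=weights)
model.fc = torch.nn.Identity()   # drop the classifier head, keep the 2048-d features
model.eval()
preprocess = weights.transforms()

def embed(path: str) -> torch.Tensor:
    """Return a unit-length feature vector for one image."""
    img = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        vec = model(img).squeeze(0)
    return vec / vec.norm()

catalog = ["sweater_001.jpg", "sweater_002.jpg", "jacket_014.jpg"]  # placeholder paths
catalog_vecs = torch.stack([embed(p) for p in catalog])

query_vec = embed("query_photo.jpg")
scores = catalog_vecs @ query_vec  # cosine similarity, since vectors are normalized
ranked = sorted(zip(catalog, scores.tolist()), key=lambda pair: -pair[1])
print(ranked[:3])  # most similar catalog images first
```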


Applications for visual search

Visual search made great strides in visual media platforms such as Pinterest and iNaturalist. It then moved to retail, where users can take a picture of an item and the search function finds a similar product, such as a yellow, long-sleeved sweater. Now visual search is moving into industrial applications: Grainger and Home Depot both offer visual search in their apps. The algorithm works very differently for a Home Depot app than for a Pinterest app, however. Different training data is required to make the visual search mechanism functional, and the product selection is much more specific and technical.

Visual search architecture

First, a strong image database is required, with many images of each product shown from different angles and in different applications. A strong product data taxonomy with category-specific attributes is also necessary so that the API can identify the correct category for the image being queried. The category then sources all of the images of the products classified to it. Attributes are managed in the product database and tagged to the image to enable additional functionality and aid the AI. This approach is used in the Home Depot app: the app identifies the material and the category, which allows the scoring to return a more specific, collated search result. The figure below illustrates these steps.

[Figure: steps in a visual search architecture, from image database and taxonomy to scored results]
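As a rough sketch of how the category and attribute tags can drive scoring, the snippet below restricts candidates to the predicted category and then scores them by attribute overlap. The taxonomy, products, and scoring scheme are illustrative assumptions, not the Home Depot implementation.

```python
# Sketch of combining a predicted category with attribute tags to score
# candidate products. Taxonomy, products, and scoring are illustrative
# assumptions, not any specific retailer's implementation.

TAXONOMY = {
    "fasteners > bolts":  {"attributes": ["material", "head_type", "thread_size"]},
    "fasteners > screws": {"attributes": ["material", "head_type", "length"]},
}

PRODUCTS = [
    {"sku": "B-1001", "category": "fasteners > bolts",
     "tags": {"material": "stainless steel", "head_type": "hex", "thread_size": "M8"}},
    {"sku": "S-2040", "category": "fasteners > screws",
     "tags": {"material": "stainless steel", "head_type": "phillips", "length": "25mm"}},
]

def score_candidates(predicted_category: str, predicted_tags: dict) -> list[tuple[str, float]]:
    """Restrict to the predicted category, then score by attribute overlap."""
    relevant_attrs = TAXONOMY[predicted_category]["attributes"]
    scored = []
    for product in PRODUCTS:
        if product["category"] != predicted_category:
            continue  # the category sources only the products classified to it
        hits = sum(
            1 for attr in relevant_attrs
            if product["tags"].get(attr) == predicted_tags.get(attr)
        )
        scored.append((product["sku"], hits / len(relevant_attrs)))
    return sorted(scored, key=lambda pair: -pair[1])

# e.g. the image model predicts "bolt, stainless steel, hex head"
print(score_candidates("fasteners > bolts",
                       {"material": "stainless steel", "head_type": "hex"}))
# B-1001 matches 2 of the 3 bolt attributes and ranks first
```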

Visual search tools

The next step is tool selection. Many different visual search tools are available, and it’s necessary to pick the right fit for each application, whether it be retail, industrial, scientific, or artistic. Finally, the tool needs to be trained. This process uses a combination of the image database and the product database to get started. A human in the loop is necessary to verify that image queries are returning the right results, letting the AI know when it is correct or incorrect so it can learn from that feedback. These steps will lead to a strong visual search.
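Here is a minimal sketch of what the human-in-the-loop step might look like in practice: reviewers confirm or reject individual matches, and the judgments are logged as labeled examples for the next round of training. The file format and field names are assumptions for illustration.

```python
# Sketch of a human-in-the-loop feedback log: reviewers mark whether a
# returned match was correct, and the labels are saved for later retraining.
# The storage format and field names are assumptions for illustration.
import json
from datetime import datetime, timezone

FEEDBACK_FILE = "visual_search_feedback.jsonl"

def record_feedback(query_image: str, returned_sku: str, is_correct: bool) -> None:
    """Append one reviewer judgment to a JSON-lines training log."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "query_image": query_image,
        "returned_sku": returned_sku,
        "label": "correct" if is_correct else "incorrect",
    }
    with open(FEEDBACK_FILE, "a", encoding="utf-8") as fh:
        fh.write(json.dumps(entry) + "\n")

# A reviewer confirms one hit and rejects another; the accumulated labels
# become supervised examples the next time the search model is retrained.
record_feedback("query_photo.jpg", "B-1001", True)
record_feedback("query_photo.jpg", "S-2040", False)
```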


A successful visual search can result in a variety of outcomes. Some consumers simply want to identify an item. If they don’t know what a widget is called, they can take a photo of it and use a visual search app to identify the product or an associated part number. In some cases, success goes a step further, leading to the purchase of an item.
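For identification, the retrieval result only needs to be mapped back to a product record. Below is a small sketch, assuming a ranked list of image matches like the one produced earlier; the lookup table, names, and part numbers are hypothetical.

```python
# Sketch of turning the top image match into an identification result:
# map the best-matching catalog image back to a product name and part number.
# The lookup table is a hypothetical example.

IMAGE_TO_PRODUCT = {
    "sweater_001.jpg": {"name": "Yellow long-sleeved sweater", "part_number": "SW-001"},
    "jacket_014.jpg":  {"name": "Yellow rain jacket",          "part_number": "JK-014"},
}

def identify(ranked_matches: list[tuple[str, float]]) -> dict:
    """Return the product record for the highest-scoring image match."""
    best_image, _score = ranked_matches[0]
    return IMAGE_TO_PRODUCT[best_image]

print(identify([("sweater_001.jpg", 0.91), ("jacket_014.jpg", 0.74)]))
# -> {'name': 'Yellow long-sleeved sweater', 'part_number': 'SW-001'}
```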

Why invest in visual search?

Adding visual search functionality is a great way to grow revenue. According to research from ViSenze, 67% of millennials prefer visual search. They are a large and growing group of consumers, and visual search will only become more of a requirement in the future. It’s important to stay ahead of the curve and get started now.


We can help you get started with your product database and image library taxonomies and metadata. We can also help you select the best visual search tool for your company. Ready to get started? Give us a shout.

 

Chantal Schweizer
Chantal Schweizer is a taxonomy professional with over 10 years of experience in product information and taxonomy. Prior to joining Earley Information Science, Chantal worked on the Product Information team at Grainger for 9 years and on Schneider Electric’s PIM team for 2 years, and previously worked in PIM consulting.
