Three QUT researchers are part of an international research team that has identified new opportunities for retailers to use artificial intelligence alongside in-store cameras to understand consumer behavior and customize store layouts to maximize sales.
In a study published in Artificial Intelligence Review, the team proposes an AI-assisted store layout design framework that lets retailers make the most of recent advances in AI and its sub-fields of computer vision and deep learning to monitor their customers’ physical shopping behavior.
Any shopper who has pulled milk from the back corner of a store knows that an efficient store layout showcases its wares to draw customers’ attention to items they didn’t mean to buy, increase browsing time, and group related or viable alternative products together.
A well thought-out layout has been shown to correlate positively with higher sales and customer satisfaction. It is one of the most effective in-store marketing tactics that can directly influence customer decisions to increase profitability.
QUT researchers Dr. Kien Nguyen and Professor Clinton Fookes from the School of Electrical Engineering & Robotics, and Professor Brett Martin from the QUT Business School, teamed up with Dr. Minh Le from the University of Economics, Ho Chi Minh City, Vietnam, and Professor Ibrahim Cil from Sakarya University, Serdivan, Turkey, to conduct a comprehensive review of existing approaches to store layout design.
Dr. Nguyen says improving supermarket layout design, through understanding and prediction, is a key tactic to improve customer satisfaction and increase sales.
“Most importantly, this paper proposes a comprehensive and novel framework to apply new AI techniques on top of existing CCTV camera data to interpret and better understand customers and their in-store behavior,” said Dr. Nguyen.
“CCTV provides insights into how shoppers travel through the store, which routes they take and which sections they spend more time in. This study proposes to go further, observing how people express emotions through observable facial expressions, such as raising an eyebrow, widening their eyes or smiling.”
Understanding customer emotions while browsing could provide marketers and managers with a valuable tool to understand customer reactions to the products they sell.
“Emotion detection algorithms work by using computer vision techniques to locate the face and identify key landmarks on the face, such as the corners of the eyebrows, the tip of the nose, and the corners of the mouth,” said Dr. Nguyen.
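Once a detector of the kind Dr. Nguyen describes has located the face and its key landmarks, an expression label can be inferred from the landmark geometry. As a minimal illustrative sketch (the landmark names, coordinates and thresholds below are invented for the example, not taken from the study), a heuristic classifier might look like this:

```python
def classify_expression(lm):
    """Heuristic expression classifier over 2D facial landmarks.

    `lm` maps landmark names to (x, y) pixel coordinates, with y
    increasing downward. Names and thresholds are illustrative only.
    """
    # Smile: both mouth corners sit above (smaller y than) the mouth centre.
    smile = (lm["mouth_left"][1] < lm["mouth_center"][1]
             and lm["mouth_right"][1] < lm["mouth_center"][1])
    # Raised eyebrow: brow-to-eye gap is large relative to face height.
    face_height = lm["chin"][1] - lm["brow_left"][1]
    brow_gap = lm["eye_left"][1] - lm["brow_left"][1]
    if smile:
        return "smiling"
    if brow_gap > 0.15 * face_height:
        return "raised eyebrow"
    return "neutral"
```

In practice the landmarks would come from a trained computer-vision model rather than being hand-set, and production systems use learned classifiers over many more landmarks than this toy rule.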
“Understanding customer behavior is the ultimate goal for business intelligence. Obvious actions such as picking up products, adding products to carts, and returning products to shelves have attracted a great deal of interest from smart retailers.
“Other behaviors like staring at a product and reading a product’s packaging are a marketing gold mine for understanding customer interest in a product,” said Dr. Nguyen.
In addition to understanding emotions through facial expressions and customer characterization, layout managers could use heat map analysis, human trajectory tracking, and customer action detection techniques to make their decisions. This type of knowledge can be assessed directly from the video and can be helpful in understanding customer behavior at the store level without the need to know about individual identities.
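A heat-map analysis of the kind mentioned above can be built directly from anonymous position tracks: simply count how often any shopper is observed in each cell of the shop floor, with no identities involved. A minimal sketch, where the grid size and data layout are assumptions for illustration:

```python
from collections import Counter

def dwell_heatmap(tracks, cell=1.0):
    """Aggregate anonymous trajectories into per-cell dwell counts.

    `tracks` is a list of trajectories; each trajectory is a list of
    (x, y) floor positions (e.g. one sample per second from a tracker).
    Only positions are counted, so no individual identities are needed.
    """
    heat = Counter()
    for track in tracks:
        for x, y in track:
            heat[(int(x // cell), int(y // cell))] += 1
    return heat
```

The busiest cell is then `max(heat, key=heat.get)`, pointing layout managers at high-dwell zones without revealing who was standing there.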
Professor Clinton Fookes said the team proposed the Sense-Think-Act-Learn (STAL) framework for retailers.
“First, ‘Sense’ consists of collecting raw data, such as video footage from a store’s CCTV cameras, for processing and analysis. Business leaders routinely do this with their own eyes; however, new approaches allow us to automate this aspect of capture and to do it across the business,” said Professor Fookes.
“Second, ‘Think’ consists of processing the data collected through advanced AI, data analysis and deep machine learning techniques, just as humans use their brains to process the incoming data.
“Third, ‘Act’ aims to use the knowledge and insights gained from the second phase to improve and optimize the supermarket layout. The process works as a continuous learning cycle.
“One benefit of this framework is that it allows retailers to evaluate predictions about store design, such as traffic flow and behavior when customers enter a store, or the popularity of in-store displays placed in different areas of the store,” said Professor Fookes.
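The Sense-Think-Act-Learn cycle Professor Fookes describes can be read as a plain feedback loop. The sketch below shows only the wiring between the four stages; the stage implementations (detectors, analytics, layout decisions) are placeholders for illustration, not the paper’s method:

```python
def stal_cycle(sense, think, act, learn, state, footage):
    """One pass of a Sense-Think-Act-Learn loop.

    Each stage is a callable supplied by the retailer:
      sense : raw footage -> observations (e.g. CCTV frames -> detections)
      think : observations -> insights (e.g. heat maps, emotion statistics)
      act   : (insights, current layout state) -> layout change
      learn : (state, change) -> next state, feeding the next cycle
    """
    observations = sense(footage)
    insights = think(observations)
    layout_change = act(insights, state)
    return learn(state, layout_change)
```

Running the function repeatedly, with each cycle’s output state fed back in, gives the continuous learning cycle described in the study.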
“Stores like Woolworths and Coles already routinely use AI-powered algorithms to better serve customer interests and desires and provide personalized recommendations, particularly at the point of sale and through loyalty programs. This is simply another use case of AI: providing better data-driven store layouts and designs and a better understanding of customer behavior in physical spaces.”
Dr. Nguyen said data could be filtered and sanitized to improve quality and privacy, and turned into a structured form. Because customer privacy was a key concern, data could be anonymized, for example by examining customers at an aggregated level.
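One simple way to realize the anonymization Dr. Nguyen describes is to strip identifying fields from detection records before they are stored, keeping only the aggregate-friendly attributes the analysis needs. The record layout and field names below are hypothetical:

```python
def anonymize(detections):
    """Drop identifying fields from raw detection records.

    Keeps only fields useful for aggregated analysis (when and where an
    action happened, and what it was); anything else, such as a face
    crop or a person identifier, is discarded before storage.
    """
    SAFE_FIELDS = {"timestamp", "zone", "action"}
    return [{k: v for k, v in record.items() if k in SAFE_FIELDS}
            for record in detections]
```

Aggregated statistics (counts per zone, actions per hour) can then be computed from the sanitized records without any individual ever being identifiable.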
“Because there is an intense flow of data from the CCTV cameras, a cloud-based system can be considered as a suitable approach for supermarket layout analysis in processing and storing video data,” he said.
“The intelligent video analysis layer in the THINK phase plays the key role in interpreting the content of images and videos.”
Dr. Nguyen said layout managers could consider store design variables (e.g. space layout, point-of-purchase displays, product placement, cashier placement), staff (e.g. number, placement) and customers (e.g. crowds, length of visit, impulse purchases, use of furniture, queuing, receptivity to product presentations).