Many online shopping experiences end in disappointment. In 2020, $102 billion worth of products purchased online were returned to retailers — about 18% of all online sales. It’s a problem that has long dogged online retail: how can shoppers visualize themselves in clothes, shoes, or cosmetics they’ve never tried on?
A solution is at hand in the form of Augmented Reality (AR), which enables shoppers to bring store experiences home in the form of virtual dressing rooms.
With AR, consumers can “try on” a product by capturing a photo with their mobile device and seeing the product realistically overlaid on their image. This helps them see themselves in clothing, footwear, eyewear, and makeup they’ve never tried on, touched, or felt.
AR is making the experience of browsing for and virtually trying on items more engaging and realistic. Research suggests that AR-enhanced try-on experiences lead to a 94% higher rate of converting browsers to buyers.
Retailers leading the way with AR
Customers who assume one brand’s sneakers will fit the same as the ones in their closet are often let down when they receive their delivery.
Nike has a competitive edge online with its highly advanced measurement systems, which compare a customer’s foot measurements with those of other Nike customers with similarly sized feet to ensure optimal comfort and fit.
Retail giant Walmart also sees the potential in AR, having acquired the virtual clothing try-on startup Zeekit in 2021. Today, Walmart’s mobile app leverages computer vision and AR tech to show people of all body sizes how they will look in that item they’ve been eyeing. From Zeekit’s website:
“With Zeekit’s Fitting Room, everyone can collect products and either try them on a model that looks like them, or even on themselves, just like in a brick-and-mortar store.”
Whether you’re shopping for shoes, clothes, makeup, eyewear, or even jewelry, AR provides an avenue for shoppers to find their “perfect fit.” But virtual try-on technology isn’t just about providing exceptional customer service — it’s about increasing confidence among buyers by giving them a truer sense of the size, scale, and feel of your product.
66% of shoppers say that AR gives them more confidence about the quality of the product they are browsing. (NielsenIQ)
Data labeling challenges for virtual try-on AR applications
Virtual dressing rooms and enriched e-commerce buying experiences bring shoppers together with virtual apparel. Yet many retailers have struggled to provide dynamic, responsive experiences that let shoppers visualize themselves moving naturally with a virtual product, without choppiness or missing pixels. No customer wants to see their arm or leg go missing on video because of inferior AR technology… even if it is just in a virtual fitting room.
There are many variables in a shopper’s environment that are out of the merchant’s control:
- A variety of camera angles, photo quality, and lighting conditions
- A near-infinite range of body shapes and sizes
- Potential occlusions and other edge cases that are difficult to predict
Combine these variables with a vast catalog of products, many available in multiple sizes, quantities, or colors. Maintaining realistic visualizations of every product, at every angle, and serving them up responsively takes next-level AI functionality.
Put another way: for AR virtual try-on to behave predictably, retailers need uncompromisingly high-quality data to fuel their algorithms.
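To make “high-quality data” concrete: one common way teams gauge segmentation label quality is to measure how closely two masks of the same image agree, using intersection-over-union (IoU). The sketch below is a hypothetical illustration of that check, not any specific vendor’s pipeline:

```python
import numpy as np

def mask_iou(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    """Intersection-over-union between two boolean segmentation masks."""
    intersection = np.logical_and(mask_a, mask_b).sum()
    union = np.logical_or(mask_a, mask_b).sum()
    return float(intersection) / float(union) if union else 1.0

# Two labels of the same 4x4 frame, disagreeing on a single pixel.
a = np.zeros((4, 4), dtype=bool)
a[1:3, 1:3] = True          # 4 labeled pixels
b = a.copy()
b[3, 3] = True              # 1 extra pixel labeled

print(round(mask_iou(a, b), 3))  # 4 shared / 5 total = 0.8
```

Even a one-pixel disagreement on a tiny 4×4 mask drops agreement to 0.8, which is why annotation workflows for try-on applications typically enforce strict IoU thresholds.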
Taking the guesswork out of online shopping
Forward-thinking companies such as Volumental take personalized online shopping experiences one step further, producing shoe recommendations for millions of shoppers by leveraging a combination of 3D foot scans, retail purchase data, and AI.
In order to push the limits of what it means to provide a hyper-personalized online shopping experience with their AR-powered mobile app, Volumental knew they would need pixel-perfect labeled data:
“For our mobile app, we needed extremely precise segmented data because we knew that every missed pixel would easily add up to millimeters of lost accuracy.”
– Mikael Andersson, Sr Product Owner at Volumental
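The “every missed pixel adds millimeters” point is easy to sanity-check with back-of-the-envelope arithmetic. Assuming, purely for illustration, a foot about 260 mm long spanning roughly 600 pixels in a phone photo, a few pixels of boundary error translate directly into millimeters of measurement error:

```python
# Hypothetical numbers: a 260 mm foot imaged across ~600 pixels.
foot_length_mm = 260.0
foot_length_px = 600.0

mm_per_px = foot_length_mm / foot_length_px   # ~0.43 mm per pixel
# A segmentation boundary off by 3 px at each end of the foot:
error_mm = 2 * 3 * mm_per_px                  # ~2.6 mm of length error

print(f"{mm_per_px:.2f} mm/px, {error_mm:.1f} mm total error")
```

A couple of millimeters is enough to shift a recommended shoe size, which is why pixel-level labeling precision matters for fit prediction.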
Volumental collaborated directly with Sama annotators on the ML-powered Sama platform to refine their computer vision models early with tight feedback loops, with annotators proactively raising and resolving edge cases.
Crucially, partnering with Sama assured Volumental that the diversity of their datasets would be accurately represented in their models, enabling them to deliver hyper-personalized experiences that delight users across the globe.