By scanning the AQR code on the front of pack, shoppers can access the Be My Eyes app, which connects them either to a volunteer for live cooking instructions or to an AI chatbot powered by GPT-4 that can answer recipe and cooking questions.
The FMCG giant will use the AQR codes to provide product information such as usage directions, recycling guidance, ingredients and nutritional values, working with standard smartphone accessibility features through the Be My Eyes app to share information via audio description or in larger text.
This development marks the first integration of Be My Eyes AI technology with a food product, offering an AI-assisted cooking experience at home.
It builds on the addition of on-pack AQR, developed by computer vision specialists Zappar, to Unilever’s Persil and Colman’s products in the UK last year and is part of Unilever’s global connected pack strategy, which includes using new digital experiences and technology to evolve and differentiate the way shoppers interact with and use Unilever’s products.
“We’ve accelerated digitizing our packs to offer new opportunities for brand engagement and elevated shopping experiences, and now we’re also focusing on how we can use digital experiences to make our products more accessible,” says Rachana Dongre, senior digital engagement and strategy lead for Nutrition & Ice Cream at Unilever.
“Zappar’s AQR codes mean we can support blind and low-vision shoppers to have equal access to information, and integrating Be My Eyes into these codes offers a new way to make the full experience of our products more inclusive, from the shopping aisle right through to cooking at home.”
7x normal QR scanning range
The free Be My Eyes app allows blind, low-vision and deafblind users to receive live, accessible information from volunteers, trained customer support representatives and AI.