Google continues to push the boundaries of generative AI with innovative updates to its shopping and image recognition features. These advancements aim to enhance the online shopping experience and provide users with valuable information about their skin conditions.
Let’s delve into the details and explore how these technologies are revolutionizing the way we shop and care for our skin.
A groundbreaking virtual try-on tool for apparel has been introduced as part of the upcoming Google Shopping updates. Leveraging a diffusion-based model developed by Google, this tool predicts how clothes would look on real fashion models.
By simulating various poses, the AI model accurately captures how the clothing drapes, folds, stretches, and forms wrinkles and shadows.
Google trained the model using a vast dataset consisting of paired images. Each pair contained a person wearing a garment in two distinct poses. Additionally, to enhance the model’s robustness, random pairs of photos featuring different garments and people were used.
This comprehensive training approach ensures accurate representations of the clothing on diverse models.
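The pairing scheme described above — each garment photographed on the same person in two poses, supplemented with random cross-pairings of different garments and people — can be sketched in a few lines. This is purely illustrative (the function and data layout are invented for this example; Google's actual training pipeline is not public):

```python
import random

def build_training_pairs(garment_photos, rng=random.Random(0)):
    """Assemble illustrative training examples.

    garment_photos: dict mapping a garment id to a tuple of two images
    of the same person wearing that garment in two distinct poses.
    Each garment yields one true pair plus one random cross-pairing
    (a different garment/person) for robustness. All names here are
    placeholders, not Google's actual API.
    """
    pairs = []
    ids = list(garment_photos)
    for gid in ids:
        pose_a, pose_b = garment_photos[gid]
        # True pair: same person, same garment, two poses.
        pairs.append({"source": pose_a, "target": pose_b, "paired": True})
        # Random pair: source matched with a different garment's photo.
        other = rng.choice([g for g in ids if g != gid])
        pairs.append({"source": pose_a,
                      "target": garment_photos[other][1],
                      "paired": False})
    return pairs

photos = {"shirt": ("shirt_pose_a.jpg", "shirt_pose_b.jpg"),
          "dress": ("dress_pose_a.jpg", "dress_pose_b.jpg")}
examples = build_training_pairs(photos)
```

Mixing true pairs with mismatched ones is a common way to keep a generative model from overfitting to exact garment–person combinations.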
Starting today, U.S. shoppers using Google Shopping can virtually try on women’s tops from popular brands like Anthropologie, Everlane, H&M, and LOFT.
To access this feature, simply look for the new “Try On” badge on Google Search. Men’s tops will also be available later this year.
Lilian Rincon, senior director of consumer shopping products at Google, highlighted the importance of building shopper confidence. When trying on clothes in physical stores, we can immediately assess if they suit us.
However, a survey revealed that 42% of online shoppers do not feel represented by model images, and 59% express dissatisfaction when the received item looks different from their expectations.
To address these concerns, Google’s virtual try-on technology aims to narrow the gap between online and in-store shopping experiences. While Amazon, Adobe, and Walmart have also experimented with generative apparel modelling, Google emphasizes the use of real models to ensure diverse representation.

Google chose models of various ethnicities, skin tones, body shapes, and hair types, in sizes XXS through 4XL. However, concerns remain about the potential impact on traditional photo shoots and opportunities for working models.
Coinciding with the virtual try-on rollout, Google introduces advanced filtering options powered by AI and visual matching algorithms. These filters, available within product listings on Google Shopping, enable users to narrow their searches based on colour, style, and pattern.
Similar to how store associates assist customers in physical stores, these filters provide an extra hand when shopping for clothes online.
They suggest and find alternative options based on the items users have already tried on, enhancing the personalized shopping experience.
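At its simplest, attribute filtering like this amounts to matching listings against the shopper's chosen criteria. The sketch below is illustrative only — the product fields and function are invented here, and Google's real filters add AI-driven visual matching on top of plain attribute matching:

```python
def filter_products(products, **criteria):
    """Return listings whose attributes match every given criterion.

    `products` is a list of dicts with keys like "colour", "style",
    and "pattern" -- a stand-in for the richer attributes Google's
    filters surface. Purely illustrative.
    """
    return [p for p in products
            if all(p.get(key) == value for key, value in criteria.items())]

catalogue = [
    {"name": "linen top", "colour": "blue", "pattern": "solid"},
    {"name": "floral blouse", "colour": "blue", "pattern": "floral"},
    {"name": "striped tee", "colour": "white", "pattern": "striped"},
]

blue_solids = filter_products(catalogue, colour="blue", pattern="solid")
```

Here `filter_products(catalogue, colour="blue")` would return both blue items, while adding `pattern="solid"` narrows the result to the linen top.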
Expanding the capabilities of Google Lens, the computer vision-powered app, Google now enables users to search for skin conditions similar to those they might observe on their own skin.
By uploading a photo through Lens, the app initiates a search for visual matches. This feature proves valuable when individuals encounter skin issues like moles, rashes, or other conditions that are difficult to describe accurately in words.
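Visual-match search of this kind is commonly built on image embeddings: the query photo is encoded into a vector and compared against a reference set by similarity. The sketch below shows that generic approach with cosine similarity — Lens's actual pipeline is not public, and the names and vectors here are invented for illustration:

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine of the angle between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def visual_matches(query_embedding, reference_embeddings, top_k=3):
    """Rank reference images by cosine similarity to the query.

    reference_embeddings: dict mapping an image label to its embedding.
    Illustrative only -- a real system would use a learned image encoder
    and an approximate nearest-neighbour index at scale.
    """
    scored = [(label, cosine_similarity(query_embedding, emb))
              for label, emb in reference_embeddings.items()]
    scored.sort(key=lambda item: item[1], reverse=True)
    return scored[:top_k]

# Toy 2-D embeddings standing in for encoder outputs.
query = np.array([1.0, 0.0])
references = {
    "mole_example": np.array([0.9, 0.1]),
    "rash_example": np.array([0.1, 0.9]),
    "clear_skin": np.array([0.0, 1.0]),
}
matches = visual_matches(query, references, top_k=2)
```

In practice the embeddings would come from a trained vision model rather than hand-written vectors, but the ranking step works the same way.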
While this feature is less ambitious than the AI-driven dermatology tool Google previewed in 2021, which assessed skin, hair, and nail conditions, it gives users useful information for deciding whether to seek medical attention or opt for over-the-counter treatments.
As previously announced, Google Lens integrates with Bard, Google’s AI-powered chatbot experience. Users can now include images in their Bard prompts, and Lens works behind the scenes to assist Bard in understanding the images.
For example, when shown a photo of a pair of shoes and asked to identify them, Bard, informed by Lens’ analysis, delivers an accurate response.
Google continues to invest in generative AI technologies to enhance Bard’s capabilities. Recent updates include Bard’s ability to write, execute, and test its own code, which improves its problem-solving skills, and a partnership with Adobe to incorporate art generation into Bard.
Google’s latest advancements in generative AI are reshaping the online shopping landscape and providing valuable insights into skin care.
The virtual try-on feature empowers users to make confident fashion choices, while enhanced filtering options and the integration of Lens with Bard elevate the shopping and chatbot experiences.
Additionally, Lens’ ability to recognize skin conditions enables users to gather information and make informed decisions about their health. Google’s commitment to leveraging AI technologies underscores its dedication to enhancing user experiences and addressing evolving needs in the digital realm.
Ready to explore the exciting world of tech news? Join the global community of Hayvine readers. Don’t miss out on the latest updates! Start engaging with Hayvine today and open the door to the latest news every day!