15.07.2023 07:40

Is this the future of matching clothes? Google says it is

Google, which leans on generative artificial intelligence wherever it can, is rolling out a new shopping feature that shows clothes on real-life models.

Google's virtual try-on tool, part of a broad set of Google Shopping updates arriving in the coming weeks, takes an image of a garment and tries to predict how it will drape, fold, cling, stretch and form wrinkles and shadows on a set of real human models in various poses.

Virtual try-on is powered by a new diffusion-based AI model that Google developed in-house. Diffusion models, a family that includes the best-known text-to-image generators Stable Diffusion and DALL-E 2, learn to gradually remove noise from an initial image that consists entirely of noise, moving it step by step closer to the target image.
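To make the "gradually remove noise" idea concrete, here is a toy sketch of the reverse-diffusion loop in Python. It is purely illustrative and in no way Google's model: the learned denoiser is replaced by a placeholder that simply damps the noise, so only the iterative structure (start from pure noise, nudge toward a prediction of the clean image at each step) reflects how diffusion models actually work.

```python
import numpy as np

def denoise_step(noisy_image, step, total_steps):
    """One reverse-diffusion step: blend the current image toward a
    prediction of the clean image. In a real diffusion model this
    prediction comes from a trained neural network; here it is a
    stand-in that just shrinks the noise."""
    predicted_clean = noisy_image * 0.9        # placeholder for a learned denoiser
    blend = step / total_steps                 # later steps trust the prediction more
    return (1 - blend) * noisy_image + blend * predicted_clean

def generate(shape=(8, 8), total_steps=50, seed=0):
    """Run the full reverse process: pure noise -> progressively denoised image."""
    rng = np.random.default_rng(seed)
    image = rng.normal(size=shape)             # start from an image of pure noise
    for step in range(1, total_steps + 1):
        image = denoise_step(image, step, total_steps)
    return image
```

The key point the sketch shows is that generation is iterative: each pass removes a little noise, so after many steps the initial static has been pulled toward a coherent result.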

Google trained the model on many image pairs, each showing a person wearing the same garment in two different poses, for example one image of someone in a shirt photographed from the side and another of the same person photographed from the front. To make the model more robust (i.e. resistant to visual errors such as wrinkles that look deformed and unnatural), the process was repeated with random pairs of garment and person images.
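The training-data procedure described above can be sketched as a simple pairing routine. Everything here is hypothetical: the data layout, function name and file names are invented for illustration, since the article does not describe Google's actual data format.

```python
import random

def make_training_pairs(garment_photos, person_photos, n_random=100, seed=0):
    """Build (image_a, image_b) training pairs.

    `garment_photos` maps a garment ID to two poses of the same person
    wearing it, e.g. {"shirt_01": ("side.jpg", "front.jpg")}, and
    `person_photos` is a flat list of unrelated person images.
    Both structures are illustrative assumptions.
    """
    pairs = []
    # Matched pairs: the same person wearing the same garment in two poses.
    for side, front in garment_photos.values():
        pairs.append((side, front))
    # Random garment/person pairings, added to make the model robust
    # to artifacts such as deformed-looking wrinkles.
    rng = random.Random(seed)
    for _ in range(n_random):
        garment_id = rng.choice(list(garment_photos))
        person = rng.choice(person_photos)
        pairs.append((garment_photos[garment_id][0], person))
    return pairs
```

The mix of matched and random pairs mirrors the article's description: matched pairs teach the model how one garment looks across poses, while random pairings expose it to mismatches it must handle gracefully.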

Starting this month, US shoppers using Google Shopping can virtually try on women's tops from brands such as Anthropologie, Everlane, H&M and LOFT; the new "Try On" feature is available in Google Search. Men's tops will follow by the end of the year.

"When you try on clothes in a store, you can immediately find out if they are right for you," he Lilian Rincon, senior director of consumer shopping products at Google, wrote in a blog post. It cites research showing that 42 % of online shoppers believe that models in online stores do not represent a realistic image, while 59 % of them feel dissatisfied with a product they bought online because it looked different on them. more than expected.

Virtually trying on clothes is not new. Amazon and Adobe have been experimenting with generative clothing modeling for some time, as has Walmart, which since last year has offered an online feature that uses customers' photos to model clothing.

Google has tested virtual try-on before, partnering with L'Oréal, Estée Lauder, MAC Cosmetics, Black Opal and Charlotte Tilbury to let users preview makeup shades on models with different skin tones. More broadly, generative artificial intelligence is pushing deeper into the fashion industry, and it has met with opposition from models who say it amplifies inequalities that have long existed in the industry.

In the blog post, Rincon pointed out that Google chose to use real models, and a diverse selection at that, spanning sizes from XXS to 4XL and representing different ethnicities, skin tones, body shapes and hair types. She did not, however, answer whether the new try-on feature will mean fewer photo shoots for models in the future. Alongside the virtual try-on feature, Google is also introducing filtering options for clothing searches. Yes, you guessed it: these too are powered by AI and visual matching algorithms.

