See clothes on real models with AI-powered try-on and new filters — right in Google Search.
One of the biggest challenges of online shopping is the unknown: How will that shirt actually look on you? Will the fabric cling in the wrong places? Is the fit true to size?
With the launch of Google’s Virtual Try-On for Apparel, those questions are now easier to answer. This new feature, powered by generative AI, brings a more realistic, human-centered approach to digital fashion, allowing users to see how clothes fit on a wide variety of real models before buying.
Google’s system takes just a single image of a clothing item and shows how it would appear on real people with different body types. The technology doesn’t use 3D avatars; instead, it renders the fabric in a way that mimics how it would drape, stretch, wrinkle, and fold on real human forms.
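To make the idea concrete, here is a minimal sketch of how a single garment image might be previewed across a catalog of model photos. Everything below is hypothetical: the TryOnModel class, the ModelPhoto fields, and the file paths are illustrative stand-ins, not Google’s actual pipeline or API.

```python
from dataclasses import dataclass
from typing import List

# Illustrative only: names and interfaces below are hypothetical, not Google's API.

@dataclass
class ModelPhoto:
    person_id: str
    size: str            # e.g. "XXS" ... "4XL"
    monk_skin_tone: int  # 1-10 on the Monk Skin Tone Scale

@dataclass
class TryOnResult:
    person_id: str
    rendered_image_path: str

class TryOnModel:
    """Stand-in for an image-conditioned generative try-on model."""

    def render(self, garment_image_path: str, model_photo: ModelPhoto) -> TryOnResult:
        # A real system would condition a generative model on the single garment
        # image and the model photo, synthesizing drape, stretch, wrinkles, and
        # folds. Here we only return a placeholder output path.
        garment_name = garment_image_path.rsplit("/", 1)[-1]
        out_path = f"renders/{model_photo.person_id}_{garment_name}"
        return TryOnResult(person_id=model_photo.person_id, rendered_image_path=out_path)

def preview_on_all_models(garment_image_path: str, models: List[ModelPhoto]) -> List[TryOnResult]:
    """Render one garment image on every model photo in the catalog."""
    engine = TryOnModel()
    return [engine.render(garment_image_path, m) for m in models]

if __name__ == "__main__":
    catalog = [
        ModelPhoto("model_a", size="XS", monk_skin_tone=3),
        ModelPhoto("model_b", size="2XL", monk_skin_tone=8),
    ]
    for result in preview_on_all_models("images/linen_shirt.jpg", catalog):
        print(result.person_id, "->", result.rendered_image_path)
```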
To make it inclusive and representative, Google’s team selected models in sizes ranging from XXS to 4XL, with a broad range of skin tones, ethnicities, body shapes, and hair types. Skin tones were chosen using the Monk Skin Tone Scale, ensuring wide coverage and diversity.
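As a rough illustration of that kind of curation goal, the short sketch below checks whether a hypothetical model catalog spans the stated size range and all ten values of the Monk Skin Tone Scale. The catalog structure is an assumption made for the example, not a description of Google’s process.

```python
from typing import Dict, List, Set, Tuple

# Illustrative only: a quick coverage check over a hypothetical model catalog.
SIZES = ["XXS", "XS", "S", "M", "L", "XL", "2XL", "3XL", "4XL"]
MONK_TONES = set(range(1, 11))  # the Monk Skin Tone Scale has 10 values

def coverage_gaps(models: List[Dict]) -> Tuple[Set[str], Set[int]]:
    """Return the sizes and skin-tone values not yet represented in the catalog."""
    missing_sizes = set(SIZES) - {m["size"] for m in models}
    missing_tones = MONK_TONES - {m["monk_skin_tone"] for m in models}
    return missing_sizes, missing_tones

if __name__ == "__main__":
    catalog = [{"size": "M", "monk_skin_tone": 4}, {"size": "3XL", "monk_skin_tone": 9}]
    print(coverage_gaps(catalog))  # shows which sizes and tones still need models
```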
Shoppers in the U.S. can access this tool directly in Google Search. Simply look for tops from partner brands like Anthropologie, Everlane, H&M, and LOFT, and click the “Try On” badge. You’ll be prompted to view the item on different models, making it easier to picture how it might look on you.
Trying on clothes is only part of the shopping experience. Often, a customer might like a particular item but want it in a different color, at a lower price point, or in an alternative pattern.
Google now helps shoppers refine their search just like a store associate might. With its new guided refinements, you can narrow down your results by color, style, price, or pattern, powered by machine learning and visual matching algorithms.
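As a rough sketch of what guided refinement looks like at the data level, the example below filters a small cross-retailer product list by color, pattern, and price. The Listing fields and the refine function are assumptions made for illustration; they do not represent the Shopping Graph’s real schema or ranking.

```python
from dataclasses import dataclass
from typing import List, Optional

# Illustrative only: a toy refinement filter over a cross-retailer product list.

@dataclass
class Listing:
    title: str
    retailer: str
    color: str
    pattern: str
    price: float

def refine(listings: List[Listing],
           color: Optional[str] = None,
           pattern: Optional[str] = None,
           max_price: Optional[float] = None) -> List[Listing]:
    """Narrow results by color, pattern, and price, keeping all retailers."""
    results = listings
    if color is not None:
        results = [listing for listing in results if listing.color.lower() == color.lower()]
    if pattern is not None:
        results = [listing for listing in results if listing.pattern.lower() == pattern.lower()]
    if max_price is not None:
        results = [listing for listing in results if listing.price <= max_price]
    return results

if __name__ == "__main__":
    catalog = [
        Listing("Ruffle blouse", "Anthropologie", "blue", "floral", 88.0),
        Listing("Boxy tee", "Everlane", "blue", "solid", 30.0),
        Listing("Wrap top", "LOFT", "green", "striped", 45.0),
    ]
    for item in refine(catalog, color="blue", max_price=50):
        print(item.retailer, "-", item.title, f"${item.price:.0f}")
```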
This functionality appears right inside Google’s product listings, and unlike in a store, it’s not limited to one brand. The system searches across retailers, giving you a wider range of choices without switching between apps or websites.
The traditional e-commerce model often shows clothing on just one body type, usually far from the diversity of real-world shoppers, a gap highlighted by Google’s own internal research.
This new try-on experience aims to close that gap by giving people a more accurate, personalized preview, increasing shopping confidence and reducing the chances of disappointment or returns.
It also makes online shopping more accessible to people who may not identify with traditional fashion imagery.
At launch, the try-on tool supports women’s tops, but Google plans to expand it further, including men’s tops later this year. The underlying AI model and the Shopping Graph that power this feature will allow the experience to grow across brands and product types.
As more brands adopt this technology and shoppers become familiar with AI-based fashion tools, it could mark a shift in how we experience online retail: from static product pages to dynamic, body-aware previews.