Google has introduced two new AI-powered ad campaign options, along with a new AR Try-On process that uses AI to create more realistic visualisations of how items would look on you, based on your body type.
To start with, let’s talk about the new ad types: Google has introduced Demand Gen and Video View campaign options, which will give you more ways to use AI in the design and targeting process.
By converting their video and image assets into various Google ad formats, Demand Gen campaigns give businesses the opportunity to maximise their reach and appeal to specific user demographics.
With Demand Gen, your top-performing video and image assets are integrated into Google’s most visually appealing, entertainment-focused touchpoints, including YouTube, YouTube Shorts, Discover, and Gmail, which are used by more than 3 billion people each month as they stream, scroll, and connect.
Using the Demand Gen approach, you can manage your campaigns across all of these surfaces, and reuse your creative content in each of them.
You can then target these campaigns using lookalike segments based on your audience lists, or via Google’s AI.
The process should help you maximise the effectiveness of your creative assets, while also optimising targeting for the best response.
Google is also introducing new Video View campaigns, which will allow businesses to drive views across YouTube Shorts, in-feed videos, and in-stream videos within a single campaign.
Early testing shows that Video View campaigns averaged 40% more views than in-stream skippable cost-per-view campaigns.
Essentially, Google’s expanding AI components enable greater targeting and reach with less manual work, so it could be worth experimenting with these new options to see what kind of results you get.
On a different front, Google is aiming to enhance the online shopping experience with a new generative AI approach to virtual try-on, using digital models that more accurately represent your body type.
As per Google:
“Virtual try-ons for clothing allow you to see how clothing appears on a range of real models. Here’s how it works: using just one image of a garment, our new generative AI model can precisely simulate how a piece of clothing would drape, fold, cling, stretch, create wrinkles, and cast shadows on a variety of real models in various poses. We chose individuals in sizes XXS to 4XL to represent various body shapes, ethnicities, skin tones, and hair types, using the Monk Skin Tone Scale as a guide.”
This should make it easier to get a sense of how apparel will actually look on you specifically, which could be a significant step towards expanding AR Try-On options and enhancing the online shopping experience.
Although it’s a fascinating addition, I’m not sure it can compete with Snapchat’s Clothing Try-On AR capabilities, which are increasingly being used in in-store display surfaces and are now part of the company’s ARES product offering, and which use your actual body, rather than a simulated variant overlaid on a digital model. Even so, Google’s option does add more qualification to the purchase process, which could be useful in the consideration stage.
Starting today, US shoppers will be able to virtually try on women’s clothing from a range of brands, including Anthropologie, Everlane, H&M, and LOFT. Simply tap items with the “Try On” icon in Search and choose the model that best resembles you.
According to Google, this will eventually be extended to more countries, items, and models.
In addition, Google has added new ‘guided refinements’ to product searches, allowing you to narrow product queries with more specific criteria.
With these new filters, you’ll be able to refine your results by price, colour, pattern options, and more.
The matches are based on machine learning and new visual matching algorithms, marking another way that Google is looking to incorporate emerging AI options into its Search capabilities, not to replace conventional search, but to enhance its utility.
This is an important distinction for Google to emphasise, since much of the current industry discussion centres on how generative AI could eventually replace Google Search, by enabling more semantic search queries and offering more human-like responses via conversational AI. It’s too early to gauge the full impact on this front, but there is a case to be made that streamlined AI interactions will displace at least some Google search traffic, which could be a major problem for Google if it falls behind.
For this reason, Google is building its own AI models and releasing its own generative AI features.
As search behaviour continues to shift, updates like these are another step in that evolution.
How do I access Google AI?
Open the Search Labs website, sign in with your Google Account, select “Get started”, join the waitlist, and then switch on the “SGE” toggle to activate generative AI in Search. This will give you early access to Google’s new generative AI capabilities for Search.
Can AI replace Google?
It’s unlikely that anything will completely replace Google any time soon. Google’s search engine uses sophisticated algorithms to index and rank billions of web pages, and to rapidly deliver the most relevant results to users, a capability that remains difficult to match.