
Google AI Mode Gets Visual + Conversational Image Search

Introduction to Google’s AI Mode

Google has announced an update to its AI Mode, which now supports visual search. This means that users can use images and natural language together in the same conversation to find what they’re looking for. The update is currently rolling out in English in the U.S. and is expected to make searching for products and information easier and more conversational.

What’s New in AI Mode

The update to AI Mode aims to address the challenge of searching for something that’s hard to describe. With visual search, users can start with a text search or an image, and then refine their results with follow-up questions. For example, if you’re looking for inspiration for a maximalist bedroom, you can start with a search for "maximalist bedroom inspiration" and then ask for "more options with dark tones and bold prints."

Visual Search Gets Conversational

Google’s visual search allows users to search for products and information using images. Each image links to its source, so users can click through to find what they want. The experience is powered by the Shopping Graph, which spans over 50 billion product listings from major retailers and local shops. The Shopping Graph is updated every hour to keep details such as reviews, deals, available colors, and stock status up to date.


Shopping Without Filters

With AI Mode, users can describe products conversationally, rather than using conventional filters for style, size, color, and brand. For example, asking "barrel jeans that aren’t too baggy" will find suitable products, and users can narrow down options further with requests like "show me ankle length." This experience is powered by the Shopping Graph and allows users to find products that match their preferences.
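To make the idea concrete, the sketch below shows how a conversational request could be translated into the structured filters it replaces, with each follow-up narrowing the same filter state. The attribute names and parsing rules are invented purely for illustration; they are not Google’s implementation.

```python
# Hypothetical sketch: map conversational refinements onto the structured
# filters they replace. Attribute names and rules are illustrative only.

def refine(filters: dict, request: str) -> dict:
    """Return a copy of the current filters, narrowed by one conversational turn."""
    updated = dict(filters)
    text = request.lower()
    if "barrel" in text:
        updated["style"] = "barrel"
    if "aren't too baggy" in text or "not too baggy" in text:
        updated["fit"] = "relaxed, not oversized"
    if "ankle length" in text:
        updated["length"] = "ankle"
    return updated

state: dict = {"category": "jeans"}
for turn in ["barrel jeans that aren't too baggy", "show me ankle length"]:
    state = refine(state, turn)
    print(turn, "->", state)
```

Each turn only adds or tightens constraints, which mirrors how the conversational experience narrows results without the user ever opening a filter panel.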

Technical Foundation

AI Mode’s visual abilities are built on Lens and Image Search and draw on Gemini 2.5’s advanced multimodal and language understanding. Google has also introduced a technique called "visual search fan-out," in which it runs several related queries in the background to better grasp what’s in an image and the nuances of the user’s question. On mobile devices, users can search within a specific image and ask conversational follow-ups about what they see.
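For readers curious what a fan-out might look like mechanically, here is a minimal sketch of the pattern: one visual query expanded into several related background queries that run in parallel, with the results merged and de-duplicated. Everything in it, including expand_query, search_images, and the scoring, is a hypothetical placeholder; it illustrates the general pattern, not Google’s implementation.

```python
from concurrent.futures import ThreadPoolExecutor

# Illustrative "fan-out" pattern: expand one visual query into several related
# sub-queries, run them in parallel, and merge the results. None of these
# functions correspond to a real Google API.

def expand_query(user_question: str, image_description: str) -> list[str]:
    """Derive related background queries from the question and the image content."""
    base = f"{image_description} {user_question}"
    return [
        base,
        f"{image_description} similar style",
        f"{user_question} alternatives",
    ]

def search_images(query: str) -> list[dict]:
    """Placeholder for an image/product search backend."""
    return [{"query": query, "url": f"https://example.com/{hash(query) % 1000}", "score": 0.5}]

def fan_out_search(user_question: str, image_description: str) -> list[dict]:
    queries = expand_query(user_question, image_description)
    with ThreadPoolExecutor() as pool:
        batches = list(pool.map(search_images, queries))
    # Merge and de-duplicate by URL, keeping the highest-scoring hit.
    merged: dict[str, dict] = {}
    for batch in batches:
        for hit in batch:
            if hit["url"] not in merged or hit["score"] > merged[hit["url"]]["score"]:
                merged[hit["url"]] = hit
    return sorted(merged.values(), key=lambda h: h["score"], reverse=True)

if __name__ == "__main__":
    for result in fan_out_search("more options with dark tones", "maximalist bedroom"):
        print(result["query"], "->", result["url"])
```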

Additional Context

In a media roundtable, a Google spokesperson explained that when a query includes subjective modifiers, such as "too baggy," the system may use personalization signals to infer what the user likely means and return results that better match that preference. The spokesperson also noted that the system doesn’t explicitly differentiate real photos from AI-generated images for this feature, but ranking may favor results from authoritative sources and other quality signals.

Why This Matters

For SEO and ecommerce teams, images are becoming even more important. As Google gets better at understanding detailed visual cues, high-quality product photos and lifestyle images may boost visibility. And since Google updates the Shopping Graph every hour, keeping product feeds accurate and current matters more than ever. As search continues to become more visual and conversational, many shopping experiences might begin with a simple image or a casual description instead of exact keywords.
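One practical way to keep product details machine-readable alongside fresh imagery is schema.org Product markup. The sketch below builds a JSON-LD block from a catalog record; the shape of the catalog record is an assumption, while the schema.org property names (name, image, description, offers, price, availability) are standard vocabulary.

```python
import json

# Sketch: turn a catalog record into schema.org Product JSON-LD so crawlers can
# read price, availability, and image URLs. The catalog dict shape is a
# placeholder; the schema.org property names are standard vocabulary.

def product_jsonld(item: dict) -> str:
    data = {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": item["title"],
        "image": item["image_urls"],          # high-quality product photos
        "description": item["description"],
        "offers": {
            "@type": "Offer",
            "price": item["price"],
            "priceCurrency": item["currency"],
            "availability": "https://schema.org/InStock"
            if item["in_stock"] else "https://schema.org/OutOfStock",
        },
    }
    return f'<script type="application/ld+json">{json.dumps(data, indent=2)}</script>'

print(product_jsonld({
    "title": "Barrel-leg jeans",
    "image_urls": ["https://example.com/img/front.jpg", "https://example.com/img/back.jpg"],
    "description": "Relaxed barrel-leg jeans, ankle length.",
    "price": "79.00",
    "currency": "USD",
    "in_stock": True,
}))
```

Regenerating this markup whenever the feed changes helps keep the details Google surfaces, such as price and stock status, in step with the hourly Shopping Graph refresh.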

Looking Ahead

The new experience is rolling out this week in English in the U.S., and Google hasn’t shared timing for other languages or regions. As AI Mode continues to evolve, we can expect to see more conversational and visual search experiences that make it easier for users to find what they’re looking for.

Conclusion

Google’s update to AI Mode is a significant step toward making search more conversational and visual. By combining images and natural language in a single conversation, users can find what they need more easily and efficiently. As the technology evolves, we can expect more features that change the way we search and interact with information online.
