Mirror Review
June 19, 2025
Summary:
- Google has introduced “Search Live,” a new function in its app allowing users to have fluid, back-and-forth voice conversations with AI Mode.
- Initially rolling out in the U.S. for those in the AI Mode experiment, it delivers audible AI responses and displays on-screen links for further exploration.
- It is specifically designed for multitasking, with the ability to operate in the background as you use other applications on your phone.
A New Way to Search
For decades, the search box has been our primary gateway to the world’s information.
But what if you could simply talk to it instead?
Google is now making that a reality, raising a big question: Are we witnessing the beginning of the end for typed search queries?
On June 18, 2025, the company announced Search Live, a groundbreaking voice experience within its AI Mode.
This new feature, available in the U.S. for Android and iOS users who have opted into the AI Mode experiment, lets you have a continuous, conversational search experience, ideal for when you’re multitasking or on the move.
But How Does Google's Real-Time Voice Search Work?
The experience is designed to feel less like using a tool and more like having a conversation.
- A user opens the Google app and taps the new “Live” icon.
- They can then verbally ask a complex question, like, “What are some tips for preventing a linen dress from wrinkling in a suitcase?”
- An AI-generated audio response provides the answer, and the conversation can continue with follow-up questions, as the AI carries over the context.
Crucially, while you listen, the system still provides familiar web links on your screen, and a “transcript” button is available for those who prefer to read the response.
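To make that multi-turn flow more concrete, here is a minimal Python sketch of how follow-up questions can reuse earlier context. The class and method names are invented for illustration; this is not Google's actual Search Live interface, only the general pattern of carrying the conversation history into each new answer.

```python
# Illustrative only: "VoiceAssistant" and "generate_reply" are hypothetical
# stand-ins, not Google's actual Search Live API.
from dataclasses import dataclass, field


@dataclass
class Turn:
    role: str   # "user" or "assistant"
    text: str


@dataclass
class VoiceAssistant:
    history: list[Turn] = field(default_factory=list)

    def ask(self, question: str) -> str:
        # Each new question is answered with the full prior conversation
        # attached, which is what lets a follow-up like "and what about silk?"
        # resolve against the earlier packing question.
        self.history.append(Turn("user", question))
        reply = self.generate_reply(self.history)
        self.history.append(Turn("assistant", reply))
        return reply

    def generate_reply(self, history: list[Turn]) -> str:
        # Placeholder for the model call; a real system would send the
        # accumulated turns to a language model with spoken output.
        last_question = history[-1].text
        return f"(spoken answer to {last_question!r}, using {len(history) - 1} prior turns)"


assistant = VoiceAssistant()
print(assistant.ask("How do I keep a linen dress from wrinkling in a suitcase?"))
print(assistant.ask("And what about silk?"))  # follow-up relies on carried context
```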
The Technology That Makes It Conversational
This feature is more than just a voice-to-text front end.
Search Live is driven by a custom version of Google’s Gemini model, which has advanced voice capabilities and is integrated with Search’s core quality systems.
It employs a sophisticated “query fan-out” technique.
Your single spoken question is deconstructed into several related searches that are run at the same time.
This allows the AI to synthesize information from a much broader and more diverse set of web pages, delivering a richer answer than a single typed query might produce.
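As a rough illustration of the idea, the sketch below fans a single question out into several sub-queries, runs them concurrently, and merges the results. The hand-written sub-queries and the `search_web` helper are hypothetical stand-ins; in the real pipeline, the sub-queries are derived by Google's models and run against its own index.

```python
# Illustrative only: sketches the fan-out pattern, not Google's implementation.
import asyncio


async def search_web(query: str) -> list[str]:
    # Stand-in for a real search backend call; returns fake snippets here.
    await asyncio.sleep(0.1)  # simulate network latency
    return [f"snippet about {query!r}"]


def fan_out(question: str) -> list[str]:
    # A real system would use a language model to derive sub-queries from
    # `question`; these are hand-written for illustration.
    return [
        "how to pack linen without wrinkles",
        "does rolling clothes prevent wrinkles",
        "packing cubes for delicate fabrics",
    ]


async def answer(question: str) -> str:
    sub_queries = fan_out(question)
    # Run every sub-query concurrently, then merge the results so the final
    # answer can draw on a broader set of pages than one query would surface.
    results = await asyncio.gather(*(search_web(q) for q in sub_queries))
    snippets = [s for group in results for s in group]
    return "Synthesized answer drawing on: " + "; ".join(snippets)


print(asyncio.run(answer("How do I keep a linen dress from wrinkling in a suitcase?")))
```

The key design point is the concurrency: because the sub-queries run in parallel, the broader coverage adds little latency compared with a single search.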
The Future Vision: Less Typing, More Doing
Google’s plans show a clear trajectory away from manual input.
The features slated to come to AI Mode in the near future further emphasize a conversational, automated approach:
- Live Visual Search:
Soon, you won’t just talk; you’ll be able to use your camera to show the AI what you’re seeing in real time for a fully multimodal conversation.
- Agentic Capabilities:
Google intends for AI Mode to handle tasks for you, like finding and comparing ticket prices or booking reservations, reducing complex research projects to a simple verbal request.
Conclusion: So, Is This Really the End of Typing?
Search Live is a powerful leap toward a voice-first future, but it is not the end of typing. Instead, it marks the beginning of a more versatile and contextual era for search.
The feature itself, by providing on-screen links and a text transcript, acknowledges that text remains a vital medium for deep exploration and verification.
Typing offers a level of precision, privacy, and control that voice cannot always match, especially for complex data entry or in public settings.
The future of search isn’t a battle between talking and typing; it’s about having the right tool for the moment.
Voice is for convenience, multitasking, and open-ended exploration.
Yet, typing will remain for focused research, specific data queries, and situations demanding silence and precision.
Therefore, Search Live doesn’t replace the keyboard; it simply gives it a powerful new partner.