Google said it is rolling out its visual assistant — a full-screen experience that brings up information, as well as ways to interact with apps, in response to a Google Assistant voice request — to Android phones this summer.
When an Android user makes a query through Google Assistant, Google will provide a more interactive visual experience on the phone. That includes ways to control smart home products, like thermostats, or to interact directly with apps like the Starbucks app. For example, you can make a voice query such as "what is the temperature right now," and a display shows up with a way to change the temperature. Google's visual assistant is coming to iOS devices this year as well.
Users will also be able to swipe up to get a visual snapshot of what's happening that day, including navigation to work, reminders, and similar services. All this aims to give users quick access to their daily activities without having to string together a series of sentences or taps to get there.
Google's visual assistant on phones is more of an evolution of how users can interact with Google's services and integrations in a seamless way. Voice interfaces have become increasingly robust with the emergence of Google Home and Alexa, allowing users to interact with devices and other services just by speaking to their phones or devices at home. But some interactions are more complex, such as tweaking the temperature slightly, and for those a visual display makes more sense.
Each new touch point developers get — such as a full-screen display after a voice query — gives companies more ways to keep the attention of potential customers and users. While Alexa offers developers a way to build a voice assistant into their tools, as does SoundHound with its Houndify platform, Google is trying to figure out what the next step is for a user after asking Google a question. That makes the most sense on the phone, where users can quickly be presented with a new interface whenever a task calls for some kind of visual element.