- Gemini can now chain actions to complete complex tasks
- Gemini Live gains multimodal capabilities on newer phones
- Gemini will evolve into a full-fledged AI assistant with Project Astra
To coincide with the launch of the Samsung S25 line of devices at today’s Galaxy Unpacked event, Google announced some impressive updates to its Gemini AI platform. Many of the improvements are specific to newer devices like the Samsung S25, but some also work on older Samsung S24 and Pixel 9 phones.
The standout feature is Gemini’s new ability to chain actions. This means you can now, for example, have Gemini search Google Maps for nearby restaurants and then draft a text in Google Messages to send to the people you want to invite to lunch, all through a single set of Gemini commands.
The chaining capability is coming to all devices that run Gemini and is extension-based, meaning a developer must write an extension linking a particular app to Gemini before it can be included in a chain. Naturally, all major Google apps already have Gemini extensions, and extensions are also available for the Samsung Reminder, Samsung Calendar, Samsung Notes, and Samsung Clock apps.
Gemini Live becomes multimodal
Google’s Gemini Live, the part of Gemini that lets you have a natural, human-like conversation with the AI, is also getting major multimodal upgrades. You will now be able to add images, files, and YouTube videos to the conversation you are having. So, for example, you could ask Gemini Live, “Hey, look at this photo of my school project and tell me how I could improve it,” then upload the photo and get a response.
However, the Gemini multimodal enhancements are not available across the board and will require a Galaxy S24, S25, or Pixel 9 to work.
Project Astra
Finally, Google announced that Project Astra capabilities would arrive in the coming months, first on the Galaxy S25 and Pixel phones. Project Astra is Google’s prototype AI assistant that lets you interact with the world around you, asking questions about what you’re looking at and where you are, using your phone’s camera. So you can simply point your phone at something and ask Gemini to tell you about it, or ask it when the next stop on your bus route is.
Project Astra works on mobile phones, but it takes the experience to the next level when combined with Google’s hands-free AI glasses prototypes, letting you ask Gemini questions about what you see without having to interact with a screen at all.

While there is still no news on a release date for this next generation of Google glasses, they will join the Ray-Ban Meta glasses in the emerging AI wearables market when they finally become available.