Until this year, I had never owned any smartphone other than an iPhone. However, with AI making its way onto every tech product on the planet, I needed to try Android to understand the differences between artificial intelligence in the two ecosystems.
After using the Samsung Galaxy S25 for a few weeks, I returned to my iPhone 16 Pro Max. Not because it was better, but because when you've built your whole life around one ecosystem, that familiarity often becomes the deciding factor when choosing among flagship smartphones.
Once I returned to iOS, I found myself missing one specific AI feature more than any other, and without access to it on the iPhone, I was quickly tempted to stick with the Android device.
The AI feature I'm talking about is Gemini Live, and while you could access it on iOS, the experience was diminished. That changed yesterday at Google I/O 2025, when Google announced that all of Gemini Live's capabilities are coming to the iPhone, free of charge.
Here's why Gemini Live is the best AI tool I've ever used, and why bringing all of its capabilities to the iPhone means I'm ready to jump back to Apple.
What Visual Intelligence wants to be
Gemini Live was already available in the Gemini app on iOS, but it lacked two important elements that made the Android version better. First, Gemini Live on iOS couldn't access your iPhone's camera, and second, it couldn't see what you were doing on your screen. I/O 2025 changed all that.
Now, iPhone users can give Gemini direct access to their camera and screen, enabling new ways to interact with AI that we haven't seen on iOS before.
Gemini Live's camera capability is one of the best AI tools I've used to date, if not the best, and I'm glad iPhone users can now experience it.
What does the Gemini Live camera feature do? Well, imagine a better version of everything Apple's Visual Intelligence wants to be. You can simply show Gemini what you're looking at and ask questions without needing to describe the subject.
Gemini Live's camera functionality shines in situations like cooking. Last week, I used it while making birria tacos, and not only did it give me advice at every step, but it could also see everything and help me whip up a delicious dinner.
With my S25 propped up on a stand at the perfect angle, and because Gemini can connect with Google apps, I could ask it to pull information directly from the content creator's video. No more constantly touching your phone with dirty hands in the kitchen, and no more pausing to double-check the recipe. Gemini Live can handle all of it.
An AI partner every step of the way
Screen sharing allows Gemini to see what's on your display at any time, so you can ask about an image you're viewing, a document you're working on, or even how to complete a puzzle in a game. It's seriously cool, and it's the kind of Apple Intelligence-powered Siri we were promised at WWDC 2024 but never received.
The full free rollout of Gemini Live has only just begun, so we've yet to see exactly how this functionality will perform on iOS. That said, if it works even half as well as it does on Android, it will be a feature I can see many people loving.
In many ways, Gemini Live fully unlocks AI on a smartphone as a tool for interacting with the world, and now that iPhone users can access it too, I have no reason not to return to the Apple ecosystem.