Google’s Android XR glasses make smart glasses even smarter

I use my Ray-Ban Meta smart glasses daily, not always for their AI features, but because they hold my prescription lenses. Occasionally, though, I do pull up the assistant to ask questions about the world around me.

At Google I/O, I had the opportunity to try out Android XR on glasses, and it seems like the natural next step for smart glasses. Android XR can give users a visual display in one lens, both lenses, or neither. In my demo, the display was in my right lens.

The home screen was pretty much Android's At a Glance experience, putting the time, the temperature and your next calendar event right in front of your eyes. I was told you can toggle this on and off, but I didn't try it in my demo.

Next, I activated Gemini by holding down the touchpad on the right side of the glasses. The voice assistant comes up in the lens display and asks what you want to know. I asked about the paintings in front of me, and Gemini transcribed its answers, telling me who painted each picture, where it was painted and when. I could have kept asking, but with my demo time short, I switched to asking a question in French about a book I was looking at. Gemini responded in French on my second try, as my second language failed me on the first attempt.

After that, I asked Gemini for directions to my hotel, and the lens displayed the next step of turn-by-turn directions. You can also glance at the ground to quickly pull up a blue-ish overhead map of your surrounding area. I typically know where I'm going when I'm in Toronto, but I can see this being useful when I'm travelling.

I could also take a picture with the glasses, which felt similar to the Ray-Ban Metas, as it's done with a quick button press. What's different, and far better, is that a quick preview of the picture shows up in the lens display, so you can see how the shot turned out. The preview's image quality is very low, but it's good enough to tell whether you got everyone in the frame or whether someone was blinking. Holding down the same button takes a 30-second video.

Android XR also offers features like reminding users where they’ve placed an item, real-time translation and more, but these features weren’t available to try out.

Because this demo was focused on the Android XR platform and not the prototype hardware, I can't speak to camera specs, battery life, display quality or anything else on that front.

It was cool to try out these features, and I look forward to seeing how they're implemented in smart glasses in the future. I think the in-lens display is the natural progression of smart glasses technology. And though Meta AI isn't bad, since I use an Android device, Gemini's ability to interact with my suite of Google apps is a perfect fit.

My experience with Project Moohan was a better showing of Android XR, and I fell in love with it. Still, I could see myself wearing the glasses form factor every day, replacing my Ray-Ban Metas.

Google says it's working with Warby Parker and Gentle Monster to create products in this form factor; however, it's unclear whether we'll see them later this year or have to wait until 2026.
