Google Glass debuted in 2013. If you had asked me over a decade ago, I would've sworn that glasses would become the predominant wearable technology. However, like many of its projects before and since, Google abandoned Glass, and I've been waiting for a suitable replacement ever since. The Apple Vision Pro and other AR wearables are impractical, and if Android XR is going to catch on, it won't be with goggle-style products like Samsung's upcoming Project Moohan.
Thankfully, Google gave us a glimpse of the future toward the end of its Google I/O 2025 keynote event. Google XR glasses are the AI technology I’ve been waiting for. They combine the potential power of AI with a form factor we’ll actually use. They bridge the practical gap that’s frustrated me about all these fancy AI gimmicks. XR glasses can work, but Google must stick with them this time.
It’s finally practical
AI for the everyday
Jules Wang / AP
Project Astra has been around for a year, but Google’s XR glasses are the first implementation I’m excited about. I have my misgivings about AI for multiple reasons. No company has convinced me it’s a positive addition to the user experience and that it provides any value, certainly not the $20 a month Google wants to charge us. However, Android XR on Google’s XR glasses demonstrates the blueprint for AI success. I want an AR overlay of my environment. I want conversations translated in front of my eyes in real time, and my XR device to show me directions to my next destination.
I don’t love the idea that my smart glasses will remember things I viewed earlier or keep track of where I’m going, but at least it’s a convenience, and I know I’ve wanted to recall a sign or phone number I saw earlier but didn’t think to write down. I could glance at a business card or a restaurant’s information and have my Google XR glasses remember the contact’s phone number or help me make a reservation. If I’m giving up privacy for AI, I want it to be useful, and Google’s XR glasses are the first time I’ve considered making that compromise.
I am impressed by the technology in the Apple Vision Pro, and I’m sure Samsung’s Project Moohan will be an interesting headset, but I fear they’ll share a similar fate. No one wants to walk around with a large headset strapped on for any length of time, and no one wants to be tethered to a battery pack. I get the entertainment and productivity possibilities, but they’ll remain marginal products because they aren’t a natural extension of the human experience — technology should enhance, not intrude.
Glasses that function normally when I don’t need the technology but can provide an AI experience when I do are the form factor I’ve been waiting for. As a glasses wearer, it’s a natural transition. Even if you don’t wear prescription lenses, I’m sure you’ve worn a pair of sunglasses. It’s the same reason why flip phones are superior to book-style, larger folding devices. I don’t need to change how I use a smartphone to enjoy a flip phone, and I wouldn’t need to change how I wear glasses or go about my day to use the Google XR glasses — when adoption is easier, sales are greater.
Of course, a map overlay is only good if it points me in the right direction, and a real-time translation only provides value if it’s accurate. I don’t yet have the faith in AI I’d need for Google’s XR glasses to work. Every Google I/O 2025 demo went off without a hitch, but as any current Google Gemini user will tell you, the reality is a mixed bag. I get numerous wrong answers weekly from Gemini Live, and AI assistants on multiple platforms still need to be rigorously double-checked. I hold my breath when I ask any AI model for information I need to act on, and if I’m going to trust AI to provide an overlay of the world I see, I will need greater accuracy.
Nothing will ever be perfect, and mistakes will always creep into any model, but if Google wants me to treat its various agentic AI features as a personal assistant with personal context, then I need to trust it. It’s the same standard I’d hold to a human assistant or friend, and if Google wants me to offload things I’d usually handle myself, I need to know it’s up to the task. I’m excited about Google XR glasses, but reliability is vital.
We’re on the right track
Plenty of questions remain unanswered. Google’s glasses can’t have a minuscule battery life or a terrible Bluetooth connection, but at least I approve of the direction. The technology might take a while to catch up. Still, Android XR makes me believe we’re headed towards a usable, valuable AI experience, which is something I can’t say about Samsung’s Galaxy AI or other Google Gemini functions. We’re close to the future; I just hope Google doesn’t give up.