One of the biggest themes at CES 2025 was smart glasses, and there was no shortage of products to see on the show floor. Most of these glasses either have tiny displays embedded in the lenses or lack displays altogether, but Gyges Labs is taking a different approach, and I got to check it out.
The Halliday AI Glasses feature a tiny display module at the top of the frame, just above the right lens. This module effectively projects an image directly to your eye. To be clear, though, you can’t physically see a beam of light shooting out from it.
I was able to use these smart glasses for a few minutes, and the first thing I noticed was that the calibration process was quite slow. Gyges Labs co-founder and CTO Dr. Felix Lyu was on hand to tune the display via an iPhone app, shifting the view horizontally and vertically as needed. However, the process took what felt like several minutes, as part of the display remained hidden from view.
I eventually resorted to wearing the glasses lower down my nose, which gave me a full view of the display, although Lyu noted that the issue came down to the nose pads used on the demo unit. If that’s the case, we hope the retail glasses ship with pads in various sizes.
What does this display actually look like? It’s a circular, monochrome screen that shows green on black, akin to Fallout’s Pip-Boy, rather than a full-color panel. However, Lyu told us that a color display is in the works for the next generation. Either way, I’m glad it’s not a nauseating red-and-black affair like the Virtual Boy.
Another representative suggested that the display was akin to viewing an iPad from roughly a meter away. I’m not sure about that comparison, as the circular screen and UI felt more like looking closely at a smartwatch display. I was able to view text and system menus just fine, though. In fact, the smartwatch-inspired interface makes me curious to see Wear OS on these glasses. The company also caters to different vision levels by adjusting the display module itself.
Gyges Labs says it’s using its own AI models for a proactive AI agent on the Halliday glasses. You’ll need to activate the agent beforehand, and it’ll then chip in with information based on the conversation you’re having. A representative gave the example of the agent fact-checking a statement made to you. We weren’t able to try this or any other features on the show floor, so it remains to be seen how well it works. Fortunately, there’s also more conventional chatbot functionality powered by ChatGPT if you’d like to ask specific questions. The glasses also offer capabilities like a teleprompter, memos, AI audio translation, notification mirroring, and navigation assistance.
We also weren’t able to navigate the system UI ourselves, as the side touchpad was disabled on the show floor; Lyu swiped through the various menus remotely instead. However, the co-founder confirmed that you can swipe back and forth and enter/exit menu screens via the arm-mounted touchpad. The company will also let you navigate via a smart ring, which allows you to “swipe” in more directions, although we weren’t able to try this out either.
What about battery life? Another company representative told us you can expect two to three days of typical usage and up to seven hours of constant usage. By contrast, the official website lists 100 hours of typical usage and up to 12 hours of active use.
Different, but worth buying?
The Halliday AI Glasses are scheduled to ship by the end of Q1 2025, with the co-founder estimating a late March or early April shipping window. Expect an early bird price of $369 and a full retail price of $489.
It’s too soon to tell whether these smart glasses will be worth the price, as we simply weren’t able to interact with the software. The calibration process was also finicky, and we can only hope that it isn’t representative of the final product.
With all of that said, the display technology actually worked well enough once the calibration was eventually sorted out. And the small size of the display module translates into a slimmer frame than I’ve seen on most other display-toting glasses, which suggests we could see this display tech on plenty of other normal-looking glasses down the line.
However, it seems like you should probably wait for the next version. For one, Lyu told us a color screen would arrive in the next generation. The executive also confirmed that an integrated camera is on the roadmap, but didn’t clarify how it would be used. If it’s anything like other smart glasses, we’re guessing the camera will handle general queries about the world around you (e.g. asking what you’re looking at) and translations.
But a color screen and camera are likely contingent on this first-generation model doing well in the first place. That makes it a chicken-and-egg situation for consumers, especially if you don’t want to be a guinea pig for a product that might go the way of the Humane AI Pin or Rabbit R1. But if all you want is an AI assistant, mirrored notifications, and a few other basic productivity features in a pair of glasses that don’t look goofy, you should definitely keep an eye on the first-generation Halliday AI Glasses.
Me? I’m more interested in display-free smart glasses like the Ray-Ban Meta line, but a less obtrusive, more power-efficient approach to displays like the Halliday AI Glasses has certainly left me intrigued.