
Are Consumers Ready to Embrace AI Glasses?


Are you finally ready to hang a computer screen on your face?

Fifteen years ago, that would have seemed like a silly question. Then came the much-hyped and much-derided Google Glass in 2012, and frankly, it still seemed a silly question.

Now, though, it’s a choice consumers are beginning to make. Tiny displays, shrinking processors, advanced battery designs, and wireless communications are coming together in a new generation of smart glasses that display information that’s actually useful right in front of you. But the big question remains: Just why would you want to do that?

Some tech companies are betting that today’s smart glasses will be the perfect interface for delivering AI-supported information and other notifications. The other possibility is that smart glasses will replace bulky computer screens, acting instead as a private and portable monitor. But the companies pursuing these two approaches don’t yet know which choice consumers will make or what applications they really want.

Smart-glasses skeptics will point to the fate of Google Glass, which was introduced in 2012 and quickly became a prime example of a pricey technology in search of practical applications. It had little to offer consumers, aside from being an aspirational product that was ostentatiously visible to others. (Some rude users were even derided as “glass-holes.”) While Glass was a success in specialized applications such as surgery and manufacturing until 2023—at least for those organizations that could afford to invest around a thousand dollars per pair—it lacked any compelling application for the average consumer.

Smart-glasses technology may have improved since then, but the devices are still chasing a solid use case. From the tech behemoths to little brands you’ve never heard of, the hardware once again is out over its skis, searching for the right application.

During a Meta earnings call in January, Mark Zuckerberg declared that 2025 “will be a defining year that determines if we’re on a path toward many hundreds of millions and eventually billions” of AI glasses. Part of that determination comes down to a choice of priorities: Should a head-worn display replicate the computer screens that we currently use, or should it work more like a smartwatch, which displays only limited amounts of information at a time?

Head-worn displays fall into two broad categories: those intended for virtual reality (VR) and those suited for augmented reality (AR). VR’s more-immersive approach found some early success in the consumer market, such as the Meta Quest 2 (originally released as the Oculus Quest 2 in 2020), which reportedly sold more than 20 million units before it was discontinued. According to the market research firm Counterpoint, however, the global market for VR devices fell by 12 percent year over year in 2024—the third year of decline in a row—because of hardware limitations and a lack of compelling use cases. As a mass consumer product, VR devices are probably past their moment.

In contrast, AR devices allow the wearer to stay engaged with their surroundings as additional information is overlaid in the field of view. In earlier generations of smart glasses, this information added context to the scene, such as travel directions or performance data for athletes. Now, with advances in generative AI, AR can answer questions and translate speech and text in real time.

Many analysts agree that AI-enhanced smart glasses are a market on the verge of massive growth. Louis Rosenberg, CEO and chief scientist with Unanimous AI, has been involved in AR technology from its start, more than 30 years ago. “AI-powered smart glasses are the first mainstream XR [extended reality] devices that are profoundly useful and will achieve rapid adoption,” Rosenberg told IEEE Spectrum. “This, in turn, will accelerate the adoption of immersive versions to follow. In fact, I believe that within five years, immersive AI-powered glasses will replace the smartphone as the primary mobile device in our digital lives.”

Major tech companies, including Google and Apple, have announced their intentions to join this market, but have yet to ship a product. One exception is Meta, which released the Meta Ray-Ban Display in September, priced at US $799. (Ray-Ban Meta glasses without a display have been available since 2023.)

A number of smaller companies, though, have forged the path for true AR smart glasses. Two of the most promising models—the One Pro from Beijing-based Xreal and the AI glasses from Halliday, based in Singapore—represent the two different design concepts evolving in today’s smart-glasses market.

Halliday’s “Hidden Superpower”

Halliday’s smart glasses are a lightweight, inconspicuous device that looks like everyday eyewear. The glasses have a single small microLED projector placed above the right lens. This imager beams a monochrome green image directly to your eye, with a level of light dim enough to be safe but bright enough to be seen against ambient light. What the user sees is a virtual 3.5-inch (8.9-centimeter) screen in the upper right corner of their field of view. Like a typical smartwatch screen, it can display up to 10 or so short lines of text and basic graphics, such as arrows when showing turn-by-turn navigation instructions, sufficient to provide an interface for an AI companion.

In press materials, Halliday describes its glasses as “a hidden superpower to tackle life’s challenges.” And hidden it is. The display technology is much more discreet than that of other designs, which use waveguides or prismatic lenses that often reveal a reflected image or a noticeable rainbow effect. Because it projects the image directly to the eye, the Halliday device doesn’t produce any such indications. The glasses can even be fitted with standard prescription lenses.

Xreal’s One Pro: The Full Picture

By contrast, the Xreal One Pro has two separate imagers—one for each eye—that show full-color, 1080p images filling 57 degrees of the user’s field of view. This allows the One Pro to display the same content you’d see on a notebook or desktop screen. (A more typical field of view for AR glasses is 30 to 45 degrees. Halliday’s virtual screen occupies only a small portion of the user’s field of view.)

Xreal’s One Pro smart glasses consist of many layers that work together to create a full-color, high-resolution display. Xreal

In fact, the One Pro is intended to eliminate those notebook and desktop screens. “We’re now at the point where AR glasses’ spatial screens can truly replace physical monitors all day long,” Xreal CEO and cofounder Chi Xu said in a December 2024 press release. But it’s not a solution for use when you’re out and about; the glasses remain tethered to your computer or mobile device by a cable.

The glasses use microLED imagers that deliver good color and contrast performance, along with lower power consumption than an OLED. They also use a “flat prism” lens that is 11 millimeters thick—less than half the thickness of the prisms in some other AR smart glasses, but three to four times as thick as typical prescription lenses.

The flat-prism technology is similar to the “bird bath” prisms in Xreal’s previous glasses, which used a curved surface to reflect the display image to the wearer’s eye, but the flat prism’s thinner and lighter design offers a larger field of view. It also has advantages over the refraction-based waveguides used by other glasses, which can introduce visible artifacts such as colored halos.

To improve the visibility of the projected image, the glasses block much of the ambient light from the surroundings. Karl Guttag, a display-industry expert and author of the KGOnTech blog, says that the Xreal One Pro blocks about 78 percent of real-world light and is “like dark sunglasses.”

The One Pro also has a built-in spatial computer coprocessor, which enables the glasses to position an image relative to a direction in your view. For example, if you have an application that shows one set of information to your left, another in the middle, and a third to the right, you would simply turn your head to look at a different set. Or you could position an image in a fixed location—as with the Halliday glasses—so that it remains in front of you when you turn your head.

Having separate imagers for each eye makes it possible to create stereoscopic 3D effects. That means you could view a 3D object in a fixed location in your room, making for a more immersive experience.

The Cost of More-Immersive AR

All these features come at a cost. Xreal’s glasses draw too much power to run on a battery, and they need a high-speed data connection to access the display data on a laptop or desktop computer. This connection provides power and enables high-resolution video streaming to the glasses, but it keeps the user tethered to the host device. The Halliday glasses, by contrast, run off a battery that the company states can last up to 12 hours between charges.

Another key difference is weight. Early AR glasses were so heavy that they were uncomfortable to wear for long periods of time. The One Pro is relatively light at 87 grams, or a little less than the weight of a small smartphone. But the Halliday’s simpler design and direct projector yield a device that’s less than half that, at 35 grams—a weight similar to that of many regular prescription glasses.

Customers try out Halliday’s smart glasses at an expo in China in July 2025. Ying Tang/NurPhoto/Getty Images

In both cases, this new generation of consumer-oriented smart glasses costs much less than enterprise AR systems, which run to several thousand dollars. The One Pro lists for $649, while the Halliday lists for $499.

Currently, neither Halliday nor Xreal has a camera built into its glasses, which instead communicate through voice control and audio feedback. This eliminates extra weight and power consumption, helps keep costs down, and sidesteps the privacy concerns that proved to be one of the main sticking points for Google Glass.

There are certainly applications where a camera can be helpful, however, such as for image recognition or when users with impaired vision want to hear the text of signs read aloud. Xreal does offer an optional high-resolution camera module that mounts at the bridge of the nose. Whether to include a built-in camera in future models is yet another trade-off these companies will need to consider.

What Do Consumers Really Want in Smart Glasses?

Clearly, these two models of smart glasses represent very different design strategies and applications. The Halliday glasses exist largely as a mobile platform for an AI companion that you can use discreetly throughout the day, the way you would use a smartwatch. The One Pro, on the other hand, can act as a replacement for your computer’s monitor—or several monitors, thanks to the spatial computing feature. The high resolution and full color deliver the same information that you’re used to getting from larger displays, with the trade-off that you’re physically tethered to your computer.

Is either of these scenarios the killer app for smart glasses that we’ve been waiting for?

With the rise of generative AI agents, people are growing increasingly comfortable with easy access to all sorts of information all the time. Smart speakers such as Amazon Echo have trained us to get answers to just about anything simply by asking. Wearing a device on your face that can discreetly present information on demand, like Halliday’s glasses, will certainly appeal to some consumers, especially when it’s priced affordably.

Chris Chinnock, founder of Insight Media, thinks this is the path for the future. “I am not convinced that a display is needed for a lot of applications, or if you have a display, a small [field of view] version is sufficient. I think audio glasses coupled with AI capabilities could be very interesting in the near term, as the optics [or the] display for more full-featured AR glasses are developed.”

On the other hand, many people may be seeking a compact and convenient alternative to the large, bulky external monitors that come with today’s laptops and desktops. On an airplane, for instance, it’s difficult to comfortably open your laptop screen enough to see it, and there’s little expectation of privacy on a crowded flight. But with smart glasses that project multiple virtual screens, you may actually be able to do some useful work on a long flight.

For now, companies like Halliday and Xreal are hoping that there’s room for both strategies in the consumer market. And with multiple choices now available at consumer-friendly prices, we will soon start to see how much interest there is. Will consumers choose the smart AI companion, or a compact and private replacement for computer screens? In either case, your glasses are likely to become a lot smarter.
