
Best Smart Glasses in 2026: Wait for Google

There’s one big question for anyone considering smart glasses right now: Do you want to wear technology on your face? For how long? And is that something you’re comfortable with? The answer looks very different for tethered display glasses than for wireless glasses.

Tethered display glasses vs. camera-and-audio glasses

Tethered display glasses are essentially wearable monitors that sit over your eyes. Although some offer prescription lens inserts, they’re not designed for all-day wear. You’ll put them on to watch movies, play games or get work done, then take them off. The commitment is a few hours a day at most.

For now, wireless smart glasses aim to be true everyday glasses: a replacement for your existing pair, an extra pair or a set of smart sunglasses. If you go that route, you’ll need prescription lenses fitted, and you’ll have to get used to the limited battery life of wireless glasses. Meta Ray-Bans last several hours on a charge, depending on how they’re used. After that, they need to recharge in their case, so you’ll either want a backup pair of glasses or have to accept wearing a pair with a dead battery.

Meanwhile, there are other smart glasses with long battery life, such as the Even Realities G2, but without built-in cameras and speakers.

Meta Ray-Bans on a red table next to a phone showing Live AI transcripts

Live AI, a newer feature on Meta’s Ray-Bans, can keep the camera feed running continuously to observe the world around you. I tried it out.

Scott Stein/CNET

AI and its limitations… and privacy

You’ll also want to consider what you’ll use the glasses for, and which AI devices or services you rely on. Wireless audio and video glasses like the Ray-Bans require a phone app to pair and use, though they can work as basic Bluetooth headphones with any audio source. However, Meta Ray-Bans are limited to Meta AI as their active AI service, with a few hooks into apps like Apple Music, Spotify, Calm and Meta’s own platforms. You live in a Meta world, and that’s a real concern when it comes to trusting Meta to have a responsible data policy. You can choose not to use the AI features on Meta’s glasses, which is what I do, because most of the AI functions aren’t that useful to me.

Meta is opening up its smart glasses to app developers, though how far that will go is still unknown. Meta’s new Ray-Ban Display glasses, meanwhile, add a number of apps, but mainly for activities connected to Meta’s platforms. Meta is also starting to support connected fitness devices, though only with Garmin and its own upcoming Oakley Vanguard sports visor for now.

The next wave of Google glasses expected later this year should be more flexible, with access to Gemini AI and other Google applications and services. But we still don’t know all the limits of those glasses, either.

Apple is expected to have its own AI-powered glasses within the next year. In other words: things will change quickly in this space.

AI-enabled glasses can often use AI and an onboard camera for assistive purposes, such as live translation or describing a scene in detail. For people with vision loss or assistive needs, AI glasses are starting to become a genuinely helpful category of device, but they’re still more limited than what you can do on phones and computers right now. Meta’s AI functions on the glasses aren’t flexible: you can’t add text and personal information to them the way you can with other services. At least, not yet.

Tethered display glasses have limitations, too

Tethered display glasses use USB-C to connect to devices that can output video over USB-C, such as phones, laptops, tablets and portable game consoles. But they don’t all work the same way. Phones can sometimes have app incompatibilities, preventing copyrighted videos from playing in rare cases (like Disney+ on iPhones). Steam Deck and Windows gaming handhelds work with tethered display glasses, but the Nintendo Switch and Switch 2 don’t, and require a proprietary battery pack and separate “mini docks” to pass along the signal. Some glasses makers like Xreal build additional custom chipsets into the glasses to pin displays in space or adjust display size, while others rely on extra software that’s only available on laptops or certain devices to do more tricks.

But this space is changing, too. Project Aura, coming this year, will pair Xreal glasses with a small Android computer to run multiple apps in 3D with hand tracking, like an augmented reality headset. Expect more devices like these to appear, adding real 3D augmented reality and more.

A man wearing Android XR glasses

Lexy Savvides

More on the horizon

If this all sounds like the Wild West, that’s because it is. Glasses right now remind me of the wearable scene before the Apple Watch and Android watches arrived: It was experimental, inconsistent, sometimes flashy and sometimes frustrating. Expect glasses to change rapidly over the next year or so, which means your choice to buy now isn’t guaranteed to be the perfect solution down the road.

While Meta is currently the leader in face wearables, it’s likely that future glasses will change dramatically. When Google and Apple enter the picture, expect more app and service compatibility in smart glasses, too.

Also, keep an eye on your wrist. Meta’s neural band for its display glasses is a sign of where others will follow, and Google and Apple will likely tie watch interactions into their glasses for quick gestures and shortcuts.

Many companies are entering this space, including longtime glasses maker (and social apps company) Snap. Snap’s everyday AR glasses are coming later this year, too, but we don’t know much about them yet, although I’ve tried its earlier developer prototypes a few times.


