The Verge's verdict on the 2026 crop of smart glasses is damning and fascinating at the same time: "the most stylish, affordable, comfortable, and capable yet — but they still don't make sense." That sentence captures exactly where the smart glasses category sits right now: genuinely impressive hardware struggling to answer the fundamental question of what it's actually for.

A Brief History of Glasses That Didn't Work

The category has a troubled past. Google Glass launched in 2013 as the most hyped consumer electronics product in years and became one of the most famous failures in tech history. The problem wasn't purely technical — though the hardware was awkward and battery life was terrible — it was social. "Glassholes" became a genuine cultural insult. People didn't want to talk to you if you were wearing Glass. Restaurants banned them. Google retreated to enterprise applications where social friction mattered less.

Snapchat's Spectacles arrived in 2016 with a different strategy: lean into the camera, make them fun, make them colorful, don't pretend they're something they're not. Spectacles found a niche with content creators but never crossed into mainstream adoption. The first generation was endearing; subsequent versions added AR capabilities that were technically impressive but commercially underwhelming.

Meta's early Ray-Ban collaboration produced smart glasses that were genuinely wearable — they looked like regular sunglasses — but limited. They could play music, take photos, and do basic voice commands. Impressive for a first attempt; not impressive enough to justify the price for most consumers.

What's Actually New in 2026

The 2026 generation represents a real step change in several dimensions simultaneously: frames that pass as ordinary eyewear, usable cameras, good audio, batteries that last most of a day, and AI assistants that can reason about what the camera sees.

The Key Players

Meta's Ray-Ban partnership remains the market leader by recognition. The third generation of Meta Ray-Bans represents the most polished version of the vision: fashionable frames, capable camera, good audio, and the Meta AI assistant that can answer questions about what the camera sees.

More unexpected entries include nostalgia plays: Commodore 64 and ZX Spectrum branded smart glasses targeting retro-gaming enthusiasts and middle-aged tech consumers who grew up with the original hardware. The hardware under the branding is solid; the story appeals to a demographic with disposable income. These products signal that smart glasses have reached the point where lifestyle branding, not raw capability, is the differentiating pitch.

Apple's Vision Pro sits in a different category entirely — it's a spatial computing headset priced at the high end of the market, designed for immersive experiences rather than all-day wear. It's not competing with Ray-Ban glasses; what it is competing with isn't entirely clear yet. Vision Pro remains a developer and enthusiast platform rather than a mainstream consumer product.

The AI Angle Is Real

The most interesting thing about 2026 smart glasses isn't the hardware — it's what capable AI does to the value proposition. Previous smart glasses were essentially wireless earbuds with a camera attached. 2026 smart glasses are ambient AI interfaces.

Consider what's now possible: you're walking into a meeting and your glasses quietly tell you the name of the person approaching based on LinkedIn photo matching. You're in a grocery store abroad and your glasses translate all the labels. You're on a hike and your glasses identify the plant species you're looking at. You're reading a physical book and your glasses can look up any term you glance at.

Each of these use cases individually is interesting. The question is whether any of them is compelling enough to make consumers pay a premium and change their behavior. So far, the answer seems to be: for specific populations, yes; for the mass market, not yet.

The "Still Don't Make Sense" Problem

The Verge's criticism cuts to the core issue. Smart glasses in 2026 are genuinely capable — but most of the things they do well, your phone also does, often better. The camera on your phone is dramatically better. The screen on your phone shows information more clearly than audio in your ear. The AI assistant on your phone has the same capabilities as the one in your glasses.

The smart glasses' advantage is hands-free, ambient, always-available access. That advantage matters most in contexts where pulling out your phone is inconvenient or impossible: cycling, hiking, cooking, driving, exercising, navigating an unfamiliar city on foot. For desk workers already sitting in front of a computer and a phone, the value proposition is much weaker.

The category is still searching for its "killer use case" — the single application that makes consumers feel they cannot live without the device. The smartphone had several (GPS, camera, internet in your pocket), but the dominant one was probably the camera combined with social media. Smart glasses haven't found their equivalent yet.

Social Friction Hasn't Gone Away

Camera glasses make people uncomfortable. This was true in 2013 with Google Glass and it's still true in 2026, even though the cameras are now hidden in fashionable frames. The discomfort isn't irrational — a device that can record video without visible indication is genuinely different from a phone camera that people can see you raising.

Some 2026 models address this with indicator lights that glow when recording. This helps with explicit recording but does nothing about the always-on AI vision features that are processing what the camera sees even when not explicitly recording. The social and legal norms around what's acceptable haven't caught up to what the technology can do.

Privacy: The Elephant in the Room

Smart glasses with AI vision raise privacy concerns that are qualitatively different from existing technology. A phone camera requires an intentional act to photograph someone. Smart glasses with always-on AI vision can process facial recognition, read text, identify objects, and log environments continuously — without any obvious signal to people nearby.

The regulatory response is nascent. The EU's AI Act addresses some biometric surveillance use cases, and several US states have enacted camera regulations, but the smart glasses use case falls into gaps in existing law. The devices are legal; the specific applications of their AI capabilities exist in a partially unregulated space.

Who's Actually Buying Them

The early adopter population in 2026 is predictable but instructive: content creators who live by hands-free capture, cyclists and hikers who can't easily reach for a phone, travelers leaning on live translation, retro-computing nostalgics drawn in by the Commodore and Spectrum branding, and developers betting early on the platform.

What Needs to Change

For smart glasses to cross from enthusiast niche to mainstream consumer product, several things need to happen: battery life has to stretch beyond a single day, the app ecosystem has to mature, a genuine killer use case has to emerge, and social norms around camera glasses have to settle.

The Bottom Line

Smart glasses are the best they have ever been. The Meta Ray-Bans look like regular sunglasses. The cameras are usable. The battery lasts most of a day. The AI can answer questions about what you're looking at. These are real achievements compared to any previous iteration of the category.

But "best they've ever been" is a relative judgment, and the baseline was low. Mainstream adoption is probably two to three hardware generations away — when the battery lasts multiple days, when the app ecosystem matures, when a killer use case emerges, when the social norms around camera glasses normalize. The foundation is being laid. The building isn't there yet.