The Next Big Thing Is Still … Smart Glasses

Last week, Mark Zuckerberg stood on a stage in California holding what appeared to be a pair of thick black eyeglasses. His baggy T-shirt displayed Latin text that seemed to compare him to Julius Caesar—aut Zuck aut nihil—and he offered a bold declaration: These are Orion, “the most advanced glasses the world has ever seen.”

Those glasses, just a prototype for now, allow users to take video calls, watch movies, and play games in so-called augmented reality, where digital imagery is overlaid on the real world. Demo videos at Meta Connect, the company’s annual conference, showed people playing Pong on the glasses, their hands functioning as paddles, as well as using the glasses to project a TV screen onto an otherwise blank wall. “A lot of people have said that this is the craziest technology they’ve ever seen,” Zuckerberg said. And although you will not be able to buy the glasses anytime soon, Meta is hawking much simpler products in the meantime: a new Quest headset and a new round of software updates to the company’s smart Ray-Bans, which have cameras and an AI audio assistant on board, but no screen in the lenses.

Orion seems like an attempt to fuse those two devices, bringing a fully immersive computerized experience into a technology that people might actually be comfortable putting on their face. And it is not, you may have noticed, the only smart-glasses product to have emerged in recent months. Amazon, Google, Apple, and Snap are all either officially working on some version of the technology or rumored to be doing so. Their implementations are each slightly different, but they point to a single idea: that the future is about integrating computing more seamlessly into everyday life.

Smartphones are no longer exciting, and the market for them has been declining for the past few years. The primary new idea there is foldable screens, which effectively allow your phone to turn into a tablet—though tablet sales have slowed too. The virtual-reality headsets that companies have spent billions developing aren’t being widely adopted.

These companies are betting big that people want to be able to check the weather without pulling out a smartphone—and that they are more willing to wear a pair of Ray-Bans with cameras than spend hours in the metaverse. And after years of false starts on the glasses front, they’re betting that AI—despite some high-profile flops—will be what finally helps them achieve this vision.


Tech companies have been working on smart frames for decades. The first real consumer smart glasses appeared in the late 1980s and ’90s, but none broke through. Then, in 2013, Google released its infamous Glass eyewear: a thin metal frame with a camera and a tiny screen above one eye, which could be used to check emails, take photos, and get directions. Glass was advanced for its time, but the general public was spooked by the idea of face cameras constantly surveilling them. In 2015, Google abandoned the idea that Glass might ever be a consumer product, though the frames lived on as an enterprise device until last year.

Glass’s failure didn’t deter other companies from taking a swing. In 2016, Snapchat launched its first generation of Spectacles, glasses that allowed users to capture pictures and videos from cameras mounted above each eye, then post them to their account. In 2019, Amazon jumped in, teasing its Echo Frames—camera-less smart glasses with Alexa built in—which went on sale to the public the following year. Meta, then called Facebook, launched the first iteration of its collaboration with Ray-Ban in 2021, though the frames didn’t catch on.

Then there are the virtual-reality headsets, such as Meta’s Quest line. Last summer, after Apple announced the Vision Pro, my colleague Ian Bogost deemed this the “age of goggles,” pointing out that companies have been spending billions developing immersive technology, even though the exact purpose of these expensive headsets is unclear.

Consumers also seem to be wondering what that purpose is. One analyst reports that sales of the Vision Pro were so dismal that Apple scaled back production. According to The Information, the company paused work on the next model, while Meta canceled its competitor device entirely.

In some ways, this glasses moment is something of a retreat: an acknowledgment that people may be less likely to go all in on virtual reality than they are to throw on a pair of sunglasses that happens to be able to record video. These devices are supposed to look and feel more natural, while allowing for ambient-computing features, such as the ability to play music anywhere just by speaking or start a phone call without having to put in headphones.

AI is a big part of this pitch. New advances in large language models are making modern chatbots seem smarter and more conversational, and this technology is already finding its way into the glasses. Both the Meta and Amazon frames have audio assistants built in that can answer questions (How do whales breathe?) and cue up music (play “Teenage Dirtbag”). Meta’s Ray-Bans can “look” using their cameras, offering an audio description of whatever is in their field of vision. (In my experience, accuracy can be hit or miss: When I asked the audio assistant to find a book of poetry on my bookshelf, it said there wasn’t one, overlooking an anthology with the word poetry in the title, though it did identify my copy of Joseph Rodota’s The Watergate when I asked it to find a book about the Washington landmark.) At Connect, Zuckerberg said that the company plans to keep improving the AI, with a couple of big releases coming in the next few months. These updates will give the glasses the ability to do translation in real time, as well as scan QR codes and phone numbers on flyers in front of you. The AI will also, he said, be able to “remember” such things as where you parked your car. One demo showed a woman rifling through a closet and asking the AI assistant to help her choose an outfit for a theme party.

But whether AI assistants will actually be smart enough to realize all of this is still something of an open question. In general, generative AI struggles to cite its sources and frequently gets things wrong, which may limit smart glasses’ overall usefulness. And, though the companies say the technology will only get better and better, that’s not entirely certain: The Wall Street Journal recently reported that, when Amazon attempted to infuse Alexa with new large language models, the assistant actually became less reliable for certain tasks.

Products such as Orion, which promise not just AI features but a full, seamless integration of the digital world into physical reality, face even steeper challenges. It’s really, really difficult to squish so many capabilities into eyewear that looks semi-normal. You need to be able to fit a battery, a camera, speakers, and processing chips all into a single device. Right now, even some of the most state-of-the-art glasses require you to be tethered to additional hardware to use them. According to The Verge’s Alex Heath, the Orion glasses require a wireless “compute puck” that can be no more than about 12 feet away from them—something Zuckerberg certainly did not mention onstage. Snap’s newest Spectacles, announced earlier this month, don’t require any extra hardware—but they have a battery life of only 45 minutes, and definitely still look big and clunky. The hardware problem has bedeviled generations of smart glasses, and there still isn’t a neat fix.


But perhaps the biggest challenge facing this generation of smart glasses is neither hardware nor software. It’s philosophical. People are stressed right now about how thoroughly technology has seeped into our everyday interactions. They feel addicted to their phones. These companies are pitching smart glasses as a salve—proposing that they could, for example, allow you to handle a text message without interrupting quality time with your toddler. “Instead of having to pull out your phone, there will just be a little hologram,” Zuckerberg said of Orion during his presentation. “And with a few subtle gestures, you can reply without getting pulled away from the moment.”

Yet committing to a world in which devices are worn on our face means committing to a world in which we might always be at least a little distracted. We could use them to quietly read our emails or scroll Instagram at a restaurant without our partner knowing. We could check our messages during a meeting while looking like we’re still paying attention. We may not need to check our phones so much, because our phones will effectively be connected to our eyeballs. Smart glasses walk a thin line between helping us be less obsessively on the internet and tethering us even more closely to it.

I spent some time this spring talking with a number of people who worked on early smart glasses. One of them was Babak Parviz, a partner at Madrona, a venture-capital firm, who previously led Google’s Glass project. We discussed the history of computers: They used to be bulky things that lived in research settings—then we got laptops, then smartphones. With Glass, the team aimed to shorten the time needed to retrieve information to seconds. “The question is, how much further do you need to take that? Do you really need to be immersed in information all the time, and have access to much faster information?” Parviz told me he’d changed his mind about what he called “information snacking,” or getting fed small bits of information throughout the day. “I think constant interruption of our regular flow by reaching out to information sources doesn’t feel very healthy to me.”

In my conversations, I asked experts whether they thought smart glasses were inevitable—and what it would take to unseat the smartphone. Some saw glasses not as a smartphone replacement at all, but as a potential addition. In general, they thought that new hardware would have to give us the ability to do something we can’t do today. Right now, companies are hoping that AI will be the thing to unlock this potential. But as with so much of the broader conversation around that technology, it’s unclear how much of this hype will actually pan out.

These devices still feel more like sketches of what could be, rather than fully realized products. The Ray-Bans and other such products can be fun and occasionally useful, but they still stumble. And although we might be closer than ever to mainstream AR glasses, they still seem a long way off.

Maybe Zuckerberg is right that Orion is the world’s most advanced pair of glasses. The question is really whether his big vision for the future is what the rest of us actually want. Glasses could be awesome. They could also be just another distraction.
