Hands-on with Orion, Meta’s first pair of AR glasses

They look almost like a normal pair of glasses. 

That’s the first thing I notice as I walk into a conference room at Meta’s headquarters in Menlo Park, California. The black Clark Kent-esque frames sitting on the table in front of me look unassuming, but they represent CEO Mark Zuckerberg’s multibillion-dollar bet on the computers that come after smartphones. 

They’re called Orion, and they’re Meta’s first pair of augmented reality glasses. The company was supposed to sell them but decided not to because they are too complicated and expensive to manufacture right now. It’s showing them to me anyway.

I can feel the nervousness of the employees in the room as I put the glasses on and their lenses light up in a swirl of blue. For years, Zuckerberg has been hyping up glasses that layer digital information over the real world, calling them the “holy grail” device that will one day replace smartphones.

Now, it’s time to find out if he’s onto something.

Orion is, at the most basic level, a fancy computer you wear on your face. The challenge with every face-computer has long been the display, which has tended to be heavy, hot, and low-resolution, or to offer only a narrow field of view.

Orion’s display is a step forward in this regard. It has been custom-designed by Meta and features Micro LED projectors inside the frame that beam graphics in front of your eyes via waveguides in the lenses. These lenses are made of silicon carbide, not plastic or glass. Meta picked silicon carbide for its durability, light weight, and ultrahigh index of refraction, which allows light beamed in from the projectors to fill more of your vision. 
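
To get a rough sense of why that ultrahigh index matters: a waveguide carries the image by bouncing light down the lens through total internal reflection, which only works for rays steeper than the material’s critical angle, and a higher index lowers that critical angle, leaving a wider band of internal angles free to carry the picture. The back-of-the-envelope sketch below uses generic textbook indices, not Meta’s figures, and the real field of view also depends on the in- and out-coupling gratings.

```python
import math

def critical_angle_deg(n: float) -> float:
    """Angle from the surface normal beyond which light stays trapped
    inside a medium of refractive index n (total internal reflection)."""
    return math.degrees(math.asin(1.0 / n))

# Generic illustrative indices -- not Meta's design numbers.
for name, n in [("optical glass", 1.5), ("high-index glass", 1.8), ("silicon carbide", 2.6)]:
    theta_c = critical_angle_deg(n)
    usable_band = 75.0 - theta_c   # up to an arbitrary ~75-degree grazing cutoff
    print(f"{name:16s} n={n}: critical angle {theta_c:4.1f} deg, usable band ~{usable_band:4.1f} deg")
```

The wider that internal band, the more of your visual field the projected image can cover, which is roughly why Meta put up with the cost and manufacturing pain of silicon carbide.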

Zuckerberg imagines that people will want to use AR glasses like Orion for two primary purposes: communicating with each other through digital information overlaid on the real world — which he calls “holograms” — and interacting with AI. 

“I had thought that the hologram part of this was going to be possible before AI,” he tells me. “It’s an interesting twist of fate that the AI part is actually possible before the holograms are really able to be mass-produced at an affordable price.”

Orion takes the generative AI capabilities that already exist in the Ray-Ban Meta smart glasses and adds a visual element over what you’re looking at. During a demo last week, I used Meta AI in Orion to identify ingredients laid out on a table to create a smoothie recipe. In a few seconds, it correctly placed labels over the ingredients and generated instructions for a recipe in a floating window above them. 
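
Meta hasn’t published Orion’s software interfaces, but the shape of that demo is easy to sketch: identify and label what the cameras see, hand the labels to a generative model, and float the result in a window. Everything below, names included, is a hypothetical stand-in rather than Meta’s actual code.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str                      # e.g. "banana"
    position: tuple[float, float]   # where to pin the label (normalized view coordinates)

def detect_ingredients(frame) -> list[Detection]:
    """Stand-in for the multimodal step that identifies items in view."""
    return [Detection("banana", (0.3, 0.6)), Detection("oat milk", (0.7, 0.55))]

def generate_recipe(labels: list[str]) -> str:
    """Stand-in for the generative step."""
    return "Smoothie: blend " + ", ".join(labels) + " with ice."

def run_demo(frame=None) -> None:
    detections = detect_ingredients(frame)
    for d in detections:
        # On the glasses this would be a label anchored over the real item;
        # here we just print where it would go.
        print(f"label '{d.label}' pinned at {d.position}")
    print("floating window:", generate_recipe([d.label for d in detections]))

run_demo()
```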

To demonstrate how two people wearing Orion together could interact with the same holograms, I played a 3D take on Pong with Zuckerberg. We scanned a QR code to pair our glasses and then used hand tracking to control the paddle. This worked surprisingly smoothly, and I noticed little to no lag in the game. 

Zuckerberg beat me, unfortunately.

I also used a version of the Messenger app built for the glasses to make what I was told was the first external video call from Orion to The Verge’s Nilay Patel on his iPhone. He couldn’t see me (Meta plans to eventually show an avatar that tracks the wearer’s facial movements), but I could see and hear him well in the 2D window floating in front of me. To illustrate how avatar chats will work one day, a Meta employee then called me and appeared across the room as a cartoonish full-body avatar.

Here’s where I’ll note that my demo was on guardrails. There were computers to the side of the room that could trigger certain experiences for me. Employees in the room mostly guided me through what they wanted me to try, though I did manage to deviate from the instructions to make sure that what I was seeing wasn’t totally simulated. 

Orion isn’t a mirage. It’s also not a product. It’s somewhere in between.

The hardware for Orion exists in three parts: the glasses themselves; a “neural wristband” for controlling them; and a wireless compute puck that resembles a large battery pack for a phone. The glasses don’t need a phone or laptop to work, but if they’re separated from the puck by more than 12 feet or so, they become useless.

Orion boasts a 70-degree field of view, which is wider than any pair of AR glasses I’ve tried to date. In my experience, a narrower field of view causes AR to feel small and less immersive, like you’re looking through a peephole. With Orion, I had to get pretty close to virtual objects before their edges started to disappear. 
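
A quick bit of illustrative arithmetic (my numbers, not Meta’s): a virtual screen about 1.5 meters wide viewed from 2 meters away subtends roughly 41 degrees, so it fits inside a 70-degree field of view with room to spare but would spill past the edges of a display half that wide.

```python
import math

def angular_width_deg(width_m: float, distance_m: float) -> float:
    """Angle a flat object of the given width subtends at the given distance."""
    return math.degrees(2 * math.atan((width_m / 2) / distance_m))

print(round(angular_width_deg(1.5, 2.0), 1))   # ~41.1 degrees
```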

At 98 grams, the glasses weigh significantly more than a normal pair but also far less than mixed reality headsets like the Meta Quest or Apple’s Vision Pro. The frames are made of magnesium, which is lighter than aluminum and helps distribute heat evenly.

Seven cameras embedded in the frames are used to anchor virtual objects in real space, assist with eye and hand tracking, and allow Meta’s AI assistant to understand what you’re looking at. You can leave a virtual window open, turn your head and walk away, and as long as the glasses stay on, it’ll still be there when you come back. 
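
That persistence is the classic “world-locked” trick: the window’s position is stored in a fixed world coordinate frame that the cameras’ tracking maintains, and only its position relative to your head changes as you move. Here’s a minimal sketch of the idea, assuming the tracker supplies a head rotation R and position t each frame; none of this is Meta’s code.

```python
import numpy as np

def world_to_view(point_world: np.ndarray, R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Express a fixed world-frame point in the wearer's current head frame."""
    return R.T @ (point_world - t)

window_world = np.array([0.0, 0.0, 1.5])   # window left floating 1.5 m ahead of the origin

# Pose A: standing at the origin, looking down the +z axis.
print(world_to_view(window_world, np.eye(3), np.zeros(3)))      # [0, 0, 1.5]

# Pose B: the wearer has walked 4 m past the window and turned around.
R_b = np.array([[-1.0, 0.0, 0.0],
                [ 0.0, 1.0, 0.0],
                [ 0.0, 0.0, -1.0]])   # 180-degree rotation about y
t_b = np.array([0.0, 0.0, 4.0])
print(world_to_view(window_world, R_b, t_b))                    # [0, 0, 2.5]

# The window's world-frame position never changed; only its coordinates in the
# head frame did, which is why it's still there when you walk back to it.
```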

The quality of Orion’s display is surprisingly good given the form factor. Video calls look crisp enough to feel engaging, and I had no problem reading text on a webpage that was several feet away. However, I wouldn’t want to watch Avatar in them — I probably couldn’t finish it anyway since the battery only lasts about two hours.

You control the glasses through a combination of eye tracking, hand tracking, voice, and the neural wristband, which loosely resembles a Fitbit without a screen. It’s made of a high-performance textile material and uses electromyography (EMG) to interpret neural signals associated with hand gestures. In milliseconds, those signals are translated into input. It’s not reading your thoughts, but it kind of feels like it.

The wristband recognizes a few gestures: pinching your index finger with your thumb selects things; pinching your middle finger and thumb invokes or hides the app launcher; and performing a coin-flipping gesture with your thumb against your closed palm allows you to scroll up or down. Haptic feedback in the band lets you know when a gesture is recognized, which is a nice touch.
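
The dispatch side of that interaction is easy to picture, even if decoding EMG signals into a gesture within milliseconds is the genuinely hard, proprietary part. Below is a toy sketch of the mapping described above, with every name invented for illustration.

```python
from enum import Enum, auto

class Gesture(Enum):
    INDEX_PINCH = auto()    # index finger + thumb: select
    MIDDLE_PINCH = auto()   # middle finger + thumb: show/hide the app launcher
    THUMB_FLICK = auto()    # coin-flip motion of the thumb: scroll

class Band:
    def pulse(self) -> None:
        print("(haptic tick)")                  # confirms a gesture was recognized

class UI:
    def select_focused_item(self) -> None: print("select")
    def toggle_app_launcher(self) -> None: print("toggle launcher")
    def scroll(self, steps: int) -> None: print(f"scroll {steps}")

def handle(gesture: Gesture, ui: UI, band: Band) -> None:
    band.pulse()                                # haptic feedback on every recognized gesture
    if gesture is Gesture.INDEX_PINCH:
        ui.select_focused_item()
    elif gesture is Gesture.MIDDLE_PINCH:
        ui.toggle_app_launcher()
    elif gesture is Gesture.THUMB_FLICK:
        ui.scroll(steps=1)

handle(Gesture.INDEX_PINCH, UI(), Band())       # -> haptic tick, then "select"
```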

Your eyes act as the pointer in Orion’s interface, while pinching your fingers together acts as the click. Altogether, using Orion felt more precise than controlling a Quest or Vision Pro with my hands. I breezed through a Space Invaders-like game in which tilting my head moved the ship and pinching my fingers together fired its lasers. And because the band doesn’t have to be visible to the sensors and cameras on the glasses, I was able to control them with my hand behind my back or in my jacket pocket.
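
Under the hood, “eyes as the pointer” amounts to casting a ray along your tracked gaze, hit-testing it against the interface, and letting the pinch click whatever the ray is resting on. A minimal, self-contained sketch follows, with panels modeled as spheres purely to keep the hit test short; it is an illustration, not Orion’s actual UI code.

```python
import numpy as np

def gaze_hit(origin, direction, panels):
    """Return the name of the nearest panel whose bounding sphere the gaze ray hits."""
    best_name, best_t = None, np.inf
    for center, radius, name in panels:
        oc = np.asarray(center, dtype=float) - np.asarray(origin, dtype=float)
        t = float(np.dot(oc, direction))                   # closest approach along the ray
        miss = np.linalg.norm(oc - t * np.asarray(direction))
        if t > 0 and miss < radius and t < best_t:
            best_name, best_t = name, t
    return best_name

panels = [((0.0, 0.0, 2.0), 0.3, "Messenger"),
          ((0.8, 0.1, 2.0), 0.3, "Browser")]
focused = gaze_hit((0.0, 0.0, 0.0), np.array([0.0, 0.0, 1.0]), panels)
print("pinch would click:", focused)    # -> Messenger
```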

The neural wristband feels more polished than Orion itself, which is likely because Meta will start selling it soon. While the company won’t comment, my sources say that Meta is planning to ship a pair of glasses, codenamed Hypernova, with a smaller heads-up display that the wristband will also work with, as soon as next year.

Orion was supposed to be a product you could buy. When the glasses graduated from a skunkworks project in Meta’s research division back in 2018, the goal was to start shipping them in the low tens of thousands by now. But in 2022, amid a phase of broader belt-tightening across the company, Zuckerberg made the call to shelve its release. 

That decision is evident in the parts of Orion’s hardware that Meta isn’t using, from the front-facing cameras that can capture video but don’t, to the disabled GPS in the compute puck. There’s a built-in modem for cellular data that isn’t active. “We’re saving $20 a month,” quips Alex Himel, Meta’s VP of wearables.

As Meta’s executives retell it, the decision to shelve Orion mostly came down to the device’s astronomical cost to build, which is in the ballpark of $10,000 per unit. Most of that cost is due to how difficult and expensive it is to reliably manufacture the silicon carbide lenses. When it started designing Orion, Meta expected the material to become more commonly used across the industry and therefore cheaper, but that didn’t happen. 

“You can’t imagine how horrible the yields are,” says Meta CTO Andrew Bosworth of the lenses. Instead, the company pivoted to making about 1,000 pairs of the Orion glasses for internal development and external demos. 

“It’s probably turned out significantly better than our 50-50 estimates of what it would be, but we didn’t get there on everything that we wanted to,” Zuckerberg says of the device. “We still want it to be a little smaller, a little brighter, a little bit higher resolution, and a lot more affordable before we put it out there as a product. And look, we have a line of sight to all those things.”

Now, he says Meta expects what was originally going to be the second generation of Orion to be the first consumer version in a few years. Executives are cagey on cost but tell me to expect a price tag that is comparable to the phones and laptops of today.

“It was a science problem to get to a large field of view,” Rahul Prasad, the product lead for Orion, tells me. “The next phase is an engineering problem to get to higher resolution, higher brightness, and lower cost.”

The lenses won’t be made of silicon carbide, and the field of view won’t be quite as large, even though the resolution will be higher. And if everything goes according to plan, the frames will be about half as thick as the Orion glasses are now. 

“I think we aspire to build things that look really good,” Zuckerberg says. He recognizes that for AR glasses to work at scale, they need to function just as well when the AR is turned off. “It needs to be good in order for you to want to keep it on your face.”

As Meta sees it, the march to full-fledged AR glasses will be gradual, not sudden. On one end of the spectrum, there will be AI-powered smart glasses without displays, like the Ray-Ban Meta. Then, there will be glasses with small displays, like the forthcoming Hypernova, that provide lighter-touch interactions, like interacting with Meta AI and texting with friends. Orion represents the end state: full-fledged AR glasses with enough computing power to actually leave your smartphone at home.

Zuckerberg has been trying to get past relying on smartphones for a long time. “A lot of it goes all the way back to our relationship with mobile platforms,” he says. “Mobile phones and smartphones got started around the same time as Facebook and early social media, so we didn’t really get to play any role in that platform transition.”

Practically, that has meant living under the thumb of Google and Apple, which, together, control the mobile app stores that people access Meta’s apps through. Apple, in particular, has been a thorn in Zuckerberg’s side for a long time. It has temporarily disabled Meta’s internal apps and issued policy changes like the App Tracking Transparency prompt, which tanked Meta’s ads business for a time.

Now, Apple is competing in headsets with the Vision Pro and working on its own AR glasses. Snap is racing to win developers with its next-generation AR Spectacles. Google recently tried to muscle into Meta’s partnership with Ray-Ban’s parent company, EssilorLuxottica, and is working with Magic Leap and Samsung to develop headsets.

At the same time, Meta has yet to prove itself as a mainstream hardware company. The Quest headset has found early traction with gamers but has struggled to break out as the general-purpose computing device the company originally hoped it would be. Headlines still focus on how many billions it spends a year on Reality Labs, the division that makes the Quest, the Ray-Ban Meta smart glasses, and its AR glasses.

It’s clear that perception — that Meta is spending a lot of money with little to show for it — has played into the company’s decision to show off Orion now. “We’re not just making this shit up here,” says Bosworth. “We’re not burning cash. The investments we’re making are a real, tangible technology.”

After years of talking about the AR glasses it’s going to build, Meta does finally have something tangible. It’s an impressive demo. Now comes the hard part.
