Google wants Android XR to power your next VR headset and smart glasses


The Android operating system runs on billions of devices worldwide. Most of them are phones, but the OS also runs on tablets, smartwatches, televisions, cars, and plenty of IoT products. Officially, though, Google only supports Android on phones, tablets, watches, TVs, and cars. Today, the company is expanding the OS to support a new category of devices: extended reality (XR) devices. Google has announced Android XR, a new platform dedicated to VR headsets and AR smart glasses.





What is Android XR?

Android XR is a new version of the Android operating system that was built from the ground up for XR devices like VR headsets and AR smart glasses. It’s based on AOSP, the open-source foundation of all Android devices, but it’s been heavily customized to support XR experiences.

The logo for Android XR


Google hasn’t created a totally new flavor of Android in years; the last time it did so was back in 2017 with Android Automotive, well before the rise of generative AI. Unlike other flavors of Android, Android XR was developed with Google Gemini at its core. (Gemini, if you aren’t aware, is the name of Google’s AI chatbot and large language model family.) In fact, Google says that Android XR is the “first Android platform built for the Gemini AI era.”

The Android maker hasn’t been alone in developing Android XR, though. It partnered closely with Qualcomm, Samsung, and others to bring Android XR to life. Samsung is developing the first hardware to run Android XR, which is set to debut sometime in 2025. Qualcomm, meanwhile, is creating the chipsets that’ll power these devices.

The Samsung and Google XR project announcement. Source: Samsung


Qualcomm and Samsung are just the first of many companies to work on Android XR hardware, though. Google envisions Android XR as a single unifying platform for XR scenarios ranging from VR headsets for gaming and productivity to smart glasses for lifestyle and healthcare, and companies like Lynx, Sony, and XREAL are already working on their own Android XR devices. Google hasn’t made a “full determination” on whether it’ll publicly provide the source code for Android XR, however, so it remains to be seen whether enterprising startups will be able to build hardware for it without entering into a partnership with Google.

Why is Google building Android XR?

Long-time readers are probably aware that Google isn’t new to XR, with the company having started and discontinued the AR-focused Google Glass project and the VR-focused Daydream VR platform. Google believes its original vision for XR was “correct” but that the technology simply wasn’t ready at the time. After discontinuing both projects, the company still held on to its XR ambitions, pivoting to phone-based AR initiatives like ARCore.


Recent breakthroughs in AI have convinced Google that it can finally make XR take off. Interacting with AI chatbots in a multimodal manner, i.e., not only through speech but also through vision, is going to be the “killer app” of this era, Google argues, but it’s simply too awkward to do so through a smartphone right now. The company believes that VR headsets and AR smart glasses are a much more natural form factor for these kinds of interactions, which is something we can get behind after seeing the Project Astra demos from earlier this year.

A demo of Project Astra on an Android phone. Source: Google.


This is why Google believes now is the right time to launch Android XR. The company argues that its “unique” position in AI puts it in a strong spot to launch the platform: Google has a “full stack” of technology it can take advantage of to bring AI to XR, ranging from cutting-edge AI models in the cloud to on-device AI models to an ecosystem of developers it can reach.
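Google hasn’t explained how that stack decides between its cloud and on-device models, but as a rough mental model, hybrid setups like this often route requests based on complexity. Here’s a toy sketch of that idea; every class and function name below is invented for illustration and is not part of any Google API:

```python
# Toy sketch of hybrid on-device/cloud AI routing. Simple text-only
# queries stay local (lower latency, better privacy); multimodal or
# long queries go to a larger cloud model. All names are hypothetical.

from dataclasses import dataclass

@dataclass
class Request:
    text: str
    has_image: bool = False  # is multimodal input attached?

def run_on_device(req: Request) -> str:
    # Stand-in for a small local model.
    return f"on-device answer to: {req.text}"

def run_in_cloud(req: Request) -> str:
    # Stand-in for a large cloud-hosted model.
    return f"cloud answer to: {req.text}"

def route(req: Request) -> str:
    # Heuristic: offload multimodal or long queries, keep the rest local.
    if req.has_image or len(req.text) > 200:
        return run_in_cloud(req)
    return run_on_device(req)
```

The exact routing policy in a real product would weigh latency, battery, and connectivity, but the basic shape — a cheap local path and an expensive remote path behind one entry point — is the common pattern.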


What can you do on an Android XR headset?

Getting developers on board with Android XR isn’t going to happen if there aren’t consumers to sell apps to, and getting consumers to buy Android XR headsets is only going to happen if there are killer apps and experiences ready for them at launch. Google today offered a sneak peek at some of the experiences we can expect from Android XR headsets. This includes demos of how the OS enables a customizable, boundless immersive viewing experience that can be controlled through natural, multimodal AI interactions.



For example, Android XR allows apps like Google Photos, Google TV, and YouTube to be shown in windows floating above objects in the real world. These windows can be moved around, dragged and dropped, and minimized or closed using hand gestures. Every app in Android XR has a header bar, and sometimes a bottom bar, with buttons that can be controlled either through hand gestures or conversationally through Gemini. Gemini in Android XR can interact with and even control your apps, as the platform allows Gemini to “see what you see, hear what you hear, and react to your gestures alongside your voice.”


In the following demo, we see how Google Photos is being optimized for Android XR. The app is shown in its familiar tablet UI, but when the “Immersive” button at the bottom is pressed, the photo is shown without any borders. Tapping another button opens a carousel of photos and videos that you can seamlessly move through.

In the second demo, we see the immersive UI that Google has built for its Google TV app. Movies and TV shows are shown on large, expansive cards with high-resolution thumbnails. Trailers are shown in big, floating, borderless windows that can be put inside a virtual theater room for an even more immersive experience.


Next, Android XR devices have access to the full catalog of 180° and 360° content available through the YouTube app, as shown in this demo. You can even ask questions about the video and get answers back thanks to YouTube’s integration with Gemini.

Google Maps and Chrome will also support Android XR, with the former letting you view cities and landmarks in virtual space using the app’s Immersive View feature and the latter letting you browse the web on multiple virtual screens.

You’ll even be able to use Google’s Circle to Search feature on Android XR. You can use Circle to Search to select text and images in your view, look them up on Google, and then place 3D objects in your environment.



These are just a few of the many experiences that’ll be available for Android XR devices, according to Google. Many more will hopefully be available as third-party developers get their hands on prototype hardware and start tinkering with the new Android XR SDK to create apps. We probably won’t see the full extent of what’ll be possible until actual hardware hits the shelves, which is thankfully going to happen sometime next year.

What will the first device to run Android XR be?

The first device to run Android XR will be Samsung’s VR headset code-named Project Moohan. It’ll go on sale sometime next year at an unspecified price. Details are sparse about the headset, but you can read more about it here.

When will smart glasses running Android XR arrive?

Even if Samsung does end up showing off its smart glasses next month, the product won’t ship until after the “Moohan” VR headset comes out. That’s because Samsung and Google have strategically decided to focus on VR headsets before releasing smart glasses.


Both companies believe that VR headsets are the “most suitable form factor” to start building out a core XR ecosystem, since they offer a higher level of immersion and higher-resolution displays than smart glasses. They also offer eye, head, and hand tracking and can seamlessly transition between mixed and virtual reality. In contrast, XR smart glasses are more limited in terms of immersion and input options, as they’re much smaller and thus can’t pack as much hardware.

Smart glasses are designed to be small and lightweight enough to be worn every day.


This is why smart glasses running Android XR are being developed with different experiences in mind. Google envisions smart glasses as more of a lifestyle product rather than a gaming or productivity one. You’re expected to wear smart glasses out in public just like regular glasses, and you’re expected to converse with them while moving about. Just like with VR headsets, AR smart glasses will also support voice controls through Gemini AI, but voice interactions are even more important for smart glasses than they are for VR headsets.

For example, you’ll be able to ask Gemini to summarize the content of group chats in Google Messages, send messages to contacts, look up information on nearby stores and restaurants through Google Maps, and get turn-by-turn navigation directions to a location.

You’ll also be able to point your smart glasses at a sign and ask Gemini to translate it, ask Gemini questions about the content of the sign, and get real-time translations during conversations.


Lastly, you can ask Gemini some general questions like you would on your phone. It’ll have access to whatever you’re seeing for context and can even remember things that it saw previously.


Because smart glasses have significantly smaller batteries and weaker processors than VR headsets, much of the computing behind these features actually happens on your phone rather than on the glasses themselves. Android XR on smart glasses takes advantage of what Google calls a “split-compute configuration,” in which the glasses stream sensor data to your smartphone, which does the heavy processing and streams rendered pixel data back to the glasses. This allows smart glasses to be built without bulky hardware. Rumors suggest Samsung’s XR smart glasses could weigh just 50 grams, which is very close to the highly lauded Ray-Ban Meta smart glasses.
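Google hasn’t detailed the protocol, but the division of labor in a split-compute design can be sketched roughly like this. This is a toy illustration only; every function name here is invented and none of it reflects Android XR’s actual internals:

```python
# Toy sketch of split compute: the glasses do cheap sensor capture and
# display, while the paired phone handles perception and rendering.
# All names are hypothetical.

def capture_sensor_frame(t: int) -> dict:
    # On the glasses: lightweight capture only (camera frame, IMU reading).
    return {"timestamp": t, "camera": f"jpeg[{t}]", "imu": (0.0, 9.8, 0.0)}

def process_on_phone(frame: dict) -> bytes:
    # On the phone: the expensive work (perception, AI, rendering an overlay).
    overlay = f"overlay for {frame['camera']} @ {frame['timestamp']}"
    return overlay.encode()

def glasses_main_loop(num_frames: int) -> list[bytes]:
    # The glasses just shuttle data and display the result, which is
    # what lets them stay small, light, and battery-friendly.
    rendered = []
    for t in range(num_frames):
        frame = capture_sensor_frame(t)   # glasses -> phone
        pixels = process_on_phone(frame)  # heavy lifting on the phone
        rendered.append(pixels)           # phone -> glasses display
    return rendered
```

In a real system the two halves would run on separate devices over a wireless link, with latency and compression as the hard problems; the sketch only shows where the work lives.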

Smart glasses running Android XR are coming soon, and they’ll be available in a variety of styles. Google will soon begin real-world testing of its Project Astra service on prototype glasses running Android XR, and it’s inviting a small number of users to sign up to take part in this testing.



We don’t know when Samsung’s smart glasses will launch or whether they’ll even have an in-lens display, but Google’s announcement strongly suggests the initial batch of Android XR smart glasses will have displays, as shown in the demo videos that Google shared with us. There’s a future where Android XR will run on display-less glasses, though for now, Google believes that displays are important to the form factor as they allow for richer content and more capabilities to be offered on the output side.


The unveiling of Android XR is a big moment for Google. It represents a return to a vision that many people once derided, even as some saw its potential. In hindsight, Google did a lot of things right with Glass; it was simply ahead of its time. Whether Google will succeed this time in bringing XR to the masses will largely depend on how successful it is in convincing developers to create immersive games and apps for the new OS.
