Google Unveils Photorealistic ‘Likeness’ Avatars for Android XR

Google is rolling out photorealistic “Likeness” avatars on Android XR, giving users a new way to appear in video calls – and putting Apple’s Vision Pro Personas firmly in its sights.


Published: December 15, 2025

Christopher Carey

Google has started rolling out a new photorealistic avatar system – Likeness – for Android XR headsets.

The feature is designed to make video calls feel more natural by replacing webcams with realistic digital versions of users.

The rollout begins in beta and follows earlier previews shown during Google’s Android XR announcements.

“This [feature] allows others to see you authentically while you’re using your Android XR headset for video calls, making your interactions feel natural and personal,” said Shahram Izadi, VP of Android XR at Google.

While Apple currently leads the market with Vision Pro, Google is clearly positioning Android XR as a serious alternative. Likeness plays a key role in that strategy by focusing on communication, one of the most common use cases for extended reality.

The idea is simple: make remote conversations feel more human without forcing people to change how they already communicate.

How it works

Likeness is essentially a photorealistic avatar created by scanning a user’s face. The scan is then animated using sensor data from an Android XR headset.

As a result, the avatar mirrors facial expressions, head movement, and subtle gestures in real time.
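To make that flow concrete, the sketch below shows the general pattern of driving an avatar from per-frame face-tracking data: blendshape weights and head rotation are smoothed and retargeted onto the avatar each frame. Every type and name here is hypothetical and does not come from Google’s Android XR SDK; it only illustrates the sensor-to-avatar pipeline described above.

```kotlin
// Hypothetical illustration of sensor-driven avatar animation.
// None of these types are part of the Android XR SDK; they sketch
// the general blendshape-retargeting pattern described in the article.

data class FaceTrackingFrame(
    val headYaw: Float,                  // radians, from headset sensors
    val headPitch: Float,
    val blendshapes: Map<String, Float>  // e.g. "jawOpen" to a 0.0..1.0 weight
)

class AvatarRig {
    private val pose = mutableMapOf<String, Float>()

    // Retarget each tracked blendshape weight onto the avatar,
    // smoothing between frames so expressions do not jitter.
    fun applyFrame(frame: FaceTrackingFrame, smoothing: Float = 0.3f) {
        frame.blendshapes.forEach { (name, weight) ->
            val previous = pose[name] ?: 0f
            pose[name] = previous + (weight - previous) * smoothing
        }
        pose["headYaw"] = frame.headYaw
        pose["headPitch"] = frame.headPitch
    }

    fun currentPose(): Map<String, Float> = pose.toMap()
}

fun main() {
    val rig = AvatarRig()
    // One simulated tracking frame: mouth slightly open, small smile.
    rig.applyFrame(
        FaceTrackingFrame(
            headYaw = 0.05f,
            headPitch = -0.02f,
            blendshapes = mapOf("jawOpen" to 0.2f, "mouthSmileLeft" to 0.6f)
        )
    )
    println(rig.currentPose())
}
```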

The avatar replaces a standard webcam feed during video calls. Other participants see a lifelike digital face instead of a live camera image.

This allows users to maintain eye contact and express emotion without showing their real environment.

Importantly, Likeness is not a cartoon or stylized avatar; Google is clearly aiming for realism.

Skin texture, lighting, and facial proportions are designed to closely resemble the real person. Early demonstrations suggest the avatars look convincing and avoid exaggerated expressions.

For now, Likeness appears as a two-dimensional image in calls. This makes it compatible with popular services like Google Meet, Zoom, and Messenger without any special integration.
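The “virtual webcam” idea behind that compatibility can be sketched in a few lines: the animated avatar is rendered into ordinary 2D video frames, which are handed to the calling app exactly where camera frames would normally go. The frame type and sink interface below are invented for illustration only and do not correspond to any real Meet, Zoom, or Android XR API.

```kotlin
// Hypothetical sketch of the "virtual webcam" pattern: the avatar is
// rendered into flat 2D frames so any app that accepts a camera feed
// can consume it unchanged. All types here are invented for illustration.

data class VideoFrame(val width: Int, val height: Int, val rgba: ByteArray)

interface CameraFrameSink {          // stands in for wherever a webcam frame would go
    fun submit(frame: VideoFrame)
}

class AvatarVirtualCamera(
    private val sink: CameraFrameSink,
    private val width: Int = 1280,
    private val height: Int = 720
) {
    // Render the avatar's current pose into a flat frame and hand it
    // to the call, in place of a live camera image.
    fun pushFrame(renderAvatar: (Int, Int) -> ByteArray) {
        val pixels = renderAvatar(width, height)
        sink.submit(VideoFrame(width, height, pixels))
    }
}

fun main() {
    val sink = object : CameraFrameSink {
        override fun submit(frame: VideoFrame) =
            println("Submitted ${frame.width}x${frame.height} frame (${frame.rgba.size} bytes)")
    }
    val camera = AvatarVirtualCamera(sink)
    // Placeholder renderer: a blank RGBA buffer instead of a real avatar render.
    camera.pushFrame { w, h -> ByteArray(w * h * 4) }
}
```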

Creating a Likeness avatar using a phone

Unlike Apple’s approach, Google does not use the headset to scan a user’s face.

Instead, the company has released a Likeness beta app for Android phones.

Users hold their phone in front of their face and follow guided instructions to complete the scan.

This method has clear advantages. A phone is lighter and easier to handle than a headset. It also makes the setup process faster and less awkward. For many users, this will feel more familiar and comfortable.

However, the phone-based approach also introduces limitations. The Likeness scanning app only works on a small number of devices. Supported phones include the Pixel 8 series, the Galaxy S23 lineup, and the Galaxy Z Fold5.

By comparison, Apple allows any Vision Pro user to create a Persona directly on the headset. That difference could affect adoption, especially in mixed-device workplaces.

No spatial meetings yet, but wide app support

While Likeness closely resembles Apple’s Personas, it lacks one major feature – there are no spatial meetings yet. Users cannot meet as fully three-dimensional avatars inside shared virtual spaces.

Apple’s Vision Pro already supports spatial FaceTime calls. These place Persona avatars in a shared environment, creating a stronger sense of presence.

Google has confirmed that spatial meetings are planned, but the company has not shared a release timeline. For now, Likeness remains limited to flat video call presentations.

That said, Google’s choice appears intentional.

By focusing on virtual webcam compatibility first, Likeness works immediately with existing tools. Users do not need to convince others to adopt new platforms or workflows.

This approach also enables cross-platform communication. Likeness users can join calls with standard webcam users or even Apple Persona users. The experience may be less immersive, but it is far more practical.

Hardware limits across Android XR devices

Not every Android XR device will support Likeness, though.

Photorealistic avatars require powerful processors and multiple sensors. These include cameras for tracking eyes, mouth movement, and facial expressions.

Many upcoming Android XR smart glasses do not meet these requirements – they are designed to be lightweight and discreet.

As a result, they lack the hardware needed to animate realistic faces.

Some full Android XR headsets may also struggle.

Devices without eye- and mouth-tracking cameras cannot fully animate a Likeness avatar. While simulated facial motion is possible, it often looks unnatural when applied to realistic faces.
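One plausible way to handle this, sketched below with entirely hypothetical types, is to gate the avatar mode on the tracking hardware a device actually reports, falling back to simulated expressions or a static image when eye- and mouth-tracking cameras are missing.

```kotlin
// Hypothetical capability check: choose an avatar mode based on the
// tracking hardware a device has. The DeviceSensors type and mode names
// are invented for illustration and are not part of any Google API.

data class DeviceSensors(
    val hasEyeTracking: Boolean,
    val hasMouthTracking: Boolean,
    val hasFaceCameras: Boolean
)

enum class AvatarMode { PHOTOREALISTIC, SIMULATED_EXPRESSIONS, STATIC_IMAGE }

fun selectAvatarMode(sensors: DeviceSensors): AvatarMode = when {
    sensors.hasEyeTracking && sensors.hasMouthTracking && sensors.hasFaceCameras ->
        AvatarMode.PHOTOREALISTIC          // full sensor suite: animate everything live
    sensors.hasFaceCameras ->
        AvatarMode.SIMULATED_EXPRESSIONS   // partial tracking: approximate the rest
    else ->
        AvatarMode.STATIC_IMAGE            // lightweight glasses: no live animation
}

fun main() {
    println(selectAvatarMode(DeviceSensors(hasEyeTracking = false, hasMouthTracking = false, hasFaceCameras = false)))
    println(selectAvatarMode(DeviceSensors(hasEyeTracking = true, hasMouthTracking = true, hasFaceCameras = true)))
}
```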

This highlights a larger challenge for the XR industry. As headsets become smaller, there is less room for sensors. Maintaining high avatar fidelity while reducing hardware size will be difficult.

Despite these challenges, Likeness represents an important step for Android XR. Photorealistic avatars offer a middle ground between webcams and full virtual worlds. They allow users to appear present without revealing their surroundings.

For remote work and long video calls, that balance could be appealing. It also shows that Google views avatars as a core communication feature, not a novelty.

While Apple may currently lead in immersion, Google is betting on accessibility, compatibility, and scale to close the gap.
