Samsung Teases AI-Powered Smart Glasses as Next Big Device Category

Samsung has revealed fresh details about its planned smart glasses, including an eye-level camera designed to feed real-time visual data into AI services running on a connected smartphone.


Published: March 9, 2026

Christopher Carey

Samsung has unveiled new details about its forthcoming smart glasses, including an eye-level camera designed to capture what users are looking at and feed that information to AI services running on a connected smartphone.

The comments, made by the company’s Executive Vice President Jay Kim during an interview with CNBC, offer the clearest picture yet of how the tech giant plans to compete with devices like the Ray-Ban Meta smart glasses.

Kim said the glasses will rely on a paired smartphone to process visual data captured by the device’s camera.

The system is designed to allow artificial intelligence services to analyse what a user is looking at and provide contextual information in real time.

“The important thing is AI should understand where you’re looking at,” Kim said. “Then it feeds the information to the mobile phone and processes it to give you useful information.”

The approach reflects a growing industry focus on integrating AI into wearable devices, with manufacturers exploring new ways for users to interact with assistants beyond traditional smartphone apps.

Samsung has been working with Qualcomm and Google since 2023 to develop hardware and software for extended reality devices, combining augmented and virtual reality capabilities with AI-driven services.

The partnership previously produced the Samsung Galaxy XR headset, which runs on Android XR – Google’s operating system for virtual, augmented and mixed reality devices.

The Push Toward Wearable AI

While mixed reality headsets have generated significant attention in recent years, many technology companies believe smart glasses could ultimately prove more commercially viable because they fit naturally into everyday life.

Headsets tend to be bulkier and designed for specific use cases such as gaming or immersive collaboration. By contrast, glasses are lightweight and socially familiar, making them easier to integrate into daily routines.

Kim acknowledged this difference, suggesting that XR headsets may remain part of the ecosystem but are unlikely to reach the same scale as more subtle wearable devices.

“I think XR on headset will sort of be around, but not as a sort of mass scale business,” he said.

Instead, many companies now see smart glasses as a potential successor – or at least a companion – to the smartphone.

The idea is that rather than pulling out a phone and opening an app, users could simply speak to an AI assistant or rely on contextual cues captured by the glasses’ camera. The AI could then provide directions, identify objects, translate text or surface relevant information in real time.

Competition Intensifies

Samsung will enter a market currently dominated by Meta Platforms.

Its Ray-Ban Meta smart glasses hold roughly 82 percent of the global smart glasses market, according to Counterpoint Research.

Meta’s glasses, developed in partnership with eyewear brand Ray-Ban, allow users to take photos, record video, listen to audio and interact with the company’s AI assistant through voice commands.

However, a growing number of companies are attempting to challenge Meta’s early lead.

Chinese tech giant Alibaba and AR specialist Xreal are both developing smart eyewear products, reflecting increasing interest in the category.

For Samsung, the advantage may lie in its broader hardware ecosystem, which spans smartphones, wearables and displays.

By connecting smart glasses directly to its Galaxy smartphones, the company could leverage existing processing power and AI capabilities without needing to fit all the computing hardware into the glasses themselves.

This approach could also help reduce weight and extend battery life – two factors that have historically limited wearable devices.

AI Agents and the Next Computing Platform

Industry executives increasingly believe the rise of advanced AI models could accelerate the adoption of smart glasses.

Technologies such as Google Gemini and ChatGPT are already enabling more sophisticated AI assistants capable of understanding context and performing tasks on behalf of users.

Despite the optimism, however, the technology behind smart glasses still faces several hurdles.

Battery life, device weight and heat management remain technical challenges, particularly when integrating cameras, microphones, speakers and wireless connectivity into a lightweight frame.

Privacy concerns are also likely to remain a topic of debate. Devices equipped with always-available cameras could raise questions about surveillance and consent, particularly in public spaces.

Manufacturers have attempted to address these concerns by including visible recording indicators and limiting certain features, but public perception will still play a role in determining how widely the technology is accepted.

Another key factor will be the development of a robust application ecosystem.

Launch Timeline

Samsung has not yet confirmed whether its smart glasses will include a built-in display.

When asked about the possibility, Kim suggested users may instead rely on other devices such as smartphones or smartwatches when a screen is required.

The company is aiming to have a product ready for industry partners this year. If the technology gains traction, smart glasses could represent the next stage in personal computing – bringing AI assistants closer to users and enabling a more seamless connection between the digital and physical worlds.

Much like the first generation of smartphones, the success of smart glasses may ultimately depend less on the hardware itself and more on the software and AI experiences that evolve around it.
