Samsung is reportedly planning to unveil its first AI smart glasses at Galaxy Unpacked in London on July 22.
According to the Seoul Economic Daily, the Galaxy Glasses will make their debut alongside the Galaxy Z Fold 7 and Z Flip 7 – putting Samsung directly in Meta’s lane for the first time.
The reported hardware is straightforward: Android XR under the hood, Gemini as the built-in assistant, a 12-megapixel camera, speakers, and microphones.
All processing offloads to a paired Android phone. There is no display, no onboard compute, and no confirmed price.
Samsung has not officially announced the product – but Google confirmed at I/O 2025 that it is co-developing the Android XR platform with Samsung specifically to extend beyond headsets into glasses, so the direction of travel is not in doubt.
Meanwhile, Meta’s Ray-Ban smart glasses have sold in the millions and established the template for what an AI wearable looks like in 2025 – camera, microphones, speakers, and a voice assistant, all packaged into something people actually want to wear.
Samsung and Google are betting they can do it better. Android XR brings deeper integration with Google’s ecosystem than Meta AI can offer Ray-Ban users, and Gemini’s ability to take actions across apps – not just answer questions – is the functional argument for switching.
Whether that’s enough to pull users away from a product that already has genuine momentum is the question July will start to answer.
What the Device Actually Does – and What It Doesn’t
Based on Google’s I/O 2025 demonstrations, the practical picture is this: Galaxy Glasses give hands-free access to Gemini without pulling a phone from a pocket.
Google showed the platform handling navigation directions, message replies, appointment scheduling, real-time language translation, and photo capture – everything a phone can do, routed through ears and a camera.
The absence of a display shapes the experience in ways that matter for professional use.
Audio directions work, but a glanceable map in the field of view is faster. Live translation audio is useful, but subtitles the wearer can read while maintaining eye contact are a different thing entirely.
The limitation is simple: if a task requires seeing the answer rather than hearing it, the glasses cannot help. That rules out more workflows than it might first appear.
The device is phone-tethered by design. That keeps the glasses light and the cost accessible, but it also means the experience degrades with the phone’s signal quality, its distance from the glasses, and its battery charge.
For buyers assessing whether this category belongs in a device policy, that dependency is not a footnote.
The Privacy Question Is Not Theoretical
The more urgent conversation for regulated-sector buyers is not about display resolution or battery life – it’s about data.
Smart glasses are deliberately unobtrusive – that is the feature.
A phone held visibly signals to bystanders that recording may be in progress. Glasses provide no equivalent notice. A well-meaning employee using an AI translation feature in a client meeting, a healthcare consultation, or a legal review could inadvertently route sensitive material through an external AI system.
In organisations subject to GDPR, HIPAA, or financial services regulation, that is not a hypothetical risk – it is a compliance exposure.
The regulatory pressure on this category is already real.
Earlier this year, allegations emerged that Meta’s data-handling for its Ray-Ban AI glasses fell significantly short of its stated privacy assurances – including reports that human reviewers accessed footage from private spaces, with face-blurring said to have regularly failed.
The UK’s Information Commissioner’s Office wrote to Meta requesting details on UK data protection compliance. Italy’s Garante raised similar questions. The dispute has not been resolved.
Google has acknowledged the challenge. At I/O 2025, the company said it is testing Android XR prototypes specifically to ensure the product “respects privacy for you and those around you.”
The platform design spec includes an outward-facing LED to signal when recording is active.
That is a genuine concession to transparency, but it is not the same as a published data retention policy, a breakdown of on-device versus cloud processing, or documented enterprise controls.
Samsung has not disclosed any of those details for Galaxy Glasses. Those are the questions regulators – and the IT and legal teams in regulated sectors – will ask first, and the July launch date is approaching faster than most organisations’ procurement cycles.
What July 22 Actually Settles
If Samsung ships Galaxy Glasses this summer, the company will have established Android XR as a glasses platform and given buyers something concrete to evaluate.
What the launch cannot settle is whether the device belongs inside corporate environments. That determination depends on data architecture disclosures, management controls, and regulatory clarity that does not yet exist.