At TED 2025, Google Put XR on Your Face — And It Remembers Everything You Don’t
Google’s Mind-Blowing AR & VR Demos Signal a New Tech Era
At TED 2025 in Vancouver, Google gave us a sneak peek into the future — and it’s looking straight at us through a new pair of augmented reality glasses. Quietly stylish and seemingly ordinary at first glance, these prototype AR glasses pack some seriously futuristic tech, powered by Google’s Gemini AI.
And no, this isn’t some concept in a lab. It’s real. It’s working. And it’s here.
About the TED Event: The Stage Where the Future Walked In
This year’s TED wasn’t just another lineup of bright minds and big ideas — it was a turning point for how we’ll experience the world.
Google stole the show with a jaw-dropping demonstration of its next-generation wearable AI, giving us a glimpse into the era of ambient computing where your tech doesn’t sit in your pocket — it lives on your face.
The event was packed with energy, anticipation, and a whole lot of “Did that just happen?” moments. When Google stepped on stage, they weren’t just talking about the future.
They were wearing it.
Product Showcase: Google XR Glasses Demo
Eyes That Remember
During the live demo, Nishtha Bhatia, one of the product leads at Google, walked on stage wearing what looked like an ordinary pair of glasses. But the moment she interacted with them, the entire room leaned in.
She looked at a cluttered shelf and casually asked,
“Hey Gemini, do you remember the white book I saw on the shelf?”
Gemini didn’t miss a beat. It had seen it — and remembered the name instantly.
Then came the real kicker.
“I think I lost my hotel key.”
The glasses responded by reminding her exactly where she last left it.
Displaying a Second Brain
These weren’t just smart glasses — they were a wearable memory bank. A discreet color display inside the lens showed her reminders, object memories, and location-based info, all hands-free.
These glasses see, remember, understand, and display your surroundings — like a second brain with a built-in interface.
Live Translation & Context-Aware Assistance
🗣 Speak Your Language, See Any Language
Another moment that floored the audience: real-time language translation. Nishtha conversed in Hindi and Farsi, while English subtitles appeared seamlessly in her field of view.
Not voice-to-text. Not screen translation.
This was context-aware, real-time multilingual AI layered right onto reality.
No More Stacking Glasses
And yes — they’ve thought of the glasses wearers too. The AR glasses come with customizable prescription lenses, making the experience functional and inclusive. No doubling up, no compromise on clarity.
The Tech Behind the Magic: What’s Inside These AR Glasses?
These aren’t just smart. They’re a compact miracle. Here’s what’s packed into that sleek frame:
Tiny Camera — Captures what the user sees so Gemini can understand the world visually.
Microphones — Enable voice commands and conversations with the AI.
Speakers — Deliver clear audio feedback without disturbing your environment.
High-Res In-Lens Color Display — Projects AR overlays directly into your vision.
Gemini AI Integration — The intelligence that makes it all work.
Prescription-Ready Design — Optional prescription lenses built in for seamless everyday use.
All of that in a frame you’d actually wear in public.
VR Showcases & AI in XR: Project Moohan
Switching gears from AR to full-blown immersive experiences, Google unveiled Project Moohan, a VR headset co-developed with Samsung that integrates Gemini AI into the heart of spatial computing.
This isn’t just a VR headset for games. It’s your AI-powered personal assistant, travel agent, coworker, and co-pilot, all within a sleek immersive UI.
✈ A Vacation Plan Without Lifting a Finger
On stage, a single request was all it took:
“Show me a good vacation spot with great views.”
Boom. The headset opens tabs, organizes them, recommends flights, and checks your calendar.
Your itinerary takes shape right in front of your eyes, all using:
- Hand gestures to interact (no controllers needed)
- Voice commands to converse naturally
AI Companion That Understands You
Inside Project Moohan, Gemini becomes more than just a voice. It’s a dynamic conversational companion that can:
- Assist in games, providing hints or strategy advice in real time.
- Manage digital windows, automatically organizing your virtual workspace.
- Explore virtual worlds with contextual info layered over environments.
- Narrate videos with AI-enhanced voices, from soothing bedtime-story tones to quirky explainer vibes.
This was a masterclass in where AI and XR intersect. The headset wasn’t clunky, and the UI didn’t feel like a prototype. It felt ready.
Final Thoughts: A Tech Leap, Not Just a Step
What we saw at TED 2025 wasn’t a step forward — it was a tech leap. We’re witnessing the dawn of spatial computing — where our tools don’t sit in our hands, but live around us, with us, seamlessly integrated.
Phones might soon be the least interesting device we carry. These AR glasses don’t demand your attention. They quietly upgrade your perception. The VR headset doesn’t isolate you from the world. It helps you understand it better, faster, smarter.
The wearable AI age is no longer science fiction. It’s wearable. It’s functional.
It’s here :)