Samsung and Google are working on a mixed reality headset in the vein of Apple's Vision Pro, powered by Android XR and Google Gemini. We already knew that; I even demoed it late last year. But at its phone-focused winter Unpacked event, Samsung revealed a few more details about what could be the missing piece that ties it all together: the Galaxy S25 phones and the glasses that will connect to them. Google and Samsung also announced a shared AI ecosystem partnership, and that AI-powered experience is expected to arrive on next-generation VR and AR headsets this year.
In a sense, I already got a glimpse of where this is all heading at the end of last year.
AI that works in real time
Samsung only briefly covered its upcoming VR/AR headset and glasses at its latest Unpacked event, and most of what it showed we already knew. Still, Samsung's demonstrations of real-time AI that can see what your phone's camera sees point right at where the glasses, expected to arrive in 2025, are heading.
Project Moohan (meaning "infinity" in Korean) is a headset that reads like a blend of Apple's Vision Pro and the Meta Quest 3. Its design closely resembles the discontinued Meta Quest Pro, but with far better specs. The headset has hand and eye tracking, runs Android apps via the Android XR OS that will be fully rolled out later this year, and uses Google Gemini AI as an assistance layer throughout. Google's Project Astra, the technology that enables real-time assistance on glasses, phones, and headsets, is making its debut on Samsung's Galaxy S25 series phones. But I've already seen it working on my face.
In last year's demo, Gemini could help as I looked around a room, watched YouTube videos, and did just about anything else. To use the live AI, I had to start it in a live mode, after which it could see and hear what I was seeing and hearing. There was also a pause mode to temporarily stop the live assistance.
Samsung showed off what looks like a similar real-time AI feature on its Galaxy S25 phones, with more features promised. It appears to work while watching YouTube videos, much like my Android XR demo did, and according to Samsung and Google executives working on Android XR, it could also be used for live help during gameplay.
Improved battery life and processing…for glasses?
Samsung and Google also confirmed that they're working on smart glasses with Gemini AI, designed to compete with Meta's Ray-Bans and a new wave of other eyewear. AR glasses also seem to be in development.
Project Moohan is a standalone VR headset with its own battery pack and processor, much like Apple's Vision Pro. But the smaller smart glasses Google and Samsung are developing, and whatever glasses follow them, will depend on connectivity and processing assistance from a phone in order to work. That's how smart glasses like Meta's Ray-Bans already operate.
However, more features may demand more intensive phone processing. Live AI could become an increasingly used feature, leaning on phones to assist these glasses. Better processing and graphics, and most importantly improved battery life and cooling, strike me as ways of ultimately making these phones better pocket computers for glasses.
Personal data sets needed for these AI gadgets
Samsung also announced a vague-sounding Personal Data Engine that will be powered by Google and Samsung's AI. It buckets personal data into a place where AI could potentially develop richer conclusions and connections across everything that's part of your life.
It remains very unclear how it will be deployed, how it will be kept secure, or what its limits are. But it sounds like a repository of personal data that Samsung and Google's AI could train on and work with across connected products like watches, rings, and glasses.
Camera-enabled AI wearables are only as good as the data that supports them. That's why so many of these devices feel awkward and clunky right now, including Meta's Ray-Bans in their AI mode. These AI devices typically hit a wall when it comes to knowing the things existing apps already know well. Google and Samsung are clearly trying to solve that problem.
Will you want to entrust that process to Google, Samsung, or anyone else? And how will these phones and future glasses make the relationship between AI and your data clearer and easier to manage? I expect the other shoe to drop at Google I/O, the developer conference where Android XR and Gemini advancements are likely to be discussed in greater detail.
Samsung plans to launch Project Moohan as its first headset, with glasses to follow. Expect Google and Samsung to reveal further details at the developer-focused Google I/O conference around May or June, and perhaps a fuller overview at Samsung's next anticipated Unpacked event in the summer. By then, we may understand a lot better how this seemingly boring new wave of Galaxy S25 phones is laying infrastructure that will unfold in far more explicit detail by the end of the year, and perhaps beyond.