
Pokemon Go Has a New Owner, But Niantic Is Evolving Its Maps Into a Foundation for AI and AR

Your next game of Pokemon Go could be changing. The game is getting a new corporate boss, while its soon-to-be-former parent company, Niantic, has other plans in the works for itself: a future that's less about gaming and more about AI driven by the maps those games have generated. I've seen pieces of that future already, and AR glasses may be just one place where it plays out.

That future may still involve some games, too. Niantic is retaining control of Ingress and Peridot, two of its most augmented reality-focused, location- and map-connected games. Those games, along with Niantic's pivot into a company called Niantic Spatial, could be a sign of what's happening next: tech companies exploring how AI can understand the world around us better than the often-broken attempts I get today while wearing AI glasses like Meta's Ray-Bans.

Niantic didn't have any additional comment on the news today, but I recently demoed the company's Quest-native Scaniverse app, which focuses on discovering existing 3D scans of real-world locations made with Scaniverse, an app Niantic acquired back in 2021. Much like other 3D-scanning apps, including Polycam, it focuses on both creating and viewing location data. But Niantic's real plans here may center even more on training AI on that data – data that players of games like Pokemon Go have been adding over time. If the future of AR and always-on wearable AI is ever really going to work, we're going to need a much better sense of, and control over, how and when we share data the way people already have been through games like Pokemon Go.

What's next, from Niantic's standpoint, is a focus on that scanned map of the world as a data set for AI to feed on. Niantic CEO John Hanke said in a LinkedIn post:

"We're in the midst of seismic changes in technology, with AI evolving rapidly. Existing maps were built for people to read and navigate but now there is a need for a new kind of map that makes the world intelligible for machines, for everything from smart glasses to humanoid robots, so they can understand and navigate the physical world. Today's LLMs represent the first step towards a future where a variety of expert models collaborate to reason and understand complex problems, and many of those problems will require deep and accurate knowledge of the physical world. Niantic is building the models that will help AI move beyond the screen and into the real world."

Niantic has already been focused on scanning the real world for AR and VR experiences on phones and in headsets like the Meta Quest, mainly to show off how interesting these sometimes uncanny 3D scans can be when you step back into them. Companies like Polycam, whose impressive 3D scans of real-world environments I also recently experienced on Vision Pro, have been exploring ways to use these scans for more business-focused purposes. Niantic is making the same pitch for its scans. The tech behind them is known as Gaussian splatting, which builds 3D scenes from multiple photos and depth-sensing data.

The idea of AI beginning to understand and navigate the real world by studying these advanced scans is a whole other level, and not a surprising one. AR and VR have already been training grounds for AI and robotics, and the data sets we collect by wearing smart glasses and other world-mapping wearables are inevitably what generative AI will study next. That could very well be a huge part of what companies like Meta, Google and eventually Apple use as the underpinning to make everyday wearable AR glasses actually work and recognize the things around you with AI.

Companies like Meta and Google have already been using sensor-studded smart glasses prototypes to explore building real-time, world-aware AI assistance. Meta's Project Aria and Google's Project Astra are building blocks for more continuously aware smart glasses to come, but they still lack a deeper, more helpful awareness of the world. Training on 3D-scanned map data could be a huge part of the next leap for what will eventually live in products like Meta's prototype Orion AR glasses and Google's Android XR devices.

Niantic isn't the only company scanning the real world into future maps: Google, Apple, Snap and others, including companies like Polycam, are already doing it. Niantic's current pitch seems like a pivot toward AI, but AI and AR already seem destined to blend. What I still don't know is how clearly everyone's differently collected map data, and our personal data, will be kept separate. As Niantic focuses even more on the tech of AR's future than on its games, that's increasingly a question – and a concern.
