Your phone knows a lot. It knows the time, its location on the globe, its orientation and movement, maybe even the ambient light and air pressure. But it doesn’t really know its surroundings. Not like we humans do.
That’s the problem Google’s ATAP is trying to solve. The Advanced Technology and Projects group (official Google+ tagline: “We like epic shit.”) just announced Project Tango, with the goal of giving mobile devices a “human-scale understanding of space and motion.”
So what is it, exactly? Well, right now it’s a custom 5-inch Android phone. It’s sort of blocky and bulky, and very much looks like the prototype it is. This is not a consumer device. But this phone does something special. It’s got a 4-megapixel camera, an additional motion-tracking camera, and a third depth-sensing camera. Together with a pair of processors optimized for computer-vision calculations, the phone tracks its absolute position and orientation in 3D more than 250,000 times a second—all while building a virtual 3D map of whatever it’s pointing at. A YouTube video released by Google’s ATAP shows how it works.
If it looks an awful lot like someone crammed an Xbox Kinect sensor into a smartphone, there’s probably a good reason for that. Johnny Chung Lee, the project lead for Tango, was a key developer on the Kinect team at Microsoft before jumping ship for Google.
Note that this is still a very experimental endeavor. Interested developers can apply for a kit at the Project Tango site, but there are only 200 kits in total, so don’t get your hopes up. The 5-inch Android phones come with the software stack necessary to do 3D positioning and mapping, but it’s all at an early prototype stage and under active development. As the site says, “These experimental devices are intended only for the adventurous and are not a final shipping product.”
This is an exciting area of advancement for phones, and frankly more useful than something like Google Glass. If your phone knew its absolute position and orientation in real time with low latency and could map its environment in 3D, the possibilities for developers would be boundless. Augmented-reality apps and games could seamlessly meld a virtual world with the real one. A shopping app could literally walk you through the store to the exact spot on the shelf where the item you want sits. You could navigate inside complex buildings the same way cars use turn-by-turn street navigation. You could look at a piece of furniture with your phone and virtually, accurately, place it in your living room. Phone assistants could start to “see” the world the way you do. This could be one of the breakthrough technologies that gets us from Siri to something out of Her.
But there are challenges galore. Extra cameras and custom processors aren’t cheap. The drain on battery life is likely to be terrible right now. The prototype phones are too thick and bulky for the average user. In addition to the mountain of software problems the Google team is working to solve, they’re going to have to find a way to make all this stuff affordable, energy efficient, and compact. So, don’t expect full 3D world-mapping to be a standard feature of Android Lollipop (if that’s the name of the next release), but maybe it’ll get here in time for Marzipan.