Google Launches Hand Gesture-Focused Mobile Platform

June 11, 2015 - 2 minute read

Google’s Advanced Technology and Projects Group (ATAP) is reportedly developing interfaces that will one day allow users to control wearable technologies, like fitness trackers and smartwatches, using simple hand gestures. The technology, currently known only as Project Soli, is said to use radar to detect, track, and interpret hand gestures. It also won’t require a smartphone to work: users will be able to enter commands directly into connected wearable devices with gestures alone.

Media reports claim that Google’s radar-based approach goes far beyond the capabilities of existing motion-sensing products like PlayStation Move and Kinect. According to those reports, Google’s engineering and development team has created a highly sensitive radar field that can register even the slightest, most subtle hand and finger movements. It is also said to be capable of capturing data from both hands at once, greatly expanding the technology’s range of potential applications.

While this kind of technology is still a long way from permeating the consumer mobile app market, Google’s commitment to its development does suggest a future in which these kinds of functionalities are mainstream. The gesture-detecting radar hardware is already said to be at an advanced stage, with developers now turning their focus to building a reliable system for interpreting the kinetic input.
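Google had not published Soli’s interfaces at the time of writing, but the basic signal path for a radar gesture system is well understood: transform blocks of raw radar chirps into a range-Doppler map, then hand that map to a gesture classifier. The Python sketch below illustrates the idea only; all function names, signal dimensions, and the trivially rule-based classifier are illustrative assumptions, not Google’s actual implementation.

```python
import numpy as np

def range_doppler(frames: np.ndarray) -> np.ndarray:
    """Compute a range-Doppler map from a block of radar chirps.

    frames: 2-D array of shape (num_chirps, samples_per_chirp).
    """
    # An FFT across fast time (within each chirp) resolves range...
    range_fft = np.fft.fft(frames, axis=1)
    # ...and an FFT across slow time (chirp to chirp) resolves velocity.
    doppler = np.fft.fftshift(np.fft.fft(range_fft, axis=0), axes=0)
    return np.abs(doppler)

def classify_gesture(doppler_map: np.ndarray) -> str:
    """Toy rule-based classifier standing in for a trained model."""
    # Energy above vs. below zero Doppler hints at motion toward
    # or away from the sensor.
    mid = doppler_map.shape[0] // 2
    toward = doppler_map[mid:].sum()
    away = doppler_map[:mid].sum()
    if toward > 1.5 * away:
        return "swipe-in"
    if away > 1.5 * toward:
        return "swipe-out"
    return "idle"

if __name__ == "__main__":
    # Simulated block of 64 chirps x 128 samples (noise only).
    rng = np.random.default_rng(0)
    frames = rng.standard_normal((64, 128))
    print(classify_gesture(range_doppler(frames)))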

Will Boston mobile app developers and software professionals around the world soon be working in a landscape where these types of user inputs are commonplace? The majority of tech industry experts believe the question isn’t “if,” but “when.”
