ABI Research: wearable vendors are struggling with user interface complexity
The dominance of touchscreen user interfaces will diminish over the next five years as more sensors are introduced to mainstream products and entirely new product form-factors emerge, enabling and necessitating new user interfaces such as voice, gesture, eye-tracking, and neural, according to ABI Research (www.abiresearch.com).
"Touch got mobile device usability to where it is today, but touch will become one of many interfaces for future devices as well as for new and future markets," says ABI Research Senior Practice Director Jeff Orr. "The really exciting opportunity arrives when multiple user interfaces are blended together for entirely new experiences."
Across 11 unique features from wireless connectivity to embedded sensors, ABI Research found that hand and facial gesture recognition will experience the greatest growth in future smartphone and tablet shipments, with compound annual growth rates of 30% and 43% respectively from 2014 to 2019. The range of applications for gesture recognition spans user attentiveness to device navigation control. The impact of user interface (UI) innovation in mobile devices will be felt across a wide range of consumer electronics (CE) applications, including the car and the home.
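Those CAGR figures translate into how much shipments multiply over the forecast window. A minimal sketch of the arithmetic, assuming a hypothetical base shipment figure; only the 30% and 43% rates and the 2014-2019 window come from the article:

```python
def project_shipments(base, cagr, years):
    """Project shipments after `years` of compound annual growth.

    base  -- hypothetical starting shipment volume (not from the article)
    cagr  -- compound annual growth rate as a decimal, e.g. 0.30 for 30%
    years -- number of years of growth
    """
    return base * (1 + cagr) ** years

# Over the 2014-2019 window (five years of compounding):
hand_multiplier = (1 + 0.30) ** 5  # hand gesture recognition, ~3.71x
face_multiplier = (1 + 0.43) ** 5  # facial gesture recognition, ~5.98x
```

In other words, a 43% CAGR roughly sextuples shipment volume over five years, which underlines why ABI singles out gesture recognition as the fastest-growing feature in the study.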
As mobile applications integrate more technology, the UI must be kept simple enough to be intuitive, says ABI Research.
"Packing a mobile device with sensors goes little beyond being a novelty," says Orr. "Complexity contradicts good UI design and a critical mass of engaging mobile applications are required for mainstream adoption."
This balancing act is best observed in today’s automobiles, where a myriad of subsystems work with the driver to arrive at a destination safely with a minimal learning curve. Key components have also evolved from single-function elements into multi-sensor, single-chip packages.
ABI Research says this has not only benefited the handheld form-factor but has also been the premise for the leading commercially available wearable devices. As multiple sensors and gadgets work in real time to collect data from an individual and the surrounding environment, the potential for complexity arises once again, with each person looking to have their own personalized experience.