Transforming Human-Computer Interface to a Natural User Interface
Added by Deborah Aks on May 4, 2012
Interacting with technology is essential to contemporary life. Just 15 years ago, it was not hard to find people unfamiliar with using a computer mouse (as I discovered when teaching students computer-based classes). Today, this would be unheard of. Both young and old are intimately familiar and at ease with the mouse, so much so that the word "mouse" now more readily conjures up a computer device than a rodent. The rapid development of human-computer interface (HCI) alternatives may soon make the mouse obsolete (or perhaps not). A key determinant is usability: how smoothly the HCI integrates with the world around us. Consider the wide range of recognition and tracking systems already available: touch, voice, face, and gesture recognition. Although these technologies are still in their infancy, it is clear they will offer more natural ways to interact with computing devices, making their use intuitive, effortless, and largely "invisible."
The limiting factor in achieving a Natural User Interface (NUI) lies in advances in computer science. To achieve "invisibility," HCI software needs to approach the human ability to recognize, track, and manipulate objects. Take the Xbox Kinect as an example. Its skeletal modeling segments a scene hierarchically: first into people, then into primary body parts (head, torso, and arms), and down to each finger. While this technology is impressive at recognizing coarse gestures, much refinement is still needed before it "feels" convincingly natural. Better HCI will use sensors calibrated in real time to adjust for changes in body position as people move and interact with virtual objects. Similarly, an authentic NUI needs better simulation of a user's fine motor control.
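To make the idea of hierarchical skeletal modeling and real-time adjustment concrete, here is a minimal sketch in Python. The body-part names, coordinates, and the simple exponential smoothing filter are illustrative assumptions on my part, not the Kinect's actual pipeline:

```python
from dataclasses import dataclass, field

@dataclass
class BodyPart:
    name: str
    position: tuple  # (x, y, z) in meters, camera coordinates
    children: list = field(default_factory=list)

def build_skeleton():
    # Coarse-to-fine hierarchy: person -> head/torso/arm -> hand -> fingers
    hand = BodyPart("right_hand", (0.4, 1.0, 2.0),
                    [BodyPart(f"finger_{i}", (0.4, 1.0, 2.0)) for i in range(5)])
    return BodyPart("person", (0.0, 1.0, 2.0), [
        BodyPart("head", (0.0, 1.6, 2.0)),
        BodyPart("torso", (0.0, 1.0, 2.0)),
        BodyPart("right_arm", (0.3, 1.2, 2.0), [hand]),
    ])

def smooth(prev, new, alpha=0.3):
    # Exponential smoothing: blend each new sensor reading with the previous
    # estimate so a tracked joint does not jitter from frame to frame.
    return tuple(alpha * n + (1 - alpha) * p for p, n in zip(prev, new))
```

The point of the tree structure is that refining the interface means pushing reliable tracking further down the hierarchy, from torso-level gestures toward individual fingers.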
Another HCI on track to become an NUI, reported in Engadget, is the perifoveal display developed at MIT's Media Lab. Perifoveal displays help users pick out targeted information while monitoring concurrent streams across multiple screens, such as when security staff must detect one suspicious behavior among a multitude of ordinary ones. A Kinect tracks the head to signal where people are looking, so the attended region of the display is enhanced by the perifoveal software. Simultaneously, automated data analytics monitors unattended information in the periphery; when important information appears there, a salient marker signals the user to shift their gaze. Adding eye-tracking technology would no doubt add precision. Nevertheless, the perifoveal display exemplifies the ongoing transformation from HCI to NUI.
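The division of labor just described (full detail where the user looks, automated monitoring plus salient cues everywhere else) can be sketched in a few lines. The dictionary fields, the anomaly score, and the threshold below are hypothetical, not MIT's implementation:

```python
def update_displays(gaze_display, displays, alert_threshold=0.9):
    """Render one frame of a perifoveal-style multi-display setup.

    gaze_display: index of the screen the head tracker says the user faces.
    displays: list of dicts, each carrying an 'anomaly_score' in [0, 1]
              produced by background analytics on that screen's data.
    Returns a per-display rendering instruction for this frame.
    """
    frame = []
    for i, d in enumerate(displays):
        if i == gaze_display:
            # Attended screen: render at full detail.
            frame.append({"display": i, "detail": "high"})
        elif d["anomaly_score"] >= alert_threshold:
            # Peripheral screen with something important: add a salient cue.
            frame.append({"display": i, "detail": "low", "marker": True})
        else:
            # Quiet periphery: keep it low detail, unobtrusive.
            frame.append({"display": i, "detail": "low"})
    return frame
```

Swapping the head tracker for an eye tracker would only change how `gaze_display` is computed; the attended/peripheral logic stays the same.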
More remarkable are advances in the brain-computer interface (BCI), which uses brain signals to both simulate and stimulate movement: thought becomes the control device and can be used to "defy" paralysis. Recently, Northwestern University neuroscientists Lee Miller and Christian Ethier used a neuroprosthesis with functional electrical stimulation to deliver electrical signals directly from the brain to muscles. This restored the hand movements of a monkey whose temporary paralysis had been induced by a local anesthetic. The monkey only needed to think about moving its arm; the interface "read" the thought and stimulated the muscles to grasp, lift, and release an object.
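At its core, such a neuroprosthesis is a decoder: it maps recorded cortical activity to muscle stimulation commands. The sketch below uses a linear mapping with made-up dimensions and random weights purely as a stand-in; a real system would fit its decoder from neural recordings made while the subject attempts movements:

```python
import numpy as np

# Hypothetical linear decoder: W maps a vector of cortical firing rates
# to stimulation intensities for a handful of forearm muscles.
rng = np.random.default_rng(0)
n_neurons, n_muscles = 100, 4
W = rng.normal(size=(n_muscles, n_neurons)) * 0.01  # placeholder weights

def decode(firing_rates):
    # Firing rates (spikes/s) -> stimulation commands, clipped to a safe range.
    return np.clip(W @ firing_rates, 0.0, 1.0)

# One simulated "frame" of neural activity while the monkey thinks of grasping.
rates = rng.poisson(20, size=n_neurons).astype(float)
commands = decode(rates)  # one stimulation intensity per muscle
```

The clipping step stands in for the safety limits any real stimulator would enforce; the interesting science is entirely in estimating `W` well.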
A major challenge for BCI and other HCI technologies is the complexity of behavioral patterns. Cryptic patterns appear in neural and muscle command signals, such as those recorded by EEG, and in eye movements (such as those shown here). Fortunately, data analytics offers a way to find reliable patterns, which can then be used to produce movement. ExtremeTech reports on one BCI, NeuroVigil's iBrain, which reads electrical commands generated by the intact motor cortex of a paralyzed individual to control an external device such as a cursor on a computer screen. Right now, the main use for the iBrain is diagnosing pathologies by reading brain signals during sleep. That it is not yet fully developed for awake applications likely reflects the complexity and noise associated with our waking state. For this, we need better data analytics and better coordination with the HCI. Then we will be closer to an authentic NUI.
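As a toy illustration of pulling a reliable pattern out of a noisy signal, the sketch below buries a brief synthetic "intent" burst in noise, denoises it with a moving average, and applies a threshold. The signal, window size, and threshold are all invented for illustration; real EEG analytics are far more sophisticated:

```python
import numpy as np

def moving_average(signal, window=25):
    # Crude denoising: replace each sample with the mean of its neighbors.
    kernel = np.ones(window) / window
    return np.convolve(signal, kernel, mode="same")

# Synthetic recording: an intended-movement burst (samples 400-500)
# hidden in noise roughly as strong as the burst itself.
rng = np.random.default_rng(1)
t = np.arange(1000)
noise = rng.normal(0.0, 1.0, size=t.size)
burst = np.where((t > 400) & (t < 500), 2.0, 0.0)
signal = noise + burst

smoothed = moving_average(signal)
detected = smoothed > 1.0          # threshold the cleaned signal
onset = int(np.argmax(detected))   # first sample flagged as a command
```

Raw thresholding of `signal` would fire constantly on noise; averaging first shrinks the noise while leaving the sustained burst intact, which is the basic reason waking-state noise demands better analytics.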
This post was written as part of the IBM for Midsize Business program, which provides midsize businesses with the tools, expertise and solutions they need to become engines of a smarter planet.