Natural User Interfaces

by Pranav Pramod

We are at the beginning of the 21st century, and most of our everyday actions involve some form of computer interaction. Many of them we perform without even thinking about it; if we stop for a minute and consider which of our activities include a computational task, we may be surprised. The interfaces we use have been blended into our everyday life through experimentation and experience. Yet many people, especially the elderly and disabled, have been excluded from these tasks by bad interface design. The digitally native new generations have learnt to operate these computational artefacts empirically. It is now time for a new era in the digital world, one in which Natural User Interfaces (NUIs) integrate our innate knowledge and experiences with the interface so that we can perform computational tasks faster and with more ease.

What is an interface?

From the moment we begin to interact with a system and try to communicate with it, we have to define the method or context in which that interaction will take place. This method creates boundaries between user and system. The interface is the channel through which the user interacts with the system.

In computer technologies, there are several types of interfaces:

  • User interface
  • Software interface
  • Hardware interface

The main goal is to produce a user interface that makes the experience efficient, easy and enjoyable for the user. This rests on the concept of design focused specifically on usability. By merging interface design with usability design, we begin to introduce concepts from ergonomics, psychology and, of course, all the standard design theory used for static and dynamic design.

Example of an interface (image by Microsoft)

What is interaction?

We can define interaction as any demand for action between a user, a program or an artefact. This interaction can take place in many ways, from the most traditional button pressing to the most modern motion-sensed gesture tracking. It is through these interfaces that control is transferred by interaction. The level of interaction, and the complexity of the controls, has to be properly managed in order to fulfil the objectives the interface was created for.

Interacting Empirically

Most of the problems with interactivity are the result of bad interface design. It is essential that the interfaces we engage with are as ‘invisible’ as possible. By ‘invisible’ I mean that the user should learn empirically how to use the interface in the shortest time and with the least effort possible. Most interfaces require some knowledge or experience from the user before interaction is possible. Much of the knowledge we gather comes naturally from experimentation and investigation (a posteriori), but the German philosopher Immanuel Kant argued that not all knowledge arises from experiment or experience: some knowledge (a priori) precedes any particular experience. We also internalise mundane, everyday acts such as queueing in line, or letting somebody into a shop before you. These everyday actions create a ‘library’ of concepts in everybody’s mind, and it is important to use these libraries of experience as parameters for our next interface.

That is what Natural User Interfaces try to exploit. NUIs use all the empirical knowledge we have in order to generate tools for interaction. So what is the most empirical knowledge we have? Our senses. They have facilitated our learning since the first seconds of our lives. Our senses, combined with our experiences, give users one of the most powerful and fastest learning curves, if we manage to apply them correctly. There are several ways to achieve this, and consequently several types of NUI.

Multisensory Interface

The interface can accept a variety of sensory inputs, including voice, touch and gestures, and can respond through the same channels.
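To make this concrete, here is a minimal, hypothetical sketch of how several input modalities can be routed to the same application command. The names (`InputEvent`, `COMMANDS`, `dispatch`) are illustrative assumptions, not a real API from any toolkit.

```python
# Hypothetical sketch: one command reachable from several input modalities.
from dataclasses import dataclass
from typing import Optional


@dataclass
class InputEvent:
    modality: str   # "voice", "touch" or "gesture"
    payload: str    # recognised word, widget id or gesture name


# Different sensory inputs map onto the same application command.
COMMANDS = {
    ("voice", "play"): "start_playback",
    ("touch", "play_button"): "start_playback",
    ("gesture", "swipe_up"): "start_playback",
}


def dispatch(event: InputEvent) -> Optional[str]:
    """Return the application command for an input event, if any."""
    return COMMANDS.get((event.modality, event.payload))
```

The design choice here is that the modality is resolved as early as possible, so the rest of the application never needs to know whether the user spoke, tapped or gestured.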

JazzMutant Lemur and the Reactable (image from Rekkerd.org)

Human Like Interaction

In this category, the interface engages the user through a tactile element from everyday life, or through a physical tool related to the task at hand. The senses are attached to the user’s prior experience of the emulated object.

Biometric Interaction

Most of these interaction modes take place in real time. Physical characteristics serve as interaction commands: face recognition, eye tracking and voice pitch, among others, allow the user to command an interface either deliberately or involuntarily.
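One common eye-tracking technique is dwell-time selection: a target counts as ‘clicked’ once the gaze has rested inside it long enough. The sketch below is an illustrative assumption of how such a check might look; the 800 ms dwell threshold and 40 px target radius are made-up tuning values, not standard ones.

```python
# Illustrative dwell-time selection for a gaze-controlled interface.
DWELL_MS = 800  # assumed dwell threshold, not a standard value


def dwell_select(samples, target, radius=40):
    """samples: list of (t_ms, x, y) gaze points; target: (x, y) centre.

    Returns True once the gaze has stayed inside the target circle
    continuously for at least DWELL_MS milliseconds.
    """
    start = None
    for t, x, y in samples:
        inside = (x - target[0]) ** 2 + (y - target[1]) ** 2 <= radius ** 2
        if inside:
            if start is None:
                start = t            # gaze entered the target
            elif t - start >= DWELL_MS:
                return True          # dwelt long enough: select
        else:
            start = None             # gaze left: restart the timer
    return False
```

Note how this commands the interface ‘willingly or unwillingly’: merely looking at a target for too long triggers it, which is why real systems tune the dwell threshold carefully.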

Invisible Computing

This is also known as the Disappearing Computer effect. These interfaces arise when the computational element disappears from the user’s mind, either physically or mentally. Physically, the computer is hidden from us or is so small that we cannot perceive it. Mentally, it happens when our eyes can no longer relate the object to a computer at all, for example a piece of intelligent furniture.

MIT’s SENSEable City Lab - EyeStop (image by PicoCool)

These types of NUI are not restricted to a single category. It is imperative to adapt the interface to your audience’s ease of use, which means understanding your target market well enough to foresee the elements that could challenge interactivity: age, sex, physical disabilities, even cultural or religious background. It is not only about the technology you are using or developing, but also about how you implement it. Elderly people, for example, often feel extremely challenged by smartphones. Adapting elements of the original task the interface performs can benefit the user’s interaction: if elderly users need a magnifying-glass device, a simple handle whose sensed motion controls the amount of zoom can make the interface far more usable for that sector of the population.
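The magnifier example above boils down to one small mapping: sensed handle motion in, zoom factor out. A hypothetical sketch follows; every constant (5 cm per magnification step, 1x–4x range) is an illustrative assumption that a real device would tune with its users.

```python
# Hypothetical mapping for the magnifier example: the distance the handle
# is lifted (sensed motion, in cm) sets the zoom factor, clamped to a
# comfortable range. All constants are illustrative assumptions.


def zoom_factor(height_cm, min_zoom=1.0, max_zoom=4.0, cm_per_step=5.0):
    """Linear map: each cm_per_step cm of lift adds 1x of magnification."""
    zoom = min_zoom + height_cm / cm_per_step
    return max(min_zoom, min(max_zoom, zoom))
```

The point of keeping the mapping linear and clamped is exactly the ‘library of experience’ argument: lifting a magnifying glass away from the page already means ‘zoom’ to most people, so no manual is needed.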

A simple human-like implementation.

One of the companies best known for this kind of adaptation is Nintendo, with the Nintendo Wii. Relatively simple accessories, such as a plastic gun or a plastic fishing rod that the remote clips into, are among the best ways of adding Human-Like Interaction to the Multisensory Interaction of emulative gestures sent from the WiiMote to the console. This flattens the learning curve for a specific game.

What happens when there is no remote or controller at all? In the case of Microsoft’s Kinect, no physical object controls the game; instead, the movements and gestures the player performs are used as interaction commands. This combines Biometric Interaction with Multisensory Interaction, in which specific characteristics of the user can be taken into account during play. The user has to emulate real-life gestures in order to play the game correctly, which requires both a priori and a posteriori learning.
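In the spirit of controller-free tracking, the sketch below classifies a ‘swipe right’ from a short window of hand x-coordinates. This is an assumed toy heuristic, not the Kinect SDK’s actual gesture pipeline: the 0.3 m minimum travel and the 80% monotonicity requirement are invented tuning values.

```python
# Toy gesture classifier in the spirit of controller-free tracking:
# a swipe right is a mostly monotonic rightward move of the hand.


def is_swipe_right(hand_x, min_travel=0.3):
    """hand_x: hand x-positions (metres) sampled over a short time window.

    Returns True when the hand travelled at least min_travel to the right
    and at least 80% of the consecutive steps moved rightward.
    """
    if len(hand_x) < 2:
        return False
    travel = hand_x[-1] - hand_x[0]
    rightward_steps = sum(b >= a for a, b in zip(hand_x, hand_x[1:]))
    return travel >= min_travel and rightward_steps >= 0.8 * (len(hand_x) - 1)
```

Even this toy version shows why a priori knowledge matters: the player already knows what a swipe is from everyday life, so the system only has to recognise it, not teach it.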

What does the future hold?

The power of modern computing allows developers and designers to implement interfaces without worrying whether they can be built at all. The psychological response of users, and the way we interact with new technology, is as important as the development of the technology itself. The technology is already there, ready for us to use; the challenge for developers and designers is to implement it in the most productive way possible. User experience is probably the most important element in the material world. If you drive a car, you need to know how it handles and what its safety features are; if you use a notebook, can it transfer your writing to a computer, is it difficult to use, is it heavy? Even the way objects look can put some people off.

NUIs can therefore include everybody in the use of digital technologies. This matters enormously for sectors such as health, where elderly and disabled people have previously been excluded. NUIs are a very positive and powerful way of increasing the performance of any kind of product or service. We have over 50,000 years of experience as Homo sapiens that can be harnessed to perform computational tasks and improve interface interaction. We have always adapted to the computer world, and now the computer world is adapting to us. Smart furniture and smart buildings that connect with us, and communicate our feelings to the people around us, are examples of how adaptive and ‘invisible’ these systems can become. NUIs have the ability to create a new paradigm in the digital world in which we no longer spend two hours reading a confusing manual just to use a simple interface that makes popcorn or transfers the notes of a notebook into a computer.

COMP6046 – Journalistic Article // Computational Thinking
jp6g11@ecs.soton.ac.uk
