RAMS: Radial Audio Menu System

An eyes-free mobile phone menu using spatialized sound


Overview

Created for a Human-Computer Interaction class, this project examined the potential for an eyes-free mobile phone menu using simple gestural input and spatialized audio. An eyes-free menu is useful for visually impaired users or for those whose vision is otherwise occupied. The goal was to discover the usability issues of such a menu. I worked on it with Trevor Knight, Jacob Beard, and Rajkuman Viswanathan, and we implemented it in Max/MSP with an iPhone controller running TouchOSC.

The items of the menu were presented as up to 8 sounds spatialized around the user's head. As the user moved a finger around the selection wheel, items came into focus, meaning they were played louder and more clearly.
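The core mapping from finger position to menu item can be sketched as below. This is an illustrative Python version, not the original Max/MSP patch; the clock-face orientation and the half-slice offset are assumptions about the layout.

```python
import math

def item_at_angle(x, y, n_items=8):
    """Map a touch position on the selection wheel to a menu item index.

    Assumes angle 0 points straight ahead (12 o'clock) and increases
    clockwise, with each of the n_items occupying an equal angular slice.
    """
    angle = math.degrees(math.atan2(x, y)) % 360  # 0 deg = up, clockwise
    slice_width = 360 / n_items
    # Offset by half a slice so item 0 is centred straight ahead.
    return int(((angle + slice_width / 2) % 360) // slice_width)
```

With 8 items, a touch straight ahead selects item 0, a touch to the right selects item 2, and so on around the wheel.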

Usability Issues

We discovered and examined several usability issues with such an interface, for example:

Audio Overload

Even with the cocktail party effect, there is a limit to how much audio a person can take in at once. We attempted to alleviate audio overload by:

  • a ‘fish-eye’ gain profile, loudest at the focussed item
  • low-pass filtering non-focussed items so their presence could still be heard without being as distracting
  • staggering the start times of neighbouring sounds

Sound Design

After testing several sound designs (earcons, auditory icons, spearcons), we decided on short, simple names generated by text-to-speech, because their meaning is immediately clear to new users and they can be produced without arbitrary human design decisions.

In addition, an optional auditory cursor played white noise that followed the user’s finger around the selection wheel, reinforcing both the spatialization illusion and the link between finger position and position in space.
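The auditory cursor amounts to continuously panning a noise source to the finger's angle. The sketch below is a simplified stereo stand-in using an equal-power pan law; the actual system spatialized sound around the head, which a plain stereo pan only approximates.

```python
import math
import random

def cursor_frame(finger_angle_deg):
    """One stereo sample of the white-noise auditory cursor, panned
    toward the finger's angle on the wheel with an equal-power law.

    (A real spatializer would use HRTFs or similar; this stereo pan is
    an illustrative simplification.)
    """
    sample = random.uniform(-1.0, 1.0)           # white noise
    # Map the wheel angle to a pan position in [-1 (left), +1 (right)].
    pan = math.sin(math.radians(finger_angle_deg))
    theta = (pan + 1) * math.pi / 4              # 0 .. pi/2
    return sample * math.cos(theta), sample * math.sin(theta)
```

At 90 degrees (finger to the right) the noise comes entirely from the right channel; straight ahead it is centred, with total power constant across pan positions.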

Input Gestures

The menu was designed to be used eyes-free and so was tested with blind users. As a result of that testing, the control and button layout was made as simple as possible. Also, a ‘press and hold’ paradigm with auditory confirmations was implemented for the accept and back buttons to reduce the number of accidental selections and confusion.
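The press-and-hold logic can be sketched as a small state machine that only commits an action once the button has been held past a threshold. This is a hypothetical Python illustration; the 0.6-second threshold is an assumed value, not the one we used.

```python
class HoldButton:
    """Commit an action only after the button is held long enough,
    reducing accidental selections. In the real system an auditory
    confirmation would be cued while the hold is in progress."""

    def __init__(self, hold_time=0.6):  # seconds; illustrative value
        self.hold_time = hold_time
        self.pressed_at = None

    def press(self, t):
        """Record the press time (t in seconds)."""
        self.pressed_at = t

    def release(self, t):
        """Return True if the hold was long enough to accept the action."""
        held = (self.pressed_at is not None
                and (t - self.pressed_at) >= self.hold_time)
        self.pressed_at = None
        return held
```

A quick tap is ignored, while a deliberate hold triggers the accept or back action.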

Lastly, we had a ‘hover-over’ paradigm, similar to mouse-over in GUIs, which gave more information (such as the phone number when a contact name was hovered over). This was a potentially time-saving feature but confused users. In an actual implementation, a consistent logic for distinguishing hover from select would need to be developed.
