Linkless ANFORA (Patented)

Methodology

  • Controlled evaluation study with 20 participants
  • Wizard-of-Oz technique
  • Quantitative and qualitative analysis

Project Description

To support mobile, eyes-free web browsing, users can listen to “playlists” of web content, called aural flows. Interacting with aural flows, however, requires users to select interface buttons, tethering visual attention to the mobile device even when it is unsafe (e.g., while walking). To reduce such visual interaction, this project explores the use of voice commands to control aural flows.

In a study, 20 participants browsed aural flows either through a visual interface only or by augmenting it with voice commands. Results show that using voice halved the time spent looking at the device while listening to aural flows, yet yielded a navigation experience and cognitive effort similar to selecting buttons. All users enjoyed the directness of voice commands in combination with the visual interface, but some found voicing instructions socially uncomfortable. Our findings can be extended to mobile systems that leverage semi-aural interfaces.
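As a rough illustration of the interaction model described above (not the actual Linkless ANFORA implementation), the TypeScript sketch below shows how a small set of commands might drive an aural flow rendered with the browser's speech synthesis. The class, command names, and playlist structure are assumptions introduced here for illustration; in the study itself, voice recognition was simulated through the Wizard-of-Oz technique rather than automated.

```typescript
// Hypothetical sketch of an aural flow: a playlist of page excerpts read aloud,
// controlled by simple commands ("next", "previous", "pause", "resume").
// Names and structure are illustrative assumptions, not the Linkless ANFORA code.

interface FlowItem {
  title: string;
  text: string; // excerpt of web content to be read aloud
}

type Command = "next" | "previous" | "pause" | "resume";

class AuralFlow {
  private index = 0;

  constructor(private items: FlowItem[]) {}

  // Dispatch a command, whether issued by a speech recognizer or, as in a
  // Wizard-of-Oz setup, triggered remotely by an operator's admin page.
  handle(command: Command): void {
    switch (command) {
      case "next":
        this.index = Math.min(this.index + 1, this.items.length - 1);
        this.speakCurrent();
        break;
      case "previous":
        this.index = Math.max(this.index - 1, 0);
        this.speakCurrent();
        break;
      case "pause":
        window.speechSynthesis.pause();
        break;
      case "resume":
        window.speechSynthesis.resume();
        break;
    }
  }

  private speakCurrent(): void {
    const item = this.items[this.index];
    window.speechSynthesis.cancel(); // stop any ongoing utterance
    const utterance = new SpeechSynthesisUtterance(`${item.title}. ${item.text}`);
    window.speechSynthesis.speak(utterance);
  }
}

// Example: a two-item flow advanced by a single command.
const flow = new AuralFlow([
  { title: "Headline one", text: "First story summary." },
  { title: "Headline two", text: "Second story summary." },
]);
flow.handle("next");
```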

Click here to access the patent document.

Click here to use the Linkless ANFORA mobile web application in Safari or Chrome.

Click here to control the Linkless ANFORA mobile web application through the admin page for each of the vocal commands.

Click here to watch a YouTube video on Linkless ANFORA.
Project Details

NSF-funded research project “Navigating the Aural Web”, PI: Dr. Davide Bolchini

Funding Opportunities for Research Commercialization and Economic Success (FORCES) by the IUPUI Office of the Vice-Chancellor for Research (OVCR) – “Eyes-Free Mobile Navigation with Aural Flows”, PI: Dr. Davide Bolchini