An Empirical Analysis of Auditory Interfaces in Human-computer Interaction
 Title & Authors
Nam, Yoonjae
 Abstract
This study compared the usability of three types of auditory interfaces in personal computing environments: verbal messages (speech sounds), earcons (musical sounds), and auditory icons (natural sounds). Usability is understood here as a comprehensive concept that encompasses safety, utility, effectiveness, and efficiency. The study hypothesized that verbal messages would offer higher usability than earcons and auditory icons, since verbal messages are easy to interpret and understand on the basis of the semiotic process. Usability was measured with a set of seven items: ability to inform users of what the program is doing, relevance to visual interfaces, degree of stimulation, degree of understandability, perceived time pressure, clearness of sound outputs, and degree of satisfaction. The experimental results showed that verbal messages provided the highest level of usability. By contrast, auditory icons showed the lowest level of usability, as they require users to establish new coding schemes and thus demand more mental effort.
 Keywords
Auditory interface; Verbal message; Earcon; Auditory icon; Usability
 Language
English
 References
1.
S.A. Brewster, Providing a Structured Method for Integrating Non-Speech Audio into Human-Computer Interfaces, PhD dissertation, University of York, 1994.

2.
W. Buxton, W. Gaver and S. Bly, "Tutorial number 8: The use of non-speech audio at the interface," In Proceedings of CHI '91, New Orleans: ACM Press, Addison-Wesley, 1991.

3.
J. Preece, Human-Computer Interaction, Wokingham, England: Addison-Wesley, 1994.

4.
C. Nass and K.M. Lee, "Does computer-generated speech manifest personality? An experimental test of similarity-attraction," Proceedings of the CHI 2000 Conference on Human Factors in Computing Systems, 2000, pp. 329-336.

5.
A. Papp III and M. Blattner, "Dynamic presentation of asynchronous auditory output," ACM Multimedia 96, Boston, MA, 1996, pp. 109-116.

6.
M. Blattner, D. Sumikawa and R. Greenberg, "Earcons and icons: Their structure and common design principles," Human-Computer Interaction, vol. 4, no. 1, 1989.

7.
S.A. Brewster, A. Capriotti and C.V. Hall, "Using compound earcons to represent hierarchies," In HCI Letters, vol. 1, no.1, 1998, pp. 6-8.

8.
D.K. McGookin and S.A. Brewster, "Understanding concurrent earcons: Applying auditory scene analysis principles to concurrent earcon recognition," ACM Transactions on Applied Perception, vol. 1, no. 2, 2004, pp. 130-155.

9.
S.A. Brewster, P.C. Wright and A.D.N. Edwards, "The design and evaluation of an auditory-enhanced scrollbar," In B. Adelson, S. Dumais, & J. Olson (Eds.), Proceedings of CHI'94, Boston: ACM Press, Addison-Wesley, 1994, pp. 173-179.

10.
S.A. Brewster and V.C. Catherine, "The design and evaluation of a sonically enhanced tool palette," ACM Transactions on Applied Perception, vol. 2, no. 4, 2005, pp. 455-461.

11.
M.G. Crease and S.A. Brewster, "Making progress with sounds: The design and evaluation of an audio progress bar," In Proceedings of ICAD '98, Glasgow, UK: British Computer Society, 1998.

12.
A.B. Barreto, A. J. Julie and H. Peterjohn, "Impact of spatial auditory feedback on the efficiency of iconic human-computer interfaces under conditions of visual impairment," Computers in Human Behavior, vol. 23, no. 3, 2007, pp. 1211-1231.

13.
S.A. Brewster, Providing a model for the use of sound in user interfaces (Technical Report No. YCS 169), University of York, Department of Computer Science, 1991.

14.
A. Dix, J. Finlay, G. Abowd and R. Beale, Human-Computer Interaction, London: Prentice-Hall, 1993.

15.
J.A. Jacko, "The identifiability of auditory icons for use in educational software for children," Interacting with Computers, vol. 8, no. 2, 1996, pp. 121-133.

16.
D.K. Palladino and B.N. Walker, "Learning rates for auditory menus enhanced with spearcons versus earcons," Proceedings of the International Conference on Auditory Display (ICAD 2007), Montreal, Canada, 2007, pp. 274-279.

17.
D.I. Rigas, D. Hopwood and D. Memery, "Communicating spatial information via a multimedia auditory interface," In EUROMICRO Conference 1999, Proceedings 25th, vol. 2, 1999, pp. 398-405.

18.
E.D. Mynatt, "Designing with auditory icons," Proceedings of the Second International Conference on Auditory Display (ICAD '94), Santa Fe Institute, New Mexico, 1994.

19.
C. Peirce, Philosophical Writings of Peirce, New York: Dover, 1955.

20.
U. Eco, A Theory of Semiotics, Bloomington: Indiana University Press, 1979.

21.
F. Rossi-Landi, Linguistics and economics, The Hague: Mouton, 1975.

22.
Y. Nam and J. Kim, "Semiotic analysis of sounds in personal computers: Toward a semiotic model of human-computer interaction," Semiotica, vol. 182, no. 1/4, 2010, pp. 269-284.

23.
B. Reeves and C. Nass, The Media Equation, New York: Cambridge University Press, 1996.

24.
NASA Human Performance Research Group, Task Load Index, v. 1.0 computerized version, NASA Ames Research Center, 1987.