Abstract
As part of a project to improve human-computer interaction, primarily for blind users, a survey of 50 blind and 100 sighted users included a questionnaire about their habits during everyday use of personal computers. Based on their answers, the most important functions and applications were selected and the results of the two groups were compared. Special habits and needs of blind users are described. The second part of the investigation involved collecting auditory representations (auditory icons, spearcons, etc.), mapping them to visual information, and evaluating them with the target groups. Furthermore, a new design method and class of auditory events, called “auditory emoticons”, was introduced. These use non-verbal human voice samples to convey additional emotional content. Blind and sighted users evaluated different auditory representations for the selected events, including spearcons in different languages. Auditory icons using familiar environmental sounds, as well as the emoticons, were received very well, whereas spearcons appear to be redundant except for menu navigation by blind users.
© 2010 Springer-Verlag Berlin Heidelberg
Wersényi, G. (2010). Auditory Representations of a Graphical User Interface for a Better Human-Computer Interaction. In: Ystad, S., Aramaki, M., Kronland-Martinet, R., Jensen, K. (eds) Auditory Display. CMMR/ICAD 2009. Lecture Notes in Computer Science, vol 5954. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-12439-6_5
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-12438-9
Online ISBN: 978-3-642-12439-6