Abstract
The field of information retrieval (IR) has grown tremendously over the years. Researchers have, however, identified Human-Computer Interaction (HCI) aspects as an important concern in IR research. Incorporating HCI techniques into IR can help ensure that IR systems intended for human users are developed and evaluated in a way that is consistent with, and reflects, the needs of those users. Traditional methods of evaluating IR systems have long focused on system-oriented measurements such as precision and recall rather than on the usability aspects of the IR system. Moreover, there are no well-established evaluation approaches for studying users and their interactions with IR systems. This chapter describes the role and place of HCI in supporting and shaping the evaluation of IR systems.
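The system-oriented measures the abstract contrasts with usability evaluation are easy to state concretely. A minimal sketch of set-based precision and recall for a single query follows; the function name and document identifiers are illustrative, not taken from the chapter:

```python
def precision_recall(retrieved, relevant):
    """Set-based precision and recall for one query.

    precision = |retrieved ∩ relevant| / |retrieved|
    recall    = |retrieved ∩ relevant| / |relevant|
    """
    hits = len(set(retrieved) & set(relevant))
    precision = hits / len(retrieved) if retrieved else 0.0
    recall = hits / len(relevant) if relevant else 0.0
    return precision, recall

# Example: the system returns 4 documents, 3 of which are
# among the 6 documents judged relevant.
p, r = precision_recall(
    ["d1", "d2", "d3", "d7"],
    ["d1", "d2", "d3", "d4", "d5", "d6"],
)
print(p, r)  # 0.75 0.5
```

Measures like these characterize the ranked output alone; they say nothing about whether a real user could formulate the query, interpret the results, or complete the task — which is precisely the gap the chapter addresses.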
© 2013 Springer-Verlag Berlin Heidelberg
Cite this chapter
Catarci, T., Kimani, S. (2013). Human-Computer Interaction View on Information Retrieval Evaluation. In: Agosti, M., Ferro, N., Forner, P., Müller, H., Santucci, G. (eds) Information Retrieval Meets Information Visualization. PROMISE 2012. Lecture Notes in Computer Science, vol 7757. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-36415-0_3
Print ISBN: 978-3-642-36414-3
Online ISBN: 978-3-642-36415-0