
Changing Analysts’ Tunes: The Surprising Impact of a New Instrument for Usability Inspection Method Assessment

  • Conference paper
People and Computers XVII — Designing for Society

Abstract

We describe the impact on analyst performance of an extended problem report format. Previous studies have shown that Heuristic Evaluation can only find a high proportion of actual problems (thoroughness) if multiple analysts are used. However, adding analysts can result in a high proportion of false positives (low validity). We report surprising interim results from a large study that is exploring the DARe model for evaluation method effectiveness. The DARe model relates the effectiveness of an evaluation method to evaluators’ command of discovery and analysis resources. Previous work has shown that Heuristic Evaluation poorly supports problem discovery and analysis: heuristics tend to be inappropriately applied to problem predictions. We developed an extended problem report format to let us study analyst decision making during usability inspection. Our focus was on the quality of insights into analyst behaviour delivered by this extended report format. However, our first use of this format revealed unexpected improvements in validity (false positive reduction) and appropriate heuristic application. We argue that the format has unexpectedly led to more care and caution in problem discovery and elimination, and in heuristic application. Evaluation performance can thus be improved by indirectly ‘fixing the analyst’ via generic fixes to inspection methods. In addition, we provide the first direct evidence of how evaluators use separate discovery and analysis resources during usability inspection.
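To make the two measures in the abstract concrete, here is a minimal Python sketch assuming the standard definitions used in inspection-method studies: thoroughness as the proportion of actual problems that analysts predicted, and validity as the proportion of predictions that correspond to actual problems (false positives lower validity). The problem identifiers and counts below are hypothetical, and the hard step in practice, matching predictions to actual problems, is assumed away here; it is exactly the step the paper's extended report format is designed to expose.

```python
# Sketch of the thoroughness and validity measures discussed in the
# abstract, under the standard definitions (hits / actual problems and
# hits / predictions). All problem IDs are hypothetical examples.

def thoroughness(predicted: set, actual: set) -> float:
    """Proportion of actual problems that the analysts predicted."""
    return len(predicted & actual) / len(actual) if actual else 0.0

def validity(predicted: set, actual: set) -> float:
    """Proportion of predictions matching actual problems;
    false positives reduce this score."""
    return len(predicted & actual) / len(predicted) if predicted else 0.0

# Example: 8 predictions, 10 actual problems, 6 correct matches.
predicted = {f"P{i}" for i in range(1, 9)}   # P1..P8
actual = {f"P{i}" for i in range(3, 13)}     # P3..P12
print(f"thoroughness = {thoroughness(predicted, actual):.2f}")  # 6/10 = 0.60
print(f"validity     = {validity(predicted, actual):.2f}")      # 6/8  = 0.75
```

On these definitions, adding analysts raises thoroughness (more hits) but typically lowers validity (more false positives), which is the trade-off the abstract describes.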




Copyright information

© 2004 Springer-Verlag London

About this paper

Cite this paper

Cockton, G., Woolrych, A., Hall, L., Hindmarch, M. (2004). Changing Analysts’ Tunes: The Surprising Impact of a New Instrument for Usability Inspection Method Assessment. In: O’Neill, E., Palanque, P., Johnson, P. (eds) People and Computers XVII — Designing for Society. Springer, London. https://doi.org/10.1007/978-1-4471-3754-2_9


  • DOI: https://doi.org/10.1007/978-1-4471-3754-2_9

  • Publisher Name: Springer, London

  • Print ISBN: 978-1-85233-766-7

  • Online ISBN: 978-1-4471-3754-2

