Abstract
Selecting the right set of requirements for a product release depends on how well the organisation succeeds in prioritising the requirement candidates. This paper describes two consecutive controlled experiments comparing requirements prioritisation techniques, with the objective of understanding differences in time consumption, ease of use, and accuracy. The first experiment evaluates Pair-wise comparisons and a variation of the Planning game. As the Planning game proved superior, the second experiment was designed to compare the Planning game with Tool-supported pair-wise comparisons. The results indicate that manual Pair-wise comparisons is the most time-consuming of the techniques, and also the least easy to use. Tool-supported pair-wise comparisons is the fastest technique, and it is as easy to use as the Planning game. The techniques do not differ significantly in accuracy.
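To illustrate the kind of technique the abstract refers to, the sketch below derives a priority order from exhaustive pair-wise comparisons using a simple win-count scheme. This is an illustrative assumption, not necessarily the exact scoring used in the experiments; the `prefer` function stands in for the human judge.

```python
from itertools import combinations

def pairwise_rank(requirements, prefer):
    """Exhaustive pair-wise comparisons: each pair of requirements is
    compared once, and requirements are ranked by how many
    comparisons they win.  prefer(a, b) returns the preferred item."""
    wins = {r: 0 for r in requirements}
    for a, b in combinations(requirements, 2):
        wins[prefer(a, b)] += 1
    # Most-preferred requirement (most wins) first.
    return sorted(requirements, key=lambda r: wins[r], reverse=True)

# Illustrative stand-in for a human judge: prefer the shorter name.
reqs = ["SMS", "Bluetooth", "Alarm", "Calendar"]
print(pairwise_rank(reqs, lambda a, b: a if len(a) < len(b) else b))
```

Note that an exhaustive scheme requires n(n-1)/2 comparisons, which grows quadratically with the number of requirements and is consistent with the abstract's finding that manual pair-wise comparisons is the most time-consuming technique.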
Acknowledgments
The authors would like to thank all experiment participants for contributing their time and effort.
Editor: Daniel Berry
Appendix
Table A1 Experiment 1 using counter-balancing design (PWC = Pair-wise comparisons, PG = Planning game)
Subject | Nbr of requirements | Tech 1 | Tech 2 | Criterion 1 | Criterion 2 |
1 | 8 | PWC | PG | Price | Value |
2 | 8 | PWC | PG | Price | Value |
3 | 16 | PWC | PG | Price | Value |
4 | 16 | PWC | PG | Price | Value |
5 | 8 | PWC | PG | Value | Price |
6 | 8 | PWC | PG | Value | Price |
7 | 16 | PWC | PG | Value | Price |
8 | 16 | PWC | PG | Value | Price |
9 | 8 | PG | PWC | Price | Value |
10 | 8 | PG | PWC | Price | Value |
11 | 16 | PG | PWC | Price | Value |
12 | 16 | PG | PWC | Price | Value |
13 | 8 | PG | PWC | Value | Price |
14 | 8 | PG | PWC | Value | Price |
15 | 16 | PG | PWC | Value | Price |
16 | 16 | PG | PWC | Value | Price |
Table A2 Experiment 2 using counter-balancing design (TPWC = Tool-supported pair-wise comparisons, PG = Planning game)
Subject | Occasion | Tech 1 | Tech 2 | Criterion 1 | Criterion 2 |
1 | PM | TPWC | PG | Value | Price |
2 | PM | TPWC | PG | Value | Price |
3 | PM | TPWC | PG | Value | Price |
4 | PM | TPWC | PG | Value | Price |
5 | EV | TPWC | PG | Value | Price |
6 | EV | TPWC | PG | Value | Price |
7 | EV | TPWC | PG | Value | Price |
8 | PM | TPWC | PG | Price | Value |
9 | PM | TPWC | PG | Price | Value |
10 | PM | TPWC | PG | Price | Value |
11 | PM | TPWC | PG | Price | Value |
12 | PM | TPWC | PG | Price | Value |
13 | EV | TPWC | PG | Price | Value |
14 | EV | TPWC | PG | Price | Value |
15 | PM | TPWC | PG | Value | Price |
16 | PM | PG | TPWC | Value | Price |
17 | PM | PG | TPWC | Value | Price |
18 | PM | PG | TPWC | Value | Price |
19 | EV | PG | TPWC | Value | Price |
20 | EV | PG | TPWC | Value | Price |
21 | EV | PG | TPWC | Value | Price |
22 | PM | PG | TPWC | Price | Value |
23 | PM | PG | TPWC | Price | Value |
24 | PM | PG | TPWC | Price | Value |
25 | PM | PG | TPWC | Price | Value |
26 | PM | PG | TPWC | Price | Value |
27 | PM | PG | TPWC | Price | Value |
28 | PM | PG | TPWC | Price | Value |
29 | EV | PG | TPWC | Price | Value |
30 | EV | PG | TPWC | Price | Value |
Table A3 Requirements prioritised in the experiments
Requirement | Selected for 8 requirements |
Alarm | X |
Bluetooth | |
Calculator | |
Calendar | X |
Call alert creation | |
Colorscreen | X |
Games | X |
IR | |
MMS | |
Notebook | X |
Phonebook | |
SMS | |
Timer | X |
WAP | X |
Vibrating call alert | X |
Voice control | |
Cite this article
Karlsson, L., Thelin, T., Regnell, B. et al. Pair-wise comparisons versus planning game partitioning—experiments on requirements prioritisation techniques. Empir Software Eng 12, 3–33 (2007). https://doi.org/10.1007/s10664-006-7240-4