
Comparison of End-to-End Testing Tools for Microservices: A Case Study

  • Conference paper
Information Technology and Systems (ICITS 2021)

Abstract

Microservices have emerged as an architectural style that provides several benefits but also poses some challenges. One such challenge is testability, since an application may have hundreds or thousands of services operating together, and each of them needs to be tested as it evolves. To overcome this challenge, test automation is key, along with the use of effective and efficient testing tools. Hence, we aim to contribute to this area by evaluating two tools that support end-to-end (E2E) testing of microservices. E2E tests verify whether the system works correctly as a whole, which is particularly relevant for systems composed of microservices. In this work, we first surveyed E2E testing tools reported in the academic literature and by industry practitioners. Then, we applied the IEEE 14102-2010 standard to evaluate those tools. The two top-rated tools, Jaeger and Zipkin, were selected for further evaluation of their effectiveness and efficiency. Results from our case study reveal that Jaeger is more efficient and effective than Zipkin in terms of execution and failure-detection times, as well as the information provided to detect faults, severity, and coverage.
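
Both tools evaluated in the case study, Jaeger and Zipkin, are distributed-tracing backends: each service in the system under test emits spans, and the backend assembles them into end-to-end traces that show where a request slowed down or failed. As a minimal illustration (not taken from the paper; the package names, service name, and span attributes below are assumptions), a microservice could be instrumented with OpenTelemetry roughly as follows, with the exporter pointed at a Jaeger or Zipkin collector in a real deployment:

```python
# Illustrative sketch only (not from the paper): instrumenting a toy
# "orders-service" with OpenTelemetry so its spans can be collected by a
# tracing backend such as Jaeger or Zipkin. Names and attributes are assumed.
from opentelemetry import trace
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor, ConsoleSpanExporter

# Tag every span with a service name so the tracing UI can group spans per service.
provider = TracerProvider(resource=Resource.create({"service.name": "orders-service"}))
# ConsoleSpanExporter keeps this sketch self-contained; in a real setup an OTLP
# (or Zipkin) exporter pointed at the collector would be registered instead.
provider.add_span_processor(BatchSpanProcessor(ConsoleSpanExporter()))
trace.set_tracer_provider(provider)
tracer = trace.get_tracer(__name__)

def place_order(order_id: str) -> None:
    # Parent span: the end-to-end operation exercised by an E2E test.
    with tracer.start_as_current_span("place_order") as span:
        span.set_attribute("order.id", order_id)
        # Child span: a downstream call whose latency and failures become
        # visible in the assembled trace.
        with tracer.start_as_current_span("charge_payment"):
            pass  # the call to the payment service would go here

if __name__ == "__main__":
    place_order("o-42")  # finished spans are exported (printed) at exit
```

Traces collected this way are the raw material that Jaeger and Zipkin visualize when engineers look for failures across service boundaries.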


Notes

  1. https://www.jaegertracing.io/.

  2. https://zipkin.io/.

  3. https://www.docker.com/community/captains.

  4. https://github.com/microservices-demo/microservices-demo.


Acknowledgments

This work was partially supported by University of Costa Rica’s projects No. 834-B8-A27 and 834-C0-726, financed by the Research Center on ICT (CITIC) and the Department of Computer Science (ECCI). We thank the Empirical Software Engineering Group (ESEG) for its valuable feedback and help.

Author information

Corresponding author

Correspondence to Cristian Martínez Hernández.

Copyright information

© 2021 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Hernández, C.M., Martínez, A., Quesada-López, C., Jenkins, M. (2021). Comparison of End-to-End Testing Tools for Microservices: A Case Study. In: Rocha, Á., Ferrás, C., López-López, P.C., Guarda, T. (eds) Information Technology and Systems. ICITS 2021. Advances in Intelligent Systems and Computing, vol 1330. Springer, Cham. https://doi.org/10.1007/978-3-030-68285-9_39
