Container orchestration frameworks play a critical role in modern cloud computing paradigms such as cloud-native and serverless computing. They significantly impact the quality and cost of service deployment, as they manage many performance-critical tasks such as container provisioning, scheduling, scaling, and networking. Consequently, a comprehensive performance assessment of container orchestration frameworks is essential. However, to date, no benchmarking approach covers the many different tasks implemented in such platforms while also supporting the evaluation of different technology stacks. In this paper, we present a systematic approach that enables benchmarking of container orchestrators. Based on a definition of container orchestration, we define the core requirements and benchmarking scope for such platforms. Each requirement is then linked to metrics and measurement methods, and a benchmark architecture is proposed. We introduce COFFEE, a benchmarking tool that supports the definition of complex test campaigns for container orchestration frameworks. We demonstrate the potential of our approach with case studies of the frameworks Kubernetes and Nomad in a self-hosted environment and on the Google Cloud Platform. The presented case studies focus on container startup times, crash recovery, rolling updates, and more.
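One of the metrics highlighted above, container startup time, can be understood as the elapsed time between issuing a start request and the workload becoming available. The following is a minimal, hypothetical sketch of such a timing harness in Python; it is not part of the paper's COFFEE tool, and the `docker` invocation shown in the comment is an assumption about the target runtime. For self-containment, the sketch times start-to-exit of an arbitrary command, whereas a real benchmark would wait for a readiness signal instead.

```python
import subprocess
import time


def measure_startup(cmd: list[str]) -> float:
    """Return the wall-clock seconds from launching `cmd` to its completion.

    A real container-startup benchmark would poll for readiness
    (e.g., a health endpoint) rather than waiting for process exit;
    this simplification keeps the sketch runnable anywhere.
    """
    start = time.perf_counter()
    subprocess.run(cmd, check=True, capture_output=True)
    return time.perf_counter() - start


# Hypothetical usage against a container runtime (illustrative only):
#   measure_startup(["docker", "run", "--rm", "alpine", "true"])
```

In a test campaign like those described in the paper, such measurements would be repeated many times per framework (e.g., Kubernetes and Nomad) and aggregated into distribution statistics rather than single samples.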
Citation
Straesser, M., Mathiasch, J., Bauer, A., & Kounev, S. (2023). A Systematic Approach for Benchmarking of Container Orchestration Frameworks. In ICPE 2023 - Proceedings of the 2023 ACM/SPEC International Conference on Performance Engineering (pp. 187–198). Association for Computing Machinery, Inc. https://doi.org/10.1145/3578244.3583726