With the proliferation of container orchestrators, and with people comparing and evaluating them for fitness for the workloads they wish to run, one topic has so far been neglected: benchmarking. When Michael was still at Mesosphere, they started a project called the 'Cloud Native Benchmarking Group' with the goal of providing a vendor-neutral benchmark for cloud native systems. Now, at Red Hat, the work continues.
The first of these benchmarks focuses on orchestrators (see the initial work here: https://github.com/cnbm/container-orchestration). In this talk, Michael will cover design considerations for the benchmark, how to make it modular and extensible, and the challenges involved in establishing a benchmark in an objective manner. Initial talks with the CNCF were positive, so once the benchmark is completed (in mid-2017) and other parties start contributing to it, they hope it will find a home in the CNCF.
Keynote: How to Benchmark Container Orchestrators
Jörg is a software engineer at Mesosphere in San Francisco. In his previous life, he implemented distributed and in-memory databases and conducted research in the Hadoop and cloud areas. His speaking experience includes various Meetups, international conferences, and lecture halls.
Michael is a Developer Advocate for Go, Kubernetes, and OpenShift at Red Hat, where he helps appops build and operate distributed services. His background is in large-scale data processing and container orchestration, and he is experienced in advocacy and standardization at the W3C and IETF. Before Red Hat, Michael worked at Mesosphere, at MapR, and at two research institutions in Ireland and Austria. He contributes to open source software (mainly using Go), blogs, and hangs out on Twitter too much.