OpenTelemetry Gets Better Metrics
VALENCIA, Spain – OpenTelemetry is defined by its creators as a collection of APIs used to instrument, generate, collect and export telemetry data for observability. That data takes the form of metrics, logs and traces, and OpenTelemetry has emerged as a popular CNCF project. For this interview, we’re delving deeper into OpenTelemetry and its metrics support, which has just become generally available.
The metrics specifications are designed to connect metrics to other signals and to provide a migration path from OpenCensus, enabling customers to move to OpenTelemetry while continuing to work with existing metrics-instrumentation protocols and standards — including, of course, Prometheus.
In this episode of The New Stack Makers podcast, recorded on the show floor of KubeCon + CloudNativeCon Europe 2022, Morgan McLean, director of product management at Splunk; Ted Young, director of developer education at LightStep; and Daniel Dyla, senior open source architect at Dynatrace, discussed how OpenTelemetry is evolving and the magic of observability in general for DevOps.
OpenTelemetry can be described as a standard for the artifacts, data and information users need to extract from their infrastructure. “It’s generally been focused on backend infrastructure, and the status is that distributed tracing has been mature and generally available for over a year now,” McLean said. “Following the general availability for metrics, logging support is coming later this year.”
The idea is also for OpenTelemetry to be “observability-tool agnostic,” Young said. “Since day one, a big focus for us has been not to try to choose winners or anything like that. Any vendor that wants to offer any open source tool can implement a receiver or a plugin in the OpenTelemetry collector, and begin working with the data.”
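As a rough illustration of that receiver model, a collector pipeline can be wired up in its YAML configuration. The sketch below scrapes Prometheus metrics and forwards them over OTLP; the job name, targets and backend endpoint are placeholders, not values from the interview:

```yaml
# Hypothetical OpenTelemetry Collector config: Prometheus in, OTLP out.
receivers:
  prometheus:
    config:
      scrape_configs:
        - job_name: "example-app"          # placeholder job name
          scrape_interval: 15s
          static_configs:
            - targets: ["localhost:8888"]  # placeholder scrape target

exporters:
  otlp:
    endpoint: "my-backend:4317"            # placeholder backend endpoint

service:
  pipelines:
    metrics:
      receivers: [prometheus]
      exporters: [otlp]
```

Because receivers and exporters are pluggable, swapping the backend means changing the exporter entry rather than re-instrumenting the application — which is the vendor-neutral point Young is making.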
At the end of the day, it is widely assumed that it is very hard to do any kind of deep analysis without accurate and reliable observability data. Different DevOps and CI/CD techniques and tools “are great,” but good, robust and accurate data is a prerequisite for all of them, Dyla said. “You can’t do any of it until the data is there.”