
What Monitoring Can Learn from Major League Baseball

Jul 23rd, 2018 11:28am by Mike Kelly

Blue Medora sponsored this post.

In 1984, Bill James was frustrated that Major League Baseball (MLB) refused to publish play-by-play accounts of every game.

James recognized the value that standardized, easy-to-access, in-depth data could bring to his approach of analyzing the game of baseball through specific aspects of individual player performance, a practice now known as sabermetrics.

Because MLB didn’t have all the data he needed for his model to work, James recruited a network of fans to collect and distribute this information. The effort, known as Project Scoresheet, later evolved into STATS Inc., the company that provided data and analysis to every major media outlet before Fox Sports acquired it in 2001.

Figure 1: Could MIaaS be to monitoring what Bill James’ STATS Inc was to baseball?

Mike Kelly, CTO, Blue Medora
As CTO, Mike Kelly is focused on advancing Blue Medora’s overall strategy and direction and defining the future of monitoring across the IT landscape. He also leads Blue Medora’s VMware product integrations, product champions and new product development teams. Before becoming CTO, Kelly led the creation and development of a leading software solution for monitoring and managing Oracle databases on VMware. Prior to Blue Medora, his career was focused on new product development and research, and includes experience at every stage of the product development cycle. Mike holds a BSc. in Computer Engineering from Western Michigan University.

It was around that time that Paul DePodesta changed the game by taking an analytical, even algorithmic, approach to selecting players for the Oakland A’s, a team struggling to stay competitive with a payroll one-third the size of teams like the New York Yankees.

The team that A’s general manager Billy Beane assembled using DePodesta’s analysis of James’s data won a record-setting 20 games in a row in 2002. This then highly unorthodox yet ultimately successful “coaching by algorithm” approach changed America’s great game forever, and later served as the basis for Michael Lewis’ book “Moneyball: The Art of Winning an Unfair Game” and its Hollywood adaptation.

Today, after a decade of working in the IT monitoring business, I can affirm that algorithmically guided analytics is also shaking up the IT world. Like baseball in the early 2000s, monitoring is a game that algorithms have the opportunity to completely change. Advances in machine learning have the potential to elevate monitoring to observability.

But, like baseball before the advent of STATS Inc., our traditional data-collection methods are holding us back. Sadly, some of the most incredible analytics engines rely on a hodgepodge of data-collection sources: some integrations are built by technology providers, while others come from open source projects or community members.

Figure 2: The state of monitoring integrations today has been exacerbated by the API explosion and limits the accuracy of predictive analytics.

Most large organizations run six or more monitoring tools. Maintaining dozens of individual integrations can actually turn into a significant investment, one with the potential to rival monitoring platform costs over time.

Perhaps most importantly, though, the results often fall short. It can be a struggle to get clear insights when things go wrong.

Monitoring Integration as-a-Service (MIaaS) has the opportunity to change that by decoupling data collection from data analytics: in effect, building an integration layer that any monitoring or analytics platform an organization uses can draw upon.
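The decoupling idea can be sketched in a few lines. This is a hypothetical, minimal illustration, not a real MIaaS API: collectors emit metrics in one normalized shape, and any number of analytics platforms subscribe to the same shared stream. All class and field names here are assumptions made for the example.

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Metric:
    """One normalized reading, regardless of which technology produced it."""
    resource: str          # e.g. "postgres://db-01" (illustrative)
    name: str              # e.g. "connections.active"
    value: float
    unit: str              # normalized unit, e.g. "count", "bytes"
    tags: dict = field(default_factory=dict)

class IntegrationLayer:
    """Fan out normalized metrics from any collector to any platform."""
    def __init__(self) -> None:
        self.sinks: list[Callable[[Metric], None]] = []

    def register_sink(self, sink: Callable[[Metric], None]) -> None:
        # Each monitoring/analytics platform registers one sink.
        self.sinks.append(sink)

    def publish(self, metric: Metric) -> None:
        # Collectors publish once; every platform receives the same data.
        for sink in self.sinks:
            sink(metric)
```

Because collection happens once and the normalized stream fans out, adding a second analytics platform means registering a sink, not building a second set of integrations.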

Figure 3: MIaaS offers a single, self-maintaining connection from every enterprise technology in your environment to any monitoring platform.

Here are four modern monitoring use cases where MIaaS really makes sense:

Microservices Monitoring

Microservice architectures require a different approach to monitoring integrations, one that takes into account the temporary nature of many resources. MIaaS makes it easy to auto-discover new resources in dynamic tech stacks, such as those using containers or serverless technologies.
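The core of auto-discovery is a reconciliation loop: periodically diff the set of resources currently running against the set already being monitored, then start and stop collectors accordingly. The sketch below is a simplified assumption of how such a loop might work; in practice the running set would come from a runtime or orchestrator API (Docker, Kubernetes, a serverless platform).

```python
def reconcile(known: set[str], running: set[str]) -> tuple[set[str], set[str]]:
    """Diff monitored resources against running ones for one discovery cycle.

    Returns (to_start, to_stop): collectors to attach for newly seen
    resources, and collectors to tear down for resources that are gone.
    """
    to_start = running - known   # new resources: begin collecting
    to_stop = known - running    # vanished resources: stop collecting
    return to_start, to_stop
```

Running this on every cycle keeps monitoring coverage in step with short-lived containers and functions without manual integration work.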

MIaaS also goes deeper than community or open source integrations like collectd or StatsD, because its more flexible ingestion framework can accommodate a wider variety of API types. This makes it possible to deliver component-level data in highly abstracted environments. See the comparison below of a Redshift database running in a containerized environment:

Figure 4: The flexibility of a MIaaS framework enables highly granular insights into system behavior, along with rich relational context.

Hybrid Cloud Monitoring

Pinpointing application performance problems in a cloud-native application is already tough enough. Throw in on-premises stacks and one or more cloud providers, and suddenly you have an environment with multiple moving parts, affecting application performance from several points.

MIaaS can unify visibility across legacy datacenter technologies and multiple clouds. Understanding the relationships, or context, between the individual components of an entire system, like a node and a cluster or a container and a host, makes alerting more accurate and root cause analysis much quicker.
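To make the value of that relational context concrete, here is a hypothetical sketch: if the integration layer knows which host a container runs on, a container alert can carry its parent's health, pointing root cause analysis at the right layer. The topology map and names are illustrative; a real system would discover these relationships automatically.

```python
# Illustrative container-to-host topology (in reality, auto-discovered).
CONTAINER_HOST = {"web-1": "node-a", "web-2": "node-b"}

def enrich_alert(container: str, host_health: dict[str, str]) -> dict:
    """Attach parent-host health to a container alert for faster triage."""
    host = CONTAINER_HOST.get(container)
    return {
        "container": container,
        "host": host,
        # If the host itself is degraded, the container alert is a symptom.
        "host_status": host_health.get(host, "unknown"),
    }
```

An alert for `web-1` that arrives tagged with a degraded `node-a` immediately reframes the investigation from the container to the host underneath it.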

Multicloud Application Optimization

Research from Edwin Yeun at ESG suggests the greatest driver of multicloud adoption is individual application or workload requirements. Each of the major public cloud providers offers its own monitoring solution, but it becomes difficult to compare apples to apples: metrics can be reported at different depths or even in different units.

MIaaS levels the playing field for all public cloud metrics. It also makes it easy to standardize on one cloud provider’s monitoring platform, or to use third-party analytics tools to analyze across clouds from an application-oriented view via an application performance monitoring (APM) platform.
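"Leveling the playing field" largely comes down to unit normalization. As a hedged example, suppose one cloud reports memory in MiB and another in GiB: converting every reading to bytes before it reaches the analytics platform makes cross-cloud comparison meaningful. The unit table below is illustrative, not tied to any vendor's actual reporting.

```python
# Standard binary unit multipliers for converting memory readings to bytes.
UNIT_IN_BYTES = {
    "bytes": 1,
    "KiB": 1024,
    "MiB": 1024 ** 2,
    "GiB": 1024 ** 3,
}

def normalize_memory(value: float, unit: str) -> float:
    """Convert a memory reading to bytes so clouds compare apples to apples."""
    return value * UNIT_IN_BYTES[unit]
```

With every provider's readings in the same unit, a single dashboard or APM query can rank workloads across clouds directly.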

DevOps Adoption

As I mentioned, most large organizations run multiple monitoring tools. This can be for a variety of reasons, but one of them may be individual team preferences. As the traditional silos of “Dev” and “Ops” continue to collide, ensuring that everyone has access to the same data through a shared MIaaS layer is an easy way to increase collaboration.

You may recognize Bill James and Paul DePodesta’s story from the book or movie. If so, you may remember how James’s data and DePodesta’s analysis changed the game of baseball forever. Decoupling data collection from data analytics could revolutionize IT monitoring in the same way: MIaaS could unleash the full potential of the best AI technologies by eliminating the challenges that data collection currently imposes.

Feature image via Pixabay.
