
Kafka on Kubernetes: Should You Adopt a Managed Solution?

A look at the various factors to consider when deciding whether to deploy Kafka yourself or to purchase a managed solution
May 11th, 2023 10:43am

Companies are increasingly choosing to run Apache Kafka on Kubernetes, and for good reason. Kubernetes provides a highly scalable, resilient orchestration platform that simplifies the deployment and management of Kafka clusters, allowing DevOps teams to spend less time worrying about infrastructure and more time building applications and services. Experts expect this trend to accelerate as more organizations use Kubernetes to manage their data infrastructure.

If your company is just getting started with Kafka in your Kubernetes environment, you’ll have several decisions to make, beginning with whether to deploy Kafka yourself or to purchase a managed solution.

The right answer will depend on your specific environment and the regulations that govern your industry. In this article, we’ll walk through the various factors at play, so you can help your organization make an informed decision.

Costs and Benefits of Self-Managed Kafka

Self-managed or “do-it-yourself” (DIY) Kafka has some advantages. You’ll have more control over your deployment, including whether to extend it across multiple clouds. It may be easier to align with your internal security and operations policies, accommodate your specific data residency concerns, and better control costs.

In this scenario, your in-house staff must perform the following tasks:

  • Setting up the infrastructure and storage
  • Installing and configuring the Kafka software
  • Setting up Apache ZooKeeper™, if necessary. (ZooKeeper is deprecated and will no longer be supported as of Kafka 4.0. After that point, Kafka will use KRaft, the Kafka Raft consensus protocol.)
  • Monitoring and troubleshooting your clusters
  • Securing the deployment
  • Scaling horizontally and vertically
  • Replicating data (for disaster recovery and availability)
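Several of these tasks come down to capacity math your team must own in a DIY deployment. As a rough sketch, the disk you need to provision follows from ingest rate, retention window, and replication factor. The 3x replication, 20% overhead, and the figures in the example are illustrative assumptions, not recommendations:

```python
def required_storage_gb(ingest_mb_per_sec: float,
                        retention_hours: float,
                        replication_factor: int = 3,
                        overhead: float = 1.2) -> float:
    """Estimate total cluster disk (GB) needed to hold retained data.

    overhead adds headroom for indexes and segments awaiting cleanup.
    All figures are illustrative; tune them to your own workload.
    """
    retained_mb = ingest_mb_per_sec * retention_hours * 3600
    return retained_mb * replication_factor * overhead / 1024

# 10 MB/s ingest, 7-day retention, 3 replicas:
print(round(required_storage_gb(10, 24 * 7), 1))  # → 21262.5 GB
```

The same style of back-of-the-envelope calculation applies to broker counts and partition sizing; the point is that in the DIY scenario, nobody does this arithmetic for you.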

Is Managed Kafka a Better Fit?

“Managed” Kafka is a service you can purchase from some hyperscalers, such as Amazon Web Services, and other third-party vendors. While the initial cost of the service may give you sticker shock, you may save money on hosting and payroll.

That said, some managed solutions may still require your team to have some level of Kafka expertise on board, especially during the setup phase.

With managed Kafka, you may give up some control over your data residency. What’s more, if you’re not sure how much compute or storage space you’ll need, you may end up with surprise hosting costs.
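To make "surprise hosting costs" concrete, it helps to compare a flat fee against usage-based charges before committing. The rates and fee below are invented placeholders, not any vendor's actual pricing:

```python
def monthly_usage_cost(gb_in: float, gb_out: float, gb_stored: float,
                       rate_in: float = 0.10, rate_out: float = 0.10,
                       rate_storage: float = 0.08) -> float:
    """Usage-based bill: per-GB ingress, egress, and storage (placeholder rates)."""
    return gb_in * rate_in + gb_out * rate_out + gb_stored * rate_storage

flat_fee = 1500.0  # hypothetical flat monthly fee for a managed cluster
usage = monthly_usage_cost(gb_in=8000, gb_out=16000, gb_stored=5000)
print(f"usage-based: ${usage:.2f} vs. flat: ${flat_fee:.2f}")  # usage wins or loses depending on volume
```

Running the numbers at your expected peak (not average) traffic is what prevents the end-of-month surprise.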

What’s Included in a Managed Solution?

While each Kafka vendor’s exact offering varies a bit, hosted solutions include setup of the cloud infrastructure necessary to run Kafka clusters, including virtual machines, network, storage, backups and security.

Most managed solutions (whether or not they include hosting) provide features that:

  • Install and manage the Kafka software, including upgrades, patches, and security fixes.
  • Monitor Kafka clusters for issues such as running out of memory or storage space and provide alerts and notifications when problems arise. These solutions usually also include tools for troubleshooting and resolving problems like the above.
  • Ensure that data stored in Kafka clusters is durable and available by replicating data across multiple nodes and data centers.
  • Perform a variety of additional functions, depending on the solution. For example, they may make it easy to install additional components, such as schema management, connectors and ksqlDB, which let you integrate with other data systems, transform data and build real-time applications.
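The monitoring features above ultimately boil down to comparing broker metrics against thresholds and raising alerts, whether your provider does it or your team does. A minimal sketch of that logic, with metric names and thresholds made up for illustration:

```python
def check_brokers(metrics: dict[str, dict[str, float]],
                  max_disk_pct: float = 85.0,
                  max_heap_pct: float = 80.0) -> list[str]:
    """Return human-readable alerts for brokers over disk or heap thresholds."""
    alerts = []
    for broker, m in metrics.items():
        if m.get("disk_used_pct", 0) > max_disk_pct:
            alerts.append(f"{broker}: disk at {m['disk_used_pct']}%")
        if m.get("heap_used_pct", 0) > max_heap_pct:
            alerts.append(f"{broker}: heap at {m['heap_used_pct']}%")
    return alerts

# Hypothetical metrics scraped from two brokers:
sample = {
    "broker-0": {"disk_used_pct": 91.0, "heap_used_pct": 40.0},
    "broker-1": {"disk_used_pct": 60.0, "heap_used_pct": 82.5},
}
print(check_brokers(sample))
```

A managed solution wraps exactly this kind of check in dashboards, alert routing, and on-call escalation, which is a large part of what you are paying for.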

The Decision-Making Process: Where to Start

Installing, configuring, and maintaining Kafka isn’t merely a matter of opening the manual and diving in. Every organization is different. Your Kafka implementation will vary, depending on your cloud provider, the size of your deployment, the applications you’re running and the size of your company, among other factors. So you’ll need a team with the specific skills required to perform the tasks in your unique environment.

In some companies, more than one department may be involved: one to install the clusters and set up the infrastructure, and another to administer Kafka, meaning set up topics, configure the producers and consumers, and connect it all to the rest of your application(s). Even if you have folks on board with some Kafka experience, they may not have the knowledge they need to set it up in a cloud or Kubernetes environment. In that case, you may have to hire for this skill set or train your existing staff, and it may take them a while to come up to speed. This indirect cost may not be trivial, especially if you work for a smaller organization.

To muddy the waters further, there are many “flavors” of technical staff who work with Kafka. Many don’t have the word “Kafka” in their titles. However, a quick search on LinkedIn turned up a few of the job titles that do:

  • Kafka site reliability engineer (SRE)
  • Staff software engineer, Kafka
  • Kafka admin
  • Kafka developer
  • Kafka engineer
  • Kafka support engineer
  • Java developer with Kafka

Depending on your location, as well as on the seniority and the specific job responsibilities, the cost to hire staff members to work on Kafka can vary tremendously.

In some cases, you may wish to split the job into two (the infrastructure responsibilities and the development responsibilities). In others, you may want to hire folks who will have responsibilities beyond just your Kafka deployment. Either way, this is one of the major costs associated with DIY Kafka.

If you choose a managed Kafka solution, you won’t need as much Kafka expertise on your team, since your provider will take care of most of the operational tasks involved.

However, as mentioned earlier, some solutions may still require you to perform a significant number of setup tasks. You’ll still need staff to build your Kafka-based applications and/or integrate them into your application ecosystem.
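Even with a fully managed cluster, the application-side configuration remains your team's job. As a sketch, these are the kinds of producer settings an application team typically has to pin down; the broker address is a placeholder, and the validation check is illustrative, running without a live broker:

```python
# Hypothetical producer settings for a durability-focused application.
producer_config = {
    "bootstrap.servers": "kafka.example.internal:9092",  # placeholder address
    "acks": "all",                # wait for all in-sync replicas
    "enable.idempotence": True,   # avoid duplicates on retry
    "compression.type": "lz4",
    "linger.ms": 20,              # batch messages for up to 20 ms
}

def validate(config: dict) -> list[str]:
    """Flag settings that weaken delivery guarantees."""
    problems = []
    if config.get("acks") != "all":
        problems.append("acks should be 'all' for durability")
    if not config.get("enable.idempotence"):
        problems.append("idempotence disabled: retries may duplicate")
    return problems

print(validate(producer_config))  # → [] when the config is durable
```

Choices like these (durability vs. latency, batching vs. immediacy) stay with your developers no matter which deployment model you pick.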

Consider Your Cloud Provider

Depending on the Kafka solution you’re considering, you’ll need to think about hosting. While this is obvious in the DIY scenario, there are still decisions to make with managed Kafka. Some providers, such as Confluent and Amazon Managed Streaming for Apache Kafka (MSK), include cloud hosting as part of their solutions. Others, such as Aiven and Cisco Calisti, are not hosted solutions. Still others, such as Instaclustr, give you the option to run your Kafka deployment in their cloud environment or use your own. So you’ll need to factor in cloud cost and convenience as you make your choices.

Open Source: A Hybrid Option

If you like the idea of using some of the features available in a managed Kafka solution, but would still prefer to retain control over your data, compute and storage, consider using an open source solution.

An example is Koperator, a Kubernetes operator that automates provisioning, management, autoscaling, and operations for Kafka clusters deployed to Kubernetes.

Koperator provisions secure, production-ready Kafka clusters and provides fine-grained configuration and advanced topic and user management through custom resources. Have a look at the project and feel free to contribute.

Learn more about Cisco Open Source and join our Slack community to be part of the conversation.
