Kubiya Launches First Generative AI for Platform Engineering
AMSTERDAM – As platform engineering becomes a more widely adopted discipline, the platform team’s to-do list only increases. And as developers adopt new best practices and tools, that team needs to upskill and understand more and more of the distributed, often seven-layer stack. The team whose goal is to reduce developer burnout, in turn, has its own ever-growing cognitive load to bear.
Last October, Kubiya launched a conversational AI for DevOps teams, looking to improve the self-service, end user experience for both internal and external developers. Now, at KubeCon+CloudNativeCon Europe, Kubiya is releasing a new generative AI workflow engine geared to those platform engineering teams.
The New Stack spoke to CEO Amit Eyal Govrin about the growing use cases behind generative AI throughout the whole software development lifecycle — and how the industry is readier to embrace it, as Kubiya’s positioning evolves from “Siri for DevOps teams” to “ChatGPT for platform teams.”
Generative AI for End User Experience
Out of stealth mode for about six months now, Kubiya has found two common customer use cases: one for internal developers and one for external developers. The first is within enterprises dealing with internal complexities; the latter is for smaller software companies of about a thousand-plus, speeding up response time when customers require access to infrastructure.
Govrin spoke of a client, a leading digital media solutions and services company whose own customers include Disney and NBCUniversal. Those customers may ask for a specific video stored on the client's servers in a secure Amazon S3 bucket. Traditionally, when this happens, the client has to confirm the legitimacy of the request and authorize it through an approval flow. Then another staffer has to go to the S3 bucket and extend secret management over to the customer on a secure Slack channel. The customer then authorizes the request on their end in order to download the file.
“Imagine there are like four or five different sub-flows between cross-functional teams that go into a full transaction, and oftentimes they require the project manager on the provider’s side to go on to manually configure all of these different steps, and then get somebody from the TAM [technical account manager] to essentially copy over an S3 bucket and send it over in a secure manner to the end customer. All of that can take them, because of the back and forth and the human-in-the-loop intervention, oftentimes, a couple days,” Govrin said, harkening back to the days of FTP servers. This process usually requires a project manager, time, and people downloading things off of on-premise servers.
With Kubiya’s workflow automation, secure access controls, and knowledge management, he explained, a team can more easily validate a request and is even able to offer its end customers a service-level agreement (SLA) of a ten-minute turnaround, removing that friction and the humans in the loop.
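Setting Kubiya's product aside, the multistep approval flow Govrin describes can be sketched as a small state machine. The states, names, and asset key below are illustrative only, not Kubiya's actual schema:

```python
from dataclasses import dataclass, field
from enum import Enum


class State(Enum):
    REQUESTED = "requested"
    APPROVED = "approved"
    DELIVERED = "delivered"


@dataclass
class AccessRequest:
    requester: str
    asset: str  # e.g. an S3 object key (hypothetical)
    state: State = State.REQUESTED
    history: list = field(default_factory=list)

    def _move(self, new_state: State, actor: str) -> None:
        self.history.append((self.state, new_state, actor))
        self.state = new_state

    def approve(self, approver: str) -> None:
        if self.state is not State.REQUESTED:
            raise ValueError("only pending requests can be approved")
        self._move(State.APPROVED, approver)

    def deliver(self, operator: str) -> None:
        if self.state is not State.APPROVED:
            raise ValueError("request must be approved before delivery")
        # In the real flow, this is where a time-limited link to the
        # S3 object would be generated and posted to a secure channel.
        self._move(State.DELIVERED, operator)


req = AccessRequest("customer-editor", "s3://media-bucket/actor-x/latest.mp4")
req.approve("project-manager")
req.deliver("tam-on-call")
print(req.state.value)  # delivered
```

Each transition records who acted, which is the audit trail an approval flow like this needs before anyone hands a customer a download link.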
And it can be done where the customer relationship is already happening, like Slack, Notion, Confluence or GitBook. The human customer simply asks a question like "Can I have X actor's most recent video?" and the machine responds in kind.
Importantly, the conversational AI user experience, like within Slack, includes human feedback, such as a thumbs up or down or a yes or no to clarify what the user wants. That feedback in turn trains Kubiya's reinforcement learning with human feedback (RLHF), increasing its precision and continually customizing responses for the end user.
“That’s actually a big reason why ChatGPT has fallen short on a lot of enterprise use cases,” Govrin observed. So far, he says, ChatGPT is unable to validate its responses around hallucinations — which are outputs that sound plausible but are either inaccurate or unrelated — nor properly secure the integrity of the data in response to many business cases. Speaking of Kubiya, he continued, “Not only do we fine-tune the models on domain-specific data, but we also allow the user to give feedback — thumbs up, thumbs down — [which] will actually go and give additional reinforcement, to further fine-tune the model and be personalized to the organization and their end users.”
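Full RLHF fine-tuning is a heavyweight training loop, but the signal-collection side Govrin describes can be sketched in a few lines. This tally-and-smooth scheme is purely illustrative, not Kubiya's implementation:

```python
from collections import defaultdict

# Hypothetical feedback store: maps a response template id
# to [thumbs_up, thumbs_down] tallies.
feedback = defaultdict(lambda: [0, 0])


def record(template_id: str, thumbs_up: bool) -> None:
    """Record one thumbs-up/down signal for a response template."""
    feedback[template_id][0 if thumbs_up else 1] += 1


def preference(template_id: str) -> float:
    """Laplace-smoothed preference score in (0, 1); 0.5 means no signal yet."""
    up, down = feedback[template_id]
    return (up + 1) / (up + down + 2)


record("k8s-namespace-howto", True)
record("k8s-namespace-howto", True)
record("k8s-namespace-howto", False)
print(round(preference("k8s-namespace-howto"), 2))  # 0.6
```

A score like this could rank which candidate response an assistant surfaces first; the actual reinforcement step would feed such preferences back into model fine-tuning.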
Why Kubernetes Demands Conversational AI Help
Unsurprisingly, Kubiya’s most common use case — or user question — is the omnipresent, but not always simple to understand, Kubernetes orchestration.
“Kubernetes troubleshooting and operations are chaotic and tend to be riddled with human errors due to context switching and lack of domain expertise. Requests that typically are handled through kubectl, Helm, Argo CD and other tools and commands exist in data silos and outdated wikis,” Govrin told The New Stack. “Having the ability to democratize Kubernetes operations into natural language prompts levels the playing field for many inexperienced operators and significantly increases the velocity of experienced operators.”
If a developer were looking to set up a Kubernetes namespace, they could just type "Kubernetes" to the Kubiya Slackbot, which returns the full Kubernetes deployment rollout workflow, with a visualization from which they can select namespaces. They could also ask it more specifically, "Give me the workflow for how to create Kubernetes namespaces," and the Slackbot returns an easy way to trigger said workflow, as well as a link to further information about it.
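Under the hood, "create a namespace" bottoms out in a small Kubernetes object. A minimal sketch of the manifest such a workflow would ultimately apply, with hypothetical names and labels:

```python
import json


def namespace_manifest(name: str, team: str) -> dict:
    """Build the Kubernetes Namespace object a workflow like the one
    described would apply (e.g. piped to `kubectl apply -f -`)."""
    return {
        "apiVersion": "v1",
        "kind": "Namespace",
        "metadata": {
            "name": name,
            "labels": {"team": team},  # illustrative label
        },
    }


manifest = namespace_manifest("payments-staging", "payments")
print(json.dumps(manifest, indent=2))
```

The value of a conversational layer is that the developer never has to remember this shape; the bot fills in the fields and presents the result for approval.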
Notably, as each organization uses Kubernetes differently — part of its challenge — Kubiya’s fine-tuned model learns on and adapts to an organization’s domain-specific knowledge to hone its question-and-answer-like, human-to-machine conversational AI.
Natural Language Processing Extended to Platform Teams
Next, Govrin told The New Stack about extending the use case and breadth of Kubiya beyond individual user experiences to entire platform engineering teams. "How do you go and help create these automations and workflows without having to spit blood trying to stand up a new platform?" he asked. "Oftentimes platform engineering is a lot of toil, setting up and maintaining these systems, and creating new workflows is very much a burden on operators. That's one of the buying decisions they typically have: Do I want to invest all this time and effort into creating all these custom actions for myself?"
Platform engineering teams are already tasked not only with understanding the needs of application development teams, but connecting them to delivering business value faster — no small feat unto itself. But platform engineers also have to be able to comprehend and create the business logic and workflows, and then communicate that to app teams. That’s why, this week at KubeCon, Kubiya is releasing a new feature that offers generative AI for workflow creation within its new web app.
Govrin provided the example of typing into Kubiya "I want to create a new parameter for AWS SSM," referring to Amazon Web Services Systems Manager. The resulting workflow looks like a human-to-machine conversation, which can include a necessary human-in-the-loop trigger. This stands out against other platforms, like Jenkins, that rely on machine-to-machine interaction.
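"Create a new parameter for AWS SSM" ultimately maps onto Systems Manager's PutParameter API. A minimal sketch of the call such a workflow would make, with hypothetical parameter names; the boto3 call itself is left as a comment so the sketch stays self-contained and credential-free:

```python
def ssm_put_parameter_args(name: str, value: str, secret: bool = True) -> dict:
    """Build the keyword arguments for a boto3
    ssm_client.put_parameter(**args) call; SecureString values
    are encrypted at rest by Systems Manager."""
    return {
        "Name": name,
        "Value": value,
        "Type": "SecureString" if secret else "String",
        "Overwrite": False,  # refuse to clobber an existing parameter
    }


args = ssm_put_parameter_args("/platform/db/password", "s3cr3t")
# With boto3 installed and AWS credentials configured, this would be:
#   boto3.client("ssm").put_parameter(**args)
print(args["Type"])  # SecureString
```

The human-in-the-loop trigger Govrin mentions would sit between building these arguments and actually issuing the call.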
“What we’re saying is somebody comes in and asks for something that requires a different way of looking at a workflow. So all of how we look at workflows is human to machine. How do you go and create this experience that looks and acts very much like somebody on the other end is giving you the service,” Govrin said, explaining the aim of Kubiya’s generative AI conversational workflows.
This works as a workflow simulator, a way for platform teams to debug and evaluate the usability of new workflow candidates within existing systems and processes. Without investing a lot of time and energy, they are able to test, run and debug in a conversational AI, drag-and-drop setting. Platform engineers can also import YAML or JSON and lay it all out visually as low code within the existing business logic.
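The simulator idea, parsing an imported workflow definition and walking it without side effects, can be sketched with the standard library. The JSON schema below is hypothetical, not Kubiya's format:

```python
import json

# A made-up JSON workflow definition of the kind a platform
# engineer might import into such a tool.
WORKFLOW = """
{
  "name": "grant-s3-access",
  "steps": [
    {"id": "validate", "needs": []},
    {"id": "approve",  "needs": ["validate"]},
    {"id": "deliver",  "needs": ["approve"]}
  ]
}
"""


def dry_run(definition: str) -> list:
    """Parse a workflow and return its step ids in dependency order,
    without executing anything (a 'simulator' pass)."""
    wf = json.loads(definition)
    steps = wf["steps"]
    done, order = set(), []
    while len(order) < len(steps):
        progressed = False
        for step in steps:
            if step["id"] not in done and all(n in done for n in step["needs"]):
                done.add(step["id"])
                order.append(step["id"])
                progressed = True
        if not progressed:
            raise ValueError("cycle or missing dependency in workflow")
    return order


print(dry_run(WORKFLOW))  # ['validate', 'approve', 'deliver']
```

A dry run like this catches broken dependencies before a workflow candidate ever touches real infrastructure, which is the point of testing in a sandboxed, drag-and-drop setting.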
Even with typical low-code platform engineering tools, Govrin warns, it can take a few hours to test out new DevOps workflows. Kubiya's new web app is aware of both attribute-based access control (ABAC) and business logic. This, along with its organizational knowledge, he explains, enables it to answer domain- and business-specific questions faster.
And it allows teams to ask and answer questions in different ways, including some that only a given team may know, supporting both in-house colloquial names and the more traditional ones.
“Automatically it will capture that and train on that knowledge base,” he said. “It’s essentially organizational domain-specific knowledge that extends into generative workflow creation. And the two of them [both tools] give you an almost human-like experience.” Even for workflows that haven’t been created yet, a user can simply go into Slack and ask for something new.
Other active Kubiya use cases include wrapping the Backstage open source developer platform with natural language processing to enhance and speed up development with an interactive user experience.
“We’re abstracting all of the toil,” Govrin said.
This week, Kubiya also released a playground workspace, a sort of sandbox where developers and platform engineers can play around with the tool without having to commit to hooking it into their own environment.
Check back often this week for all things KubeCon+CloudNativeCon Europe 2023. The New Stack will be your eyes and ears on the ground in Amsterdam!