Why Knowledge Management Doesn’t Work

AI-powered chatbots give users instant access to common information without interrupting your experts.
Feb 7th, 2024 6:02am
Featured image by Syd Wachs on Unsplash.

DevOps and platform teams are bombarded with nonstop messages interrupting their work — from one-off questions they’ve answered dozens of times to requests for information readily available in docs: “How do I connect to the database? Why isn’t my deployment working? What does this error mean?”

It’s not just DevOps — users send volumes of security questions, compliance inquiries, human resources requests and so on. No one wants to waste an expert’s time and energy responding to those repetitive questions, but there hasn’t really been a better option.

And the consequences of these interruptions go well beyond the time it takes to answer them. There's the cost of context switching: researchers at UC Irvine found that it takes about 25 minutes after an interruption to fully return to the previous task. There's also the opportunity cost when higher-impact work gets squeezed out by repeat questions and backlogs grow longer and longer.

The Same Questions, Over and Over

Much of a DevOps engineer's job is providing internal support for their platforms, pipelines, docs and more. Teams often rotate through on-call shifts to field internal questions across multiple communication platforms. These shifts pull DevOps engineers and site reliability engineers (SREs) away from building tools that make reliability processes efficient and consistent, which often leads to burnout and turnover.

While DevOps teams may produce documentation for developers to self-serve information, most people don’t want to manually search through documents (often in different knowledge stores) to find answers. Instead, team chats are where people now collaborate, ask questions and get meaningful answers in real time.

The problem with traditional knowledge management is that it’s focused on capturing and storing information. It doesn’t deliver on the knowledge-sharing and distribution promise; instead, that burden is still on the asker. This has led to DevOps teams being bombarded with messages in their team chat. They get the same questions, over and over, from different people who don’t realize their question was answered a few days ago in the same channel. So they do what they always do: go into Slack or Microsoft Teams and ask the question, knowing that an expert will provide the correct answer.

Connecting People with the Right Knowledge

Artificial intelligence (AI) chatbots powered by retrieval-augmented generation (RAG) and large language models (LLMs) are one solution to this problem. By pulling in information from a wide variety of sources — previously answered questions in chat channels (including Slack and Teams), knowledge bases, community-generated content in GitHub, Notion and Confluence, company-uploaded documents, administrator-specified websites and more — the chatbot can answer repetitive questions automatically and instantly. The asker doesn't have to change how or where they ask questions. They get the information they need without taxing experts' time.
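
As a rough sketch of the indexing side of that pattern, the snippet below normalizes content from a few sources into a common shape and embeds it for semantic search. The documents, source names and the choice of sentence-transformers as the embedding model are illustrative assumptions; a real deployment would pull this content through each platform's API.

```python
# Minimal indexing sketch for a RAG-style chatbot. The documents and source
# names are hypothetical; the embedding model is one common open source choice.
from sentence_transformers import SentenceTransformer

# Content from chat, wikis and repos, normalized into one shape.
documents = [
    {"source": "slack:#devops-help",
     "text": "Connect to the staging database through the read-only proxy on port 5433."},
    {"source": "confluence:Deployments",
     "text": "Deployments fail with exit code 137 when a pod exceeds its memory limit."},
    {"source": "github:runbooks/oncall.md",
     "text": "Error E1042 means the service account token expired; rerun the token refresh job."},
]

model = SentenceTransformer("all-MiniLM-L6-v2")

# Embed every document once and keep each vector next to its metadata.
vectors = model.encode([d["text"] for d in documents], normalize_embeddings=True)
index = list(zip(documents, vectors))
```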

This technology indexes unstructured information from chat conversations and combines it with structured knowledge in various repositories. The data is stored for fast semantic search; when a user's query returns one or more matches, the system fetches the related data and passes it to the LLM to generate the best response. This lets the chatbot answer questions that are ill-formed or imprecise, summarize its findings and cite its sources. It can save hours of time and energy otherwise spent manually answering redundant questions.
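
Continuing the hypothetical index from the previous sketch, the retrieval step might look like the following: embed the question, rank documents by similarity, and pass the best matches to an LLM along with their sources so the answer can cite them. The OpenAI chat-completions call is an assumption for illustration only; any completion endpoint would work, and the article does not describe QueryPal's actual stack.

```python
# Retrieval-and-answer sketch, reusing `model` and `index` from the indexing
# snippet above. The LLM call is illustrative; swap in whatever completion
# API your deployment uses.
import numpy as np
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def answer(question: str, index, top_k: int = 3) -> str:
    query_vec = model.encode([question], normalize_embeddings=True)[0]
    # Vectors are normalized, so a dot product equals cosine similarity.
    ranked = sorted(index, key=lambda pair: float(np.dot(pair[1], query_vec)), reverse=True)
    hits = ranked[:top_k]

    context = "\n".join(f"[{doc['source']}] {doc['text']}" for doc, _ in hits)
    prompt = (
        "Answer the question using only the context below and cite the "
        "bracketed sources you relied on.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

# e.g. answer("How do I connect to the staging database?", index)
```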

This also saves users time by bringing all of your best practices, how-to guides, processes, resolved conversations and frequently asked questions to any authorized employee in real time — all in chat. Having information come to users, instead of having to search through different repositories, is an efficient way to interact with enterprise data.
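
To make answers arrive in chat rather than in yet another portal, the bot can listen where people already ask. The sketch below wires the hypothetical answer() function from the previous snippet into a Slack listener using Bolt for Python; the event choice and token names follow Bolt's conventions, and a Teams bot would be wired up similarly.

```python
# Hypothetical wiring of the answer() sketch into Slack with Bolt for Python.
# Replies land in a thread on the message that asked the question.
import os
from slack_bolt import App
from slack_bolt.adapter.socket_mode import SocketModeHandler

app = App(token=os.environ["SLACK_BOT_TOKEN"])

@app.event("app_mention")
def handle_mention(event, say):
    question = event["text"]
    say(text=answer(question, index), thread_ts=event["ts"])

if __name__ == "__main__":
    SocketModeHandler(app, os.environ["SLACK_APP_TOKEN"]).start()
```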

This AI chatbot is not about replacing humans; it's about supporting people and taking away the mundane so teams can focus on higher-impact work.

For more information on implementing a chatbot that automatically surfaces relevant information for users in chat, learn how to get started with QueryPal.
