What You Can Do with Vector Search
Monterey AI analyzes millions of user voices, aggregating feedback, reviews, bug reports, and support tickets from social media, CRM, and community channels; it then clusters common themes and trends into recommendations that inform product development. The platform connects customer feedback to the product development process, from the front lines of customer support to leadership, so teams stay aligned with user needs. Companies like Figma and Comcast use it.
In an interview with the founders and lead engineers, we asked what kind of team it takes to build an LLM-style service and why they chose tools such as Zilliz for vector search, which is powered by Milvus, the open source vector database.
“There’s no one out there with like, a decade of building LLM-based products,” said Ben Kramer, co-founder and CTO at Monterey. “There’s obviously plenty of people with traditional ML (machine learning) backgrounds, traditional NLP (natural language processing) experience. And while that experience is super useful, and honestly, it’s something we use in our product alongside the latest stuff, you know, it’s not the only thing that’s important.”
The need for a broad range of skills becomes evident in the job requirements of AI web companies looking for full-stack software engineers. Often, these companies build with Python. Next.js and React deliver fast, responsive, and scalable user interfaces; TypeScript sometimes shows up for type safety. Docker, Kubernetes, Grafana: the tooling is diverse, and so are the skills needed to use it.
And it’s also very, very new, Kramer said. How to operate and fine-tune the models has only emerged in the past few years.
“It’s really about solid engineering fundamentals and then the ability to move quickly and learn new things,” he said.
Embeddings and vector search touch many of the pieces in the Monterey product. Embeddings are numerical representations of what words mean and how they relate to other words. Vector search enables semantic queries: similar data from different sources is matched by meaning rather than by exact keywords. Google, for example, uses vector search in YouTube, Google Play, and its core search capabilities.
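As a sketch of the idea, here is a minimal semantic search over toy embeddings. The three-dimensional vectors and feedback strings below are invented for illustration; real embedding models produce vectors with hundreds or thousands of dimensions, and a vector database like Milvus replaces the brute-force sort with an approximate nearest-neighbor index.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy embeddings standing in for real model output.
corpus = {
    "app crashes on login": np.array([0.9, 0.1, 0.0]),
    "please add dark mode": np.array([0.1, 0.9, 0.1]),
    "login page freezes":   np.array([0.8, 0.2, 0.1]),
}

def semantic_search(query_vec: np.ndarray, k: int = 2) -> list:
    """Return the k corpus entries closest to the query in embedding space."""
    ranked = sorted(corpus,
                    key=lambda t: cosine_similarity(query_vec, corpus[t]),
                    reverse=True)
    return ranked[:k]

# A query vector near the "login" region matches both login items, even
# though "crashes" and "freezes" share no keywords.
results = semantic_search(np.array([0.85, 0.15, 0.05]))
print(results)
```

Note that the match is by meaning in vector space, not string overlap: the dark mode request is excluded despite being a perfectly valid piece of feedback.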
Monterey uses Zilliz to run semantic searches for clustering its “theme reports,” said Cole Haffer, founding machine learning engineer at Monterey. Say a company collects 1,000 feedback items in one week. Using embeddings and vector search, the customer can cluster those embeddings and surface the top feature requests.
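The clustering step can be sketched with a simple greedy grouping over cosine similarity. This is an illustration, not Monterey’s actual pipeline: the feedback items, the two-dimensional vectors, and the 0.9 similarity threshold are all invented for the example.

```python
import numpy as np

def cluster_feedback(embeddings, texts, threshold=0.9):
    """Greedy one-pass clustering: each item joins the first cluster whose
    seed vector is within `threshold` cosine similarity, otherwise it starts
    a new cluster. Returns clusters sorted by size (largest theme first)."""
    clusters = []
    for vec, text in zip(embeddings, texts):
        unit = vec / np.linalg.norm(vec)  # normalize so dot product = cosine
        for cluster in clusters:
            if float(unit @ cluster["seed"]) >= threshold:
                cluster["items"].append(text)
                break
        else:
            clusters.append({"seed": unit, "items": [text]})
    return sorted(clusters, key=lambda c: len(c["items"]), reverse=True)

# Invented feedback items and toy 2-D embeddings.
texts = ["export to CSV", "CSV export please",
         "add dark mode", "need CSV download"]
embeddings = [np.array([1.0, 0.0]), np.array([0.95, 0.1]),
              np.array([0.0, 1.0]), np.array([0.9, 0.2])]

themes = cluster_feedback(embeddings, texts)
```

Here the three CSV-related items fall into one cluster and become the week’s top feature request; the dark mode item forms its own smaller theme.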
Another example is adding context to the prompts based on what the user is asking through a chat interface using retrieval augmentation powered by embeddings and a vector store. According to The New Stack, “Incorporating knowledge and context from external sources or databases, retrieval-augmented generation models can produce more contextually accurate, coherent, and informative text that is free of hallucination. Most importantly, RAG can harness an application’s internal data and augment an LLM’s knowledge to find the specific answer to a question.”
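The retrieval-augmentation pattern can be sketched in a few lines: embed the user’s question, pull the closest documents from the vector store, and prepend them to the prompt. The two-document store, toy vectors, and prompt wording below are hypothetical; a real system would embed the question with the same model used for the documents and pass the finished prompt to an LLM.

```python
import numpy as np

def retrieve(query_vec, store, k=1):
    """Rank stored (text, vector) pairs by cosine similarity to the query."""
    def cos(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    ranked = sorted(store, key=lambda item: cos(query_vec, item[1]),
                    reverse=True)
    return [text for text, _ in ranked[:k]]

def build_prompt(question, query_vec, store, k=1):
    """Prepend retrieved context so the model answers from the app's own data."""
    context = "\n".join(f"- {c}" for c in retrieve(query_vec, store, k))
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

# Hypothetical document store with toy embeddings.
store = [
    ("Refunds are processed within 5 business days.", np.array([1.0, 0.0])),
    ("The app supports single sign-on (SSO).",        np.array([0.0, 1.0])),
]
prompt = build_prompt("How long do refunds take?", np.array([0.9, 0.1]), store)
```

Because the model is told to answer only from the retrieved context, it is grounded in the application’s internal data rather than relying on whatever its training set happened to contain.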
But why Zilliz?
Kramer said it comes down to the flexibility of Zilliz and the underlying Milvus technology with its choice of algorithms for semantic search.
“Going into it, we didn’t know which would perform best for our use case,” Kramer said. “And we didn’t want to switch from provider to provider or database to database, just to try something different.”
“It was important to us to be able to build this in our system and have that flexibility. Additionally, something that was also very important to us: privacy and security. So in many of our customer conversations, you know, they’re sending us potentially sensitive customer information. Also, stuff customers might have submitted through chat support, etc,” Kramer said.
“So having data storage that sits safely inside our production private network was very important. And Zilliz was one of the cloud providers that gave us that out of the box. It was super easy to set up; we didn’t have to talk to, you know, a bunch of teams there. So that was great.
“And lastly, we went with Zilliz over anything else because it offered a great cloud solution. You know, as mentioned before, we’re a very small team. So having to manage infrastructure ourselves is something we try to avoid where we can. So something that meets all of our needs but still offers that cloud-managed solution was really important enough to us to make that decision.”