
JavaScript/React Library Lets Developers Build AI Chatbots

A frontend library called NLUX ships with adapters for ChatGPT and Hugging Face’s large language models. It also supports personalization.
Jan 24th, 2024 3:00am
Image by Alexandra Koch from Pixabay

NLUX, a new open source JavaScript React library, lets developers build their own chatbot user interfaces. It also allows developers to customize the personality of the bot using natural language cues.

NLUX can be used with any large language model (LLM) backend service but comes with pre-built adapters to connect with OpenAI’s ChatGPT and Hugging Face’s LLMs. It’s also possible to build custom streaming or promise adapters to connect with other LLMs or APIs.
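To make the custom-adapter idea concrete, here is a minimal sketch of a streaming adapter, following the `streamText(message, observer)` shape described in NLUX’s documentation. The internal API URL and its plain-text streaming response are hypothetical placeholders for an in-house model, not a real endpoint:

```javascript
// A sketch of a custom streaming adapter for a self-hosted LLM.
// The URL and response format below are illustrative assumptions.
const myCustomAdapter = {
  streamText: async (message, observer) => {
    try {
      const response = await fetch('https://llm.internal.example.com/chat', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ prompt: message }),
      });
      // Assume the backend streams plain-text chunks in the response body.
      const reader = response.body.getReader();
      const decoder = new TextDecoder();
      while (true) {
        const { done, value } = await reader.read();
        if (done) break;
        observer.next(decoder.decode(value)); // forward each chunk to the chat UI
      }
      observer.complete();
    } catch (err) {
      observer.error(err);
    }
  },
};
```

An object like this can then be passed to the library in place of the pre-built OpenAI or Hugging Face adapters.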

“Let’s say a big company that doesn’t use the public OpenAI large language model, but they want to use a custom model that’s hosted in their own servers,” said NLUX creator Salmen Hichri. “They can actually build and customize their own models, but they still can use NLUX to connect to those models. They need to build a custom adapter for their own model and API.”

Hichri said there are older chatbot libraries, but to his knowledge, NLUX is the first library that’s AI-specific.

Currently, there are two “flavors” of NLUX:

  • NLUX React JS, which includes React components and hooks; and
  • NLUX JS, which is a vanilla JavaScript library that can be used with any web framework.

Why React?

Hichri told The New Stack that part of the reason the library started with React is that it provides an intuitive approach to building applications. Plus, a large number of developers are using React, he added. That’s backed up by the recently released 2023 JavaScript Rising Stars survey, which found that React remained the most popular JavaScript framework for the third consecutive year.

“There are already millions of developers who are using React and JavaScript, and those developers are already at the forefront of building digital experiences today,” Hichri said. “They are writing web apps, they are creating websites, mobile apps, and we want to help them build intuitive conversational experiences.”

Currently, developers need to know a bit of React to use the React flavor of the library, although a developer who only knows JavaScript can still use the vanilla JavaScript version with other frameworks. Hichri plans to expand NLUX to support Angular, React Native and possibly Preact.

Giving Chatbots Personality

Developers can personalize their chatbot with natural language cues and a few lines of code to give the conversation a bit of personality. They can also instruct the bot to be serious, funny, modest or confident.

“For the adapters we already provide, for OpenAI and for Hugging Face, we allow customization through what are called system messages,” he explained. “So when developers use NLUX, the system message, which is not visible to the user, is like the developer telling the chatbot to behave in a certain way.”
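As a rough illustration, a system message is simply the first entry in the message list sent to the model. This sketch follows OpenAI’s chat completions message format; the helper name and personality text are invented for illustration and are not part of NLUX’s API:

```javascript
// Build a chat completions message list where a hidden system message
// sets the bot's personality. Helper name and wording are illustrative.
function buildChatMessages(personality, userMessage) {
  return [
    // The system message is never shown to the end user; it instructs
    // the model how to behave for the whole conversation.
    {
      role: 'system',
      content: `You are a helpful assistant. Personality: ${personality}.`,
    },
    { role: 'user', content: userMessage },
  ];
}
```

Telling the bot to be “funny” or “modest” then amounts to changing one string, with no change to the UI code.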

NLUX is also currently building an adapter for LangChain, along with support for server-side rendering. Voice chat is also on the roadmap for the library.

Beyond Chatbots: The Next Stage of AI-Powered Apps

Right now, the focus for generative AI application development is on building what are essentially AI-powered chatbots. But Hichri and others in the space say the focus will soon shift toward AI copilots, which will be able to perform actions within applications after receiving natural language commands.

“It’s not just a conversation, but it’s a smart system that can perform actions on behalf of the user and it’s embedded within the application or the software,” he said. “The user still needs to define what kinds of actions can be performed on their software, but the trigger won’t be a click or looking for a menu — the trigger will be natural language expression.”
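That division of labor can be sketched in a few lines: the application declares which actions the AI may trigger, and the model’s job (stubbed out here) is only to pick one and supply its arguments. None of this is NLUX API; all the names are invented for illustration:

```javascript
// The application whitelists the actions a copilot may perform.
// Both actions here are hypothetical stand-ins.
const actionRegistry = {
  createInvoice: (args) => ({ done: 'invoice', customer: args.customer }),
  exportReport: (args) => ({ done: 'report', format: args.format }),
};

// In a real copilot, an LLM would map the user's natural language
// command to an action name plus arguments; here that step is assumed
// to have already happened.
function runCopilotCommand(actionName, args) {
  const action = actionRegistry[actionName];
  if (!action) throw new Error(`Action "${actionName}" is not allowed`);
  return action(args);
}
```

Keeping the registry explicit means the model can only ever invoke actions the developer has chosen to expose.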

This copilot mode is on the NLUX roadmap and should arrive very soon, he added.

The other trend that will shape AI apps, Hichri predicted, is spatial awareness coupled with augmented reality. Specifically, he pointed to Apple’s work on the Vision Pro, but noted that OpenAI is also starting to offer features that would position it as an augmented reality assistant.

“That might not be a big deal for someone who’s working in an office, but for certain types of jobs in industry or in architecture, or some other jobs where spatial awareness is very important, giving access to a smart AI system through augmented reality assistance — it’s a game changer,” he said.
