AI Engineer Summit Wrap-up and Interview with Co-Founder Swyx
This week the inaugural AI Engineer Summit was held in San Francisco. It was a heady week of startup excitement and networking around an emerging job role: the AI Engineer.
After the event, I met co-founder Shawn “Swyx” Wang at a new AI hackerspace in San Francisco, called Newton, to discuss the current status of AI app development and how AI engineering fits into the software development landscape.
The AI Engineer Summit was restricted to 500 attendees, but it could’ve easily been double that — such was the demand for tickets. Among the presenters were founders and representatives from OpenAI, GitHub, LangChain, Replit, Fixie and DataStax. I found many of the presentations thought-provoking and the networking in the hallways was equally stimulating.
After congratulating Swyx on an enjoyable event, I asked whether he thought the conference had succeeded in bringing together the nascent AI engineering community.
“It was successful, yes — it was the first in-person gathering around the AI engineer community, which was started a few months ago,” he said. The attendees, he added, either “self-identified [as] an AI engineer, or they wanted to be one, or they wanted to hire one.”
What Is an AI Engineer?
The role of “AI engineer” is still very new. In practice, it currently means a developer who builds with large language models and associated tooling, such as the LangChain framework and vector databases. In a recent presentation, Swyx used the term “software 3.0” to describe the stack that an AI engineer is currently working with. But the question marks in the UI section of his diagram indicate that we’re still in the very early stages of what it means to use AI tools as a software engineer.
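One piece of that stack, the vector database, can be illustrated without any particular product: at its core it stores (text, embedding) pairs and returns the entries whose embeddings are most similar to a query. Here is a toy in-memory sketch using cosine similarity; the class, the two-dimensional embeddings, and the sample documents are all illustrative stand-ins, since real stores add indexing, persistence and model-generated embeddings:

```python
import math

# Toy in-memory "vector store": the core idea behind vector databases,
# stripped of indexing, persistence, and real embedding models.

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

class ToyVectorStore:
    def __init__(self):
        self.items = []  # list of (text, embedding) pairs

    def add(self, text, embedding):
        self.items.append((text, embedding))

    def search(self, query_embedding, k=1):
        # Rank stored items by similarity to the query and return the top k.
        ranked = sorted(self.items,
                        key=lambda item: cosine(item[1], query_embedding),
                        reverse=True)
        return [text for text, _ in ranked[:k]]

store = ToyVectorStore()
store.add("docs about billing", [1.0, 0.0])
store.add("docs about login", [0.0, 1.0])
print(store.search([0.9, 0.1]))  # → ['docs about billing']
```

An AI engineer's application would typically embed the user's question with a model, run this kind of similarity search, and paste the retrieved text into the LLM prompt.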
Not everyone accepts the term “AI engineer,” though. One counter-theory I heard from an attendee at the AI Engineer Summit is that all software engineers will effectively become AI engineers over time — since all of them will eventually use AI tools. I asked Swyx whether he agreed with that.
“No, I have never had that perspective,” he replied. “I definitely don’t think that all software engineers should learn AI. I do think that there should be space, though, for people to specialize in it, or be the domain expert within a team. And so I definitely see that happening, at least first. But I do think there’s a place for traditional software — I’m definitely not one of those AI maximalists.”
He then likened the AI engineer role to being a mobile specialist.
“So, think of AI as a platform, like mobile engineering, right? Like, you just specialize in the mobile stack. I don’t want to touch it, because mobile is gnarly. You go to all the mobile conferences, you know all the mobile tech, and you know the debates. But when I need anything mobile done, I come to you and you know how to get it done.”
He added that all developers should at least familiarize themselves with what AI engineering is — just as they would’ve at least learned the scope of mobile engineering when that became popular ten to fifteen years ago.
Although the UI layer is still up for grabs, it became clear at the AI Engineer Summit that LangChain has become one of the leading application development tools in this ecosystem. I asked Swyx how he views LangChain at this time. He replied that it’s “the most prominent of many” such tools, but that “it may not necessarily last as the lead one.”
“Everyone’s trying to think about, like, what’s next,” Swyx said. “There are competitors. […] People are happy with LangChain, but they do see that it’s not the end state yet. And whether it’s LangChain that wins, or someone else, it’s still an open question.”
He noted that LangChain is less than a year old — the implication being that it is likely to continue evolving rapidly.
Swyx also pointed out that just having a framework isn’t the complete solution. “How do you make money using that framework? You make something like Vercel.”
The AI Engineer Summit did a great job, I thought, of digging into the current state of LLMs and tools like LangChain. But I wasn’t convinced by everything I heard at the event. One area I was skeptical about was agents, such as AutoGPT (which happened to be the lead sponsor of this event).
Agents are automated pieces of software that use LLMs for various tasks. At the conference, I sensed overconfidence among some of the speakers about the abilities of these automated agents. Maybe it was even hubris, because the general idea of agents seems to be to take humans out of the equation. But if you have ever dealt with automated chatbots from the likes of your bank or phone company, chances are you wished that a human was on the other end of the chat.
Jacob Marks, a machine learning engineer at Voxel51 and one of the conference attendees, put it this way in a LinkedIn post: “AI Agents are far from reaching their full potential. In part, this is due to the difficulty in creating robust evaluations for said agents. AutoGPT is in major flux.”
Swyx thinks the safety issue around agents is overblown, because right now agents haven’t even proven that they can consistently do basic tasks. He also thinks that there are a lot of people building evaluation software, but not enough people building agents.
“So my common joke is in San Francisco, there’s more people building agent eval companies than actually building agents,” he said. “Because they want to sell picks and shovels, they don’t actually want to go for gold. So my argument is, maybe you should just go for gold.”
“I don’t know if it’s the right approach, but it’s an interesting approach,” he said. “It’s the most direct: hey, we are React for LLMs.”
Regardless of which language AI engineers use, though, the key point that Swyx wants to emphasize is that software engineering is key to the continued advancement of AI.
“The more I build in this space, the more I realize that you just cannot do anything interesting unless you can write software to orchestrate the [AI] systems, and then use the systems to write software,” he said. “So it’s kind of like a virtuous cycle.”
The AI Engineer Summit will be back next year, along with a larger Expo event grandly called the AI Engineer World’s Fair. The prospects for AI startups and hackers alike are looking bright, and if this week’s event was any indication, developers might want to consider specializing in AI in 2024.