How Will Generative AI Change the Tech Job Market?

When OpenAI released ChatGPT in November 2022 — and especially after it unveiled GPT-4 in March 2023 — a flurry of articles about prompt engineering hit the mainstream and business press.
“AI ‘prompt engineer’ jobs can pay up to $375,000 a year and don’t always require a background in tech,” promised a piece in Insider in April. Time and Forbes also joined the chorus.
But prompt engineering — crafting queries for generative AI that help surface more sophisticated and useful results, and help train the tool — is not likely to be a long-term career path, according to leaders of AI organizations.
“It’s a short-term fad that will go away, because everybody will just start to learn how to write better prompts, how to make sure that you can actually interface with AI,” Vivek Ravisankar, co-founder and CEO of HackerRank, which uses AI to create skill-based hiring evaluations for engineers, told The New Stack.
“Interfacing with AI will become a real skill, just like how you’re interfacing with Google as a search engine.”
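As a concrete illustration of what "learning to write better prompts" involves, compare a bare question with one that adds a role, constraints and an output format. The template below is a minimal sketch; the field names and example text are invented for illustration, not drawn from any specific tool mentioned in this article.

```python
# A minimal sketch of prompt structuring. The template and its fields
# are illustrative assumptions, not any vendor's actual API.

def build_prompt(task: str, role: str, constraints: list[str], output_format: str) -> str:
    """Wrap a bare task in the extra context that tends to improve LLM answers."""
    lines = [
        f"You are {role}.",
        f"Task: {task}",
        "Constraints:",
    ]
    lines += [f"- {c}" for c in constraints]
    lines.append(f"Respond as {output_format}.")
    return "\n".join(lines)

# A bare prompt, for contrast: "Summarize this incident report."
structured = build_prompt(
    task="Summarize this incident report.",
    role="a senior site-reliability engineer",
    constraints=["at most three bullet points", "flag any customer impact"],
    output_format="a Markdown list",
)
print(structured)
```

The point is not this particular template, but that the skill reduces to supplying context the model would otherwise have to guess — which is why, as Ravisankar suggests, it is learnable by anyone.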
How else will generative AI change the tech job market? Eighty-two percent of participants in a survey released in April by HackerRank said they believe generative AI will redefine the nature of coding.
The truth is, no one knows yet what impact fast-iterating tools like ChatGPT will have on the tech job market. Some clues, however, have already emerged.
At Tribe AI, a four-year-old consulting company that recruits and matches AI talent with customer projects, inbound customer inquiries have been up 100% each month this year, an “explosion” in demand, according to Jaclyn Rice Nelson, Tribe AI’s co-founder and CEO.
“The market has completely changed,” Nelson told The New Stack. “It’s not just how companies see what’s possible, but the urgency with which they see that it’s not just possible, but important to act. It’s taking everyone from, ‘One day, we could be an AI company,’ to ‘We have to be an AI-enabled company today. And we can, because of these tools that exist.’”
Fortunately, Tribe AI has also seen a big jump in the number of people applying to join its network of AI talent — a 50% increase from month to month in 2023, Nelson said. And the quality of the candidates has improved since before ChatGPT was introduced.
“Interfacing with AI will become a real skill, just like how you’re interfacing with Google as a search engine.”
— Vivek Ravisankar, HackerRank
Her company, she said, was started because its founders saw an imbalance between the demand for AI talent and the supply of AI engineers. “We figured out how to activate talent that wasn’t otherwise on the job market. And the reason was, they didn’t want full-time jobs, they wanted fractional jobs.”
By accommodating this desire, Tribe AI has been able to assemble hundreds of members and match them to AI and machine learning projects. Nelson remains confident that there are more engineers and other specialists who can work with the next wave of AI: “We don’t feel like we’ve even come close to tapping the true depth of the supply and the talent marketplace.”
‘Prompt Hackers’ and AI Strategists
As the generative AI revolution gets underway, talent with deep experience in large language models (LLMs), the technology that underlies ChatGPT and other generative AI tools, will be avidly sought, experts said.
“If you are an engineer that’s trained a Large Language Model of over 10 billion parameters, congratulations, you hit the jackpot,” said Noah Gale, co-founder of Tribe AI. “You are very much in demand.”
Part of the reason for that demand, said Nelson, is that people with that deep experience are so scarce: “LLM engineers, I would say those are probably the hardest to find, because so many of them are still at places like Google, OpenAI, etc.”
But what companies need is shifting fast, said Nelson.
“The projects are quite different,” she said, than they were before ChatGPT was introduced. Now, she said, companies are asking, “What should my generative AI strategy be? I have an idea for something that could use generative AI, does it make sense? Is that where I should be focused?”
That direction, she said, “requires someone to both have a deep knowledge of how these models work, and what you can do with them, the power of them. But also have some product or strategic sense. An engineer who can speak English, as well. And that translates to people who may not be as technical. So I think that sort of changes the profile to more of an AI strategist.”
The ideal candidate for some roles in generative AI, she said, might have more in common with a growth marketer than an engineer. “Some of the profiles we’ve seen that have been really good at this are totally counterintuitive — like, improv comedian. It’s just someone who’s really creative and really good with language and very good at coming up with things on the fly.”
Tribe AI has seen increased demand for “prompt hackers,” said Gale, people who can try to break foundational models to reveal their vulnerabilities, so that the models can be improved.
“This is really important to these foundational model companies, in order to make sure that the things that they’re putting out to the wild are safe,” he said.
Tribe AI has also seen interest from organizations seeking help in fine-tuning their generative AI tools, Gale said. Once organizations start using those tools, he said, the next logical step is that they will want to train the model on the company’s own internal data, and perhaps build an entirely new model specific to their own use cases.
Training generative AI on an organization’s own data — “so it knows about all your company’s internal documents and Slack history” — may be an area of growth in AI-related work in the future, echoed Dylan Fox, founder and CEO of AssemblyAI, which transcribes and analyzes audio data.
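In practice, many teams approach "knowing about your internal documents" not with full model training but with a lighter-weight retrieval step: find the internal document most relevant to a question and feed it into the model's prompt. The sketch below shows only that retrieval step, using naive word overlap; the document names, example text and function names are all invented for illustration, and real systems typically use embedding-based search instead.

```python
# Toy sketch of the retrieval step behind "ask the model about our own docs":
# score internal documents by word overlap and pull out the best match.
# All document contents and names here are made up for illustration.
from collections import Counter

def overlap_score(query: str, doc: str) -> int:
    """Count words shared between the query and a document."""
    q = Counter(query.lower().split())
    d = Counter(doc.lower().split())
    return sum(min(q[w], d[w]) for w in q)

def retrieve(query: str, docs: dict[str, str]) -> str:
    """Return the name of the document most relevant to the query."""
    return max(docs, key=lambda name: overlap_score(query, docs[name]))

docs = {
    "onboarding.md": "how to set up your laptop and request access",
    "oncall.md": "paging rotation and incident escalation policy",
}
best = retrieve("what is the incident escalation policy", docs)
print(best)  # the retrieved passage would then be pasted into the LLM prompt
```

The retrieved text would be prepended to the user's question before it reaches the model, which is how the model appears to "know" Slack history or internal wikis without being retrained on them.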
Gale offered what he called a “very, very hot take” on the short-term future of prompt engineering-related roles: “Within a year, someone is going to get paid a million dollars plus, for this role. And within a year of that, that role will no longer exist.
“Because these large models are going to get so good, through reinforcement learning and being trained on our own data, of understanding our intentions, that it will no longer require [us] to get like super specific and creative with these prompts. It will just know.”
Systems Engineers Wanted
But it’s not just AI/machine learning engineers and “prompt hackers” who will be in high demand, experts said. UX and full-stack engineers will also be avidly sought by organizations looking to use the new generative AI tools.
These, Nelson said, are “complementary roles that build around the advanced AI technology, because you need to know where to actually, like, put the thing. And then you need to plug it into your product in a way that consumers can engage with it. And it needs to fit into your back end.
“That’s kind of part of the story that gets missed. The nuance that we see is just a tremendous increase in demand [for] full stack.”
Operations and systems engineers might be the unsung heroes of the generative AI revolution, experts told The New Stack.
“With LLMs, a lot of what people are doing now is exploiting the architectures and the scaling laws, and just designing bigger and bigger data centers, to train bigger and bigger models. It’s just scaling, which is a lot of engineering work.”
— Dylan Fox, AssemblyAI
Fox, of AssemblyAI, marveled at the systems engineering required for generative AI to reach its current stage: “It’s almost more of an engineering accomplishment than a research accomplishment.”
He asked, “How do you string together hundreds and hundreds of servers that have GPUs on them, to talk to each other with really low latency, and to move a lot of data around with really low latency, so that you can be really efficient as you try to find good models?
“With LLMs, a lot of what people are doing now is exploiting the architectures and the scaling laws, and just designing bigger and bigger data centers, to train bigger and bigger models,” he said. “It’s just scaling, which is a lot of engineering work.”
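The "just scaling" Fox describes is often planned with rough rules of thumb. One widely cited estimate puts training compute at about 6 floating-point operations per model parameter per training token (C ≈ 6ND). The sketch below applies that rule; the model size, token count, GPU throughput and utilization figures are arbitrary assumptions for illustration, not numbers from this article.

```python
# Back-of-the-envelope training-compute estimate using the common
# C ≈ 6 * N * D rule of thumb (FLOPs ≈ 6 × parameters × training tokens).
# Model size, token count and GPU throughput below are assumed examples.

def training_flops(params: float, tokens: float) -> float:
    """Approximate total training compute in FLOPs."""
    return 6.0 * params * tokens

def gpu_days(flops: float, gpu_flops_per_sec: float = 1e14, utilization: float = 0.4) -> float:
    """Rough wall-clock GPU-days at an assumed sustained throughput."""
    return flops / (gpu_flops_per_sec * utilization) / 86_400

c = training_flops(params=10e9, tokens=200e9)  # e.g. a 10B-parameter model
print(f"{c:.2e} FLOPs, roughly {gpu_days(c):,.0f} single-GPU days")
```

Numbers like these are why the work Fox describes is dominated by systems engineering: a job measured in thousands of single-GPU days only finishes on a human timescale when hundreds of GPUs are strung together efficiently.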
Fox’s company is working on what he called “the Blue Apron of AI.” Like the meal-kit service, it would offer a user-friendly but customizable product, what he envisions as “a superhuman AI system for multilingual, low latency, automatic speech recognition.”
As he’s grown his company, he’s adhered to hiring two or three engineers for every AI researcher. “A lot of this is like scaling up datasets, scaling up training infrastructure, and that requires like really deep and complex systems engineering.”
Be Curious, Stay Relevant
Three of every four participants in HackerRank’s new survey of 42,000 developers said they will be adjusting their skills to respond to the growth of AI.
How does a technologist stay in the game as generative AI rapidly changes the way software is built, deployed and managed in the coming years?
The good news is that staying relevant calls on the traits that probably drew you to development and engineering in the first place: curiosity and a willingness to tinker and experiment. (In HackerRank’s survey, four out of five respondents said they’ve begun to play with ChatGPT and similar tools.)
“The most important skill is just to be very curious,” Gale said. “And being down to build, experiment, and ship really quickly.”
Every day, generative AI tools dazzle users with what they can do. When asked what’s blown his mind lately, Fox told The New Stack about how his wife, who is not a programmer, created a Chrome plug-in in an hour using ChatGPT, a task he said would likely take him half a day.
“It’s not like AI will be replacing human beings. It’s human beings that are using this as a tool, replacing those that choose to not use this as a tool.”
— Noah Gale, Tribe AI
And the models are getting smarter all the time. Fox’s company released a new speech recognition model earlier this spring. “When I was playing around with that, I just threw a Drake song at it,” he said. “And it was perfect at transcribing the lyrics. And I was like, we didn’t even train this thing on music! That’s crazy.”
But no one The New Stack spoke to for this story went into doomsday mode when talking about AI’s impact on technologists.
“It’s not like AI will be replacing human beings,” Gale said. “It’s human beings that are using this as a tool, replacing those that choose to not use this as a tool. And luckily, this technology seems pretty democratized.”
He added, “Now with these language models, a single engineer can do the work of 10. And it’s actually the people that really understand this, that have even more of this leverage.”
In addition to playing with the new tools, Fox recommended digging into published research on them. “Read the actual research papers,” he said. “Usually, when you look at the research papers, they’re a lot more sobering and conservative than what you see like on Twitter, or in the news. If you look at the research papers, you actually can see the metrics.”
Innovation always kills some jobs but creates others, Ravisankar noted. “In general, it’s better to embrace technologies. You are probably better off in the long run, even if some of those things actually don’t pan out. So I’d probably give the same advice of embracing AI. How can you incorporate AI into your daily work? That would just like put you ahead of the curve in the long run.”
And many innovations over the last two or three decades, he noted, have resulted in more people working in programming, not fewer. “If you think about it from this particular framing, you can almost think of this as like another toolkit in the developer’s backpack that is just going to make developers more productive, as well as is going to lower the barrier to entry for new people to come in and build.”
He summed up, “There’s going to be a very exciting time over the next decade or so.”