
Open Source May Yet Eat Google’s and OpenAI’s AI Lunch

Recent AI progress by the open source community has prompted a reevaluation of strategy for Google and OpenAI.
Jun 19th, 2023 6:00am

A recently leaked Google memo reveals that while Microsoft and Google are getting all the generative AI hype, open source developers may yet win the market battle.

A Google AI engineer wrote, “The uncomfortable truth is, we aren’t positioned to win this [Generative AI] arms race, and neither is OpenAI. While we’ve been squabbling, a third faction has been quietly eating our lunch.” And who is this hidden third party? Amazon Web Services (AWS)? IBM? Baidu? It’s none of them. It’s the open source community.

How can that be? Doesn’t Generative AI require hyperscale clouds to deliver high-quality answers from large language models (LLMs)? Ah, actually, no, it doesn’t.

It turns out you can run LLMs on a smartphone: people are running foundation models on a Pixel 6 at five tokens per second. Others have shown that you can fine-tune a personalized AI on your laptop in an evening.
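To make that concrete, here is a minimal sketch of local inference using the open source llama.cpp project’s Python bindings. The model file path, quantization level, and generation parameters are illustrative assumptions, not details from the memo; any quantized LLaMA-family checkpoint converted for llama.cpp would do.

```python
# Minimal sketch: CPU inference with llama-cpp-python (pip install llama-cpp-python).
# The model path below is a hypothetical placeholder for a 4-bit quantized checkpoint.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/llama-7b.Q4_K_M.gguf",  # hypothetical quantized weights
    n_ctx=2048,    # context window
    n_threads=4,   # CPU threads; an ordinary laptop is enough
)

out = llm(
    "Q: Why can small, quantized models run on consumer hardware? A:",
    max_tokens=128,
    stop=["Q:"],
)
print(out["choices"][0]["text"])
```

The point is not this particular library but that nothing here touches a hyperscale cloud: the whole pipeline fits on hardware most developers already own.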

In other words, “Being able to personalize a language model in a few hours on consumer hardware is a big deal, particularly for aspirations that involve incorporating new and diverse knowledge in near real-time.”

The Revolution

The key to this revolution? The recent leak of Meta’s Large Language Model Meta AI (LLaMA). This spurred an avalanche of innovation from the open source community. Despite lacking initial instruction or conversation tuning, the model was rapidly iterated upon, with enhancements such as instruction tuning, quantization, and quality improvements developed in quick succession.

Chief among the game changers is the use of a cheap fine-tuning mechanism known as low-rank adaptation (LoRA). This enables model fine-tuning at a fraction of the cost and time. This technology has reduced the barrier to entry for training and experimentation significantly, enabling individuals to personalize a language model in a few hours on consumer hardware.
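As a rough illustration of how little code this now takes, here is a hedged sketch using Hugging Face’s open source PEFT library to attach LoRA adapters to a causal language model. The checkpoint name and hyperparameters are assumptions for illustration, not values from the memo.

```python
# Sketch: attaching LoRA adapters with PEFT (pip install peft transformers).
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

# Hypothetical base checkpoint; stands in for any LLaMA-style model.
base = AutoModelForCausalLM.from_pretrained("huggyllama/llama-7b")

config = LoraConfig(
    r=8,                                   # the low-rank dimension in "low-rank adaptation"
    lora_alpha=16,                         # scaling factor for the adapter updates
    target_modules=["q_proj", "v_proj"],   # attention projections to adapt
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(base, config)
model.print_trainable_parameters()  # typically well under 1% of the base model's weights
```

Because only the small adapter matrices are trained while the billions of base weights stay frozen, a fine-tuning run that once demanded a datacenter fits in an evening on consumer hardware.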

As our mystery developer said, “Part of what makes LoRA so effective is that — like other forms of fine-tuning — it’s stackable. Improvements like instruction tuning can be applied and then leveraged as other contributors add on dialogue, or reasoning, or tool use. While the individual fine tunings are low rank, their sum need not be, allowing full-rank updates to the model to accumulate over time. This means that as new and better datasets and tasks become available, the model can be cheaply kept up to date without ever having to pay the cost of a full run.”
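PEFT also makes the stacking idea easy to see in practice: independently trained adapters can be loaded onto one base model and combined. The adapter names and local paths below are hypothetical, and the combination method shown is one of several the library offers.

```python
# Sketch: combining independently trained LoRA adapters with PEFT.
# "instruct-adapter" and "dialogue-adapter" are hypothetical local paths.
from transformers import AutoModelForCausalLM
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained("huggyllama/llama-7b")  # hypothetical base

model = PeftModel.from_pretrained(base, "instruct-adapter", adapter_name="instruct")
model.load_adapter("dialogue-adapter", adapter_name="dialogue")

# Concatenating the low-rank factors yields a combined update whose rank is the
# sum of the parts -- the memo's point that the sum "need not be" low rank.
model.add_weighted_adapter(
    adapters=["instruct", "dialogue"],
    weights=[1.0, 1.0],
    adapter_name="combined",
    combination_type="cat",
)
model.set_adapter("combined")
```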

So it is that Generative AI is now within the reach of pretty much any AI-savvy, open source developer. Additionally, the open source community has been efficient in using high-quality, curated datasets for training, following the line of thinking that data quality scales better than data size. These datasets are typically developed using synthetic methods and scavenging from other projects.

Reevaluation

The recent progress by the open source community has prompted a reevaluation of strategy for Google and OpenAI. The rapid innovation, combined with the lack of usage restrictions, makes open source AI models an attractive alternative for many users.

I think this is only appropriate. After all, while the FAANG companies have profited so far from Generative AI, all their work has been built on open source AI programs. Without TensorFlow, PyTorch, and Hugging Face’s Transformers, there would be no ChatGPT or Bard.

Of course, Meta, which sparked this revolution, is also uniquely placed to profit from incorporating this code into its products. Perhaps the other top companies, betting their future on AI, will realize that letting open source developers work with their data models will work to their advantage. After all, it has for essentially every major software advance of the last twenty years. Why should generative AI be any different?

As our Google mystery developer said, “Directly Competing With Open Source Is a Losing Proposition. … we should not expect to be able to catch up. The modern internet runs on open source for a reason. Open source has some significant advantages that we cannot replicate.” Exactly so.
