
WebAssembly’s Status in Computing

In this episode, Liam Crilly of NGINX joins us at the Open Source Summit to share his unique perspective on WebAssembly.
Nov 14th, 2023 11:14am by

BILBAO, Spain — During a recent conversation with Liam Crilly, senior director of product management at NGINX, Crilly brought a unique perspective on WebAssembly, drawing from over three decades of software development and operations experience. Although WebAssembly doesn’t directly run on physical devices, it has the potential to operate across a network of devices used for data exchange and deployment, employing WebAssembly modules, as explained by Crilly during a recording of The New Stack Makers podcast at the Open Source Summit in Bilbao, Spain.

The conversation on this episode was hosted by B. Cameron Gain, a frequent TNS contributor.

“I think the long-term promise, the ultimate promise of WebAssembly is that you get to build this thing once,” Crilly said. “That’s, you know, what’s in it for developers: build something once, run it anywhere. And there are some other advantages we can go into about what else WebAssembly can bring. But for me, the number one thing is universal portability.”
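That “build once, run anywhere” point is about compilation targets: the same source can be compiled natively or, with a Wasm target installed (e.g. `rustc --target wasm32-unknown-unknown`), into a portable `.wasm` module that any compliant runtime can execute. A minimal sketch — a hypothetical function, not code from NGINX or the podcast:

```rust
// An ordinary function exported with a stable C ABI symbol name.
// Compiled natively it runs on the host CPU; compiled with
// `--target wasm32-unknown-unknown` it becomes a portable Wasm export
// callable from a browser or a server-side runtime — same source either way.
#[no_mangle]
pub extern "C" fn add(a: i32, b: i32) -> i32 {
    a + b
}
```

The portability comes from the target flag, not from changes to the code itself.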

WebAssembly’s oft-stated promise of being able, in theory, to deploy once and run anywhere on numerous endpoints simultaneously has yet to come to full fruition, and much of the discussion about WebAssembly’s maturity centers on how to realize that potential. “We need to always clarify whether we are discussing the client side in the browser or the server side. If we are talking about the browser, things are much more mature. First and foremost, browser vendors have done a great job building runtimes,” Crilly said. “The ecosystem, environment, and constraints of where you’re running are pretty well understood. It’s a web browser and it’s the client-side view. It’s what I used to do with JavaScript applications, and now I can compile to WebAssembly and they can run in there.”

For the server side, running Wasm as the backend or an API endpoint, or “when I’m working on a microservices application or whatever it may be, it’s far less mature,” Crilly said. “We do have plenty of runtimes to choose from, but these are newer than the ones in browsers. Moreover, as I discussed in my talk at WASMCon a couple of weeks back, the toolchain for not only building WebAssembly modules but also how to run them, especially when building a web app, which is what we often do, is not quite there yet,” Crilly said. “Another dimension to consider is that the standards are not quite there yet.”

Furthermore, WebAssembly can be seen as a powerful compiler target, as Crilly explained: “What’s fascinating about WebAssembly is that it provides the advantages of a compiler, enabling you to take a high-level language and generate well-optimized instruction set code.” However, because WebAssembly functions as an abstracted computer, it necessitates a virtual machine or runtime to take this instruction set and execute it on the hardware. While this might initially seem like an additional abstraction layer, it’s actually quite ingenious. With WebAssembly, it’s possible to construct a runtime for any hardware, eliminating the need for developers and operators to concern themselves with specific hardware details, Crilly said.
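The runtime Crilly describes is, at its core, an interpreter (or JIT compiler) for an abstract instruction set: the host executes abstract bytecode on whatever CPU it happens to run on, which is why one module can target any hardware that has a runtime. As a toy illustration only — real Wasm is a typed stack machine with a far richer instruction set and a validation step — the idea can be sketched as a few-opcode stack interpreter:

```rust
// Toy stack-machine interpreter sketching what a bytecode runtime does:
// abstract instructions in, host-CPU execution out. Illustrative only;
// the real WebAssembly instruction set and semantics are far larger.
enum Op {
    Push(i64), // push a constant onto the operand stack
    Add,       // pop two values, push their sum
    Mul,       // pop two values, push their product
}

fn run(program: &[Op]) -> Option<i64> {
    let mut stack: Vec<i64> = Vec::new();
    for op in program {
        match op {
            Op::Push(v) => stack.push(*v),
            Op::Add => {
                let (b, a) = (stack.pop()?, stack.pop()?);
                stack.push(a + b);
            }
            Op::Mul => {
                let (b, a) = (stack.pop()?, stack.pop()?);
                stack.push(a * b);
            }
        }
    }
    stack.pop() // the final value left on the stack is the result
}
```

A production runtime adds validation, memory sandboxing, and (as Crilly notes next) JIT compilation of this bytecode down to native CPU instructions.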

Crilly further emphasized, “If I have a WebAssembly module that I’ve compiled to this instruction set, I reap the benefits of compiler optimizations as well as the insights gained from over a decade of JavaScript experience in the browser, which includes just-in-time (JIT) optimizations during runtime as I convert the bytecode from this instruction set into CPU instructions. This additional layer of optimization, akin to JIT compilers and browser runtimes, provides near-native compute performance. Therefore, there’s minimal downside to this abstraction layer.”

