
Nvidia Uses OpenStack Swift Storage as Part of Its AI/ML Process

How OpenStack, a staid open source Infrastructure as a Service cloud, works with Nvidia's cutting-edge Artificial Intelligence and Machine Learning data management.
Jun 30th, 2023 6:50am

When you think about artificial intelligence (AI) and machine learning (ML), the OpenStack Infrastructure as a Service (IaaS) cloud and its object storage component, Swift, are not the first technologies to come to mind. But that’s exactly what trillion-dollar-plus chip and AI powerhouse Nvidia uses to power its ML efforts.

John Dickinson, Nvidia’s Principal Systems Software Engineer, explained at the recent OpenInfra Summit that ML requires speedy, robust storage solutions. “With the rise of AI and ML technologies, our primary role as storage providers is to fuel the engine with as much data as possible, as quickly as possible,” said Dickinson. To keep up with the increasing demand, storage solutions must offer high capacity, availability, and aggregate throughput.

Enter Swift

Dickinson continued: “While Nvidia’s Grace and Hopper chips and Spectrum switches are pushing the boundaries in the computing and networking domains, storage speed is also vital.” Open source Swift, a distributed object storage system designed to scale from a single machine to thousands of servers, is optimized for multitenancy and high concurrency. Swift is normally accessed through a simple, REST-based API.
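To make the REST model concrete, here is a minimal sketch of how a Swift object is addressed. Swift exposes each object at a `/v1/{account}/{container}/{object}` path and authenticates requests with an `X-Auth-Token` header; the endpoint, account, container, and object names below are illustrative, not Nvidia's actual setup.

```python
# Sketch of Swift's REST addressing scheme. A GET on the object URL streams
# the object back; a PUT on the same URL uploads it. All names are examples.

def swift_object_url(endpoint: str, account: str, container: str, obj: str) -> str:
    """Build the URL for a single object in Swift's REST API."""
    return f"{endpoint}/v1/{account}/{container}/{obj}"

def swift_headers(token: str) -> dict:
    """Swift authenticates each request with an X-Auth-Token header."""
    return {"X-Auth-Token": token}

# An ML data loader would fetch a training shard with something like:
url = swift_object_url("https://swift.example.com", "AUTH_demo",
                       "datasets", "train/shard-0001.tar")
headers = swift_headers("example-token")
# e.g. urllib.request.Request(url, headers=headers) would issue the GET
```

Because the API is plain HTTP, any client that can issue GETs concurrently can pull training data at high aggregate throughput, which is exactly the access pattern ML ingestion favors.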

During his keynote at the conference, Dickinson illustrated the ML workflow, emphasizing the significance of understanding data access patterns while building the supporting storage systems. After all, ML demands massive datasets that are much too large to fit into GPU memory or server flash storage.

According to Dickinson, the answer lies in object storage, which offers high throughput and large capacity. While object storage presents its own set of challenges, including caching complexities and varying APIs, he firmly stated that the goal is to “enable users to do what was previously impossible.”

Two Key Concepts — Inner and Outer

Nvidia, he disclosed, is implementing two key concepts to tackle these issues — an “inner ring” and an “outer ring”. The inner ring is characterized by high speed, low latency, and its connection to a specific GPU cluster, resembling file storage for the end users. The outer ring, on the other hand, offers large capacity, high throughput, and high availability. For the outer ring, Nvidia uses Swift, thanks to its suitability for large capacity and high throughput storage.
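The division of labor between the two rings can be sketched as a simple routing decision. This is a hypothetical illustration of the idea as described in the talk, not Nvidia's actual implementation: the durable, high-capacity outer ring (Swift) is the source of truth, while the low-latency, cluster-local inner ring serves data already staged near the GPUs.

```python
# Hypothetical routing sketch for the two-ring design. The function and its
# inputs are illustrative; they are not Nvidia's real API.

def choose_ring(first_epoch: bool, staged_locally: bool) -> str:
    """Pick which storage tier serves a read.

    The outer ring (Swift) holds the full dataset and absorbs the one-time
    bulk read; the inner ring is the fast, GPU-cluster-local tier used once
    data has been staged there.
    """
    if staged_locally and not first_epoch:
        return "inner"   # low latency, tied to a specific GPU cluster
    return "outer"       # Swift: large capacity, high aggregate throughput
```

This mirrors the behavior described later in the article: the first epoch reads once from the outer ring, and subsequent epochs hit the faster local tier.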

Implementing these storage concepts has enabled Nvidia to support massive datasets that were previously impossible to handle, improve performance, and increase workload portability. Swift also delivers improved I/O performance, requiring only a single read from the outer ring on the first ML epoch. This outer-ring data is also accessible from every compute cluster. In addition, since Swift supports many standard APIs, such as POSIX and NFS for file access and S3, Azure, and native Swift for object access, it’s easy to work with the datasets regardless of how you need to access them.
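The multi-API point means the same object, stored once, can be reached through several addressing schemes. The sketch below illustrates that idea; the account, container, mount point, and object names are made up for the example.

```python
# Illustrative only: one object in the outer ring, three ways to address it.
# Container/object names and the mount prefix are example values.

def access_paths(container: str, obj: str, account: str = "AUTH_ml") -> dict:
    """Return the same object's address under different access APIs."""
    return {
        "swift": f"/v1/{account}/{container}/{obj}",  # native Swift REST path
        "s3":    f"s3://{container}/{obj}",           # S3-compatible access
        "posix": f"/mnt/{container}/{obj}",           # file-style (POSIX/NFS) view
    }

paths = access_paths("datasets", "imagenet/train-0001.tar")
```

A data scientist can then open the file-style path in a notebook while a training job streams the same bytes through the object API, with no copies made.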

The strategy doesn’t stop at the inner and outer rings. Acknowledging that data exploration gets harder as datasets grow, Nvidia has created a dataset service aimed at simplifying the process. In a live demonstration, Dickinson showcased how these storage services facilitate large-scale machine learning, showing how a user can load a dataset into Swift, explore it in a Jupyter notebook, and run an ML task without worrying about the low-level details of accessing that storage.

This live demo impressed the OpenInfra audience of about 750 users. It’s rare that a technical audience is impressed by a demo. They’ve seen it all, and they know all the tricks. But this one caught their attention. OpenStack and Swift have a clear role to play in serious work with massive ML datasets.
