
Qumulo Scales Distributed Storage for Collaborative Online Video Editing

4 Apr 2019 8:42am, by

When Qumulo first launched back in 2015, the company was said to be working to solve the problem of how to scale data, rather than how to scale storage, with Qumulo Core, a scalable data analytics platform that gives enterprises a view of their data and storage resources.

Two years later, Qumulo paired its distributed file storage system with Amazon Web Services, and today it has added Google Cloud Platform to the mix. Taking advantage of all that available and scalable compute, the company has also unveiled two new products: Qumulo CloudStudio and QumuloDR Cluster.

The combination of Qumulo’s distributed file storage system with the elastic compute resources of Google Cloud Platform (GCP) and Amazon Web Services is showcased with CloudStudio, which “securely moves traditionally on-prem workspaces, including desktops, applications, and data, to the public cloud,” according to a company statement.

Molly Presley, global director of product at Qumulo, offered up an anecdote about a visual effects studio that does work for the “Game of Thrones” television series as evidence of CloudStudio’s potential. Previously, she said, running a render on a new visual effect could take the studio several weeks, and it began to fall behind. With Qumulo, she said, the studio could “burst up to Amazon and run on a thousand nodes” instead of missing deadlines and falling further behind.

Presley also mentioned that integrating with GCP gives Qumulo additional strategic compute locations, as the company had added capacity outside Toronto, Los Angeles and other cities where media and entertainment organizations, such as creative agencies, post-production studios, broadcasters and visual effects studios, tend to be based. She emphasized, however, that Qumulo provides similar benefits for other industries as well, such as oil and gas.

“We have oil and gas customers that do the same thing. The whole idea is to move an entire workload, which has specialized applications and compute power and metadata requirements, and building that footprint in the cloud,” said Presley. “Everything we’re doing is about getting data off sensors off the edge and into the cloud. The data sits on our environment and the compute happens in AWS, GCP, or on-prem. The only way you could do this kind of work before was if you were a supercomputing type.”

Qumulo also launched QumuloDR Cluster, which adds an extra layer of data protection using the public cloud. According to a company statement, the feature combines Qumulo with AWS and GCP to offer users the ability to “replicate their on-prem cluster data to a cluster running in the cloud for extra protection” and “failover to the cloud cluster where they will have access to all of their on-prem data” in the case of a catastrophic failure. Additionally, users are able to keep the cluster running continuously or “programmatically turn it on before replication to reduce costs.”
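The cost-saving pattern described above — keep the cloud cluster powered off, bring it up programmatically for each replication window, and bring it up again only on failover — can be sketched roughly as follows. This is an illustrative model, not Qumulo’s actual API; the `CloudCluster`, `replicate` and `failover` names are hypothetical.

```python
# Illustrative sketch of the DR pattern the article describes.
# All names here are hypothetical, not Qumulo's API.

class CloudCluster:
    """Stand-in for a cloud-hosted DR cluster that can be powered on and off."""

    def __init__(self):
        self.running = False
        self.data = {}

    def start(self):
        self.running = True

    def stop(self):
        self.running = False


def replicate(source_data, cloud):
    """Power the cloud cluster on only for the replication window,
    copy the on-prem data over, then power it down to reduce costs."""
    cloud.start()
    try:
        cloud.data.update(source_data)
    finally:
        cloud.stop()


def failover(cloud):
    """On a catastrophic on-prem failure, bring the cloud cluster up
    and serve from the replicated copy of the on-prem data."""
    cloud.start()
    return cloud.data
```

Keeping the DR cluster stopped between replication windows trades recovery time for cost: failover first has to start the cluster, but compute is only billed while it runs.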

Feature image by DarkWorkX from Pixabay.
