Cloud Cost-Unit Economics — A Modern Profitability Model
Transformative cloud initiatives have upended traditional financial oversight for budgeting and controlling IT spend. Processes established around planning and purchasing capital equipment simply don’t apply in the cloud, where infrastructure is deployed as code and resources are spun up and down based on demand.
Given the relentless pressure from the business to improve product/feature velocity, developers rapidly provisioned resources using cloud APIs that were outside of financial governance. That ease of deployment bypassed conventional checks and balances, reducing visibility and accountability for cloud spend.
Cloud FinOps and Cost-Unit Economics
As cloud governance matures within an organization, financial analysis must also evolve to comprehend new concepts: the shift to the cloud’s OpEx cost model, the complexity of elastic infrastructure, and variable cloud vendor pricing.
The emerging discipline of Cloud FinOps (a clever combo of “financial” + “DevOps”) aims to maximize return on cloud investments, ensure data-driven decisions and improve accountability — all while increasing collaboration of business and technical teams.
The FinOps Foundation describes six guiding principles to establish a cloud cost optimization framework, and the concept of unit economics is a key component of each principle. Cloud cost-unit economics measures the marginal, or unit, cost specific to the development and delivery of cloud-based software. Subtracting this marginal cost from marginal revenue yields a profit or loss per unit.
“By calculating the difference between marginal cost and marginal revenue, you can determine where your cloud operations break even and begin to generate a profit — an important concept in economics overall and one of the most effective ways to make data-driven business decisions regarding your cloud investment.” — Finops.org, “Introduction to Cloud Unit Economics”
So, what are the “units” in cost-unit economics? The answer is specific to each business, but some common units of measurement are:
- Cost per customer
- Cost per transaction
- Cost per subscription
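To make the marginal cost/marginal revenue idea concrete, here is a minimal sketch using cost per customer as the unit. All figures are invented for illustration:

```python
# Hypothetical cloud cost-unit economics; every number here is invented.
monthly_cloud_cost = 42_000.00   # total monthly cloud spend for one product
customers = 1_400                # active customers on that product
revenue_per_customer = 35.00     # average monthly revenue per customer

cost_per_customer = monthly_cloud_cost / customers          # marginal cost proxy
profit_per_customer = revenue_per_customer - cost_per_customer

print(f"cost per customer:   ${cost_per_customer:.2f}")     # $30.00
print(f"profit per customer: ${profit_per_customer:.2f}")   # $5.00

# Break-even point: customer count at which revenue covers cloud cost.
breakeven_customers = monthly_cloud_cost / revenue_per_customer
print(f"break-even customers: {breakeven_customers:.0f}")   # 1200
```

With 1,400 customers this hypothetical product is past break-even (1,200 customers) and earns $5 of margin per customer, which is exactly the kind of data-driven conclusion the FinOps Foundation quote below describes.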
Technical Plumbing: Resource Tagging
The first step in establishing a FinOps operating model is gathering information about cloud usage and costs. There is no shortage of data to collect, but gathering and parsing the sea of available data requires a methodical approach. A monthly enterprise cloud bill could have hundreds of thousands of line items.
This initial phase is also the best time to implement the necessary technical plumbing to map cloud resources back to the business units generating the cost. Accurate allocation and reporting are not straightforward tasks given the ephemeral nature of cloud resources. A manual process would be impossible.
Tagging — or labeling, depending on the cloud vendor’s terminology — is the process of identifying resources and assets with key-value pairs. A tag is essentially metadata that links a cloud resource to the organization or business unit responsible for it — not just for its initial deployment, but for its entire life cycle.
Tagging doesn’t just facilitate cost chargebacks. Properly tagged resources can also be automatically monitored, with alerts that notify stakeholders when infrastructure is deployed outside of governance policies established by the business.
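A tag-governance check of the kind described above can be sketched as a simple policy scan. The required keys and resource records here are illustrative assumptions, not any specific cloud vendor’s API:

```python
# Sketch of an automated tag-governance check; data shapes are assumptions.
REQUIRED_TAGS = {"product-id", "cost-center", "owner"}

resources = [
    {"id": "i-0a1b", "tags": {"product-id": "p-42", "cost-center": "cc-7", "owner": "team-a"}},
    {"id": "i-0c2d", "tags": {"product-id": "p-42"}},  # missing governance tags
]

def tag_violations(resources, required=REQUIRED_TAGS):
    """Return (resource id, sorted missing keys) for each policy violation."""
    violations = []
    for resource in resources:
        missing = required - resource["tags"].keys()
        if missing:
            violations.append((resource["id"], sorted(missing)))
    return violations

# In practice this result would feed an alerting pipeline to notify stakeholders.
for resource_id, missing in tag_violations(resources):
    print(f"ALERT: {resource_id} is missing tags: {', '.join(missing)}")
```

In a real deployment the resource list would come from the cloud vendor’s inventory API, and the alert would go to the owning business unit rather than stdout.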
Point-in-Time Unit Cost Calculation
Segregating cloud accounts by business unit makes it easier to tag resources consistently and to track them from the moment they are provisioned. Let’s assume you have attributed each of your cloud accounts to a specific product ID, and your target unit of measurement is cost per customer. Now you are ready to produce a straightforward point-in-time calculation for unit costs that will look like this:
total monthly cloud costs per product ID/number of customers for that product ID = average cost per customer
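The formula above can be sketched against a monthly bill, grouping line items by product ID. The billing records and customer counts here are invented sample data:

```python
# Point-in-time unit cost per product ID; all sample data is hypothetical.
from collections import defaultdict

line_items = [  # (product_id, cost_usd) — e.g. parsed from the monthly bill
    ("p-42", 1200.0),
    ("p-42", 300.0),
    ("p-7", 450.0),
]
customers_by_product = {"p-42": 50, "p-7": 30}

# Aggregate total monthly cloud cost per product ID.
total_cost = defaultdict(float)
for product_id, cost in line_items:
    total_cost[product_id] += cost

# total monthly cloud costs / number of customers = average cost per customer
for product_id, total in sorted(total_cost.items()):
    avg = total / customers_by_product[product_id]
    print(f"{product_id}: ${avg:.2f} per customer")
```

A real enterprise bill has hundreds of thousands of line items, so in practice the aggregation step runs over the vendor’s cost-and-usage export rather than an in-memory list, but the arithmetic is the same.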
A one-time cost calculation like this is a great place to start but doesn’t provide much opportunity to model your business over time, such as how cost per customer might change as your business grows. Also, an application built on a cloud native microservices architecture will have a complex web of service interactions and shared services to identify and track in real time.
Fortunately, we have tools available to uncover the necessary data and context to support a more sophisticated analytics model. Interestingly, cost information from the cloud vendor is presented as time series data, and the applications running on cloud resources can be set up to also track observability data in a time series. Adding time series data that correlates cloud costs to your unit of measurement lets you better understand how the unit drives costs and build models that predict future spend.
Uncovering Time-Series Cost Data at the Application Level
In this white paper, Dan Bode, senior director of Cloud Advisory at UST, explains:
“Metrics and distributed tracing are two types of observability data published from a running application to an external data source as time series data. Because metrics, trace and cloud cost data are time series and allow arbitrary labels to be set (like ProductId), observability data can be joined with cost data to provide additional context.”
He advises that if you have not implemented central observability in your organization yet, you might find it easier than you think with tools like Istio and eBPF. The observability data, paired with time series cloud cost data, form the foundation for a cloud cost-unit economics analysis that identifies how costs change with customer volume.
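The join Bode describes can be sketched as matching two time series on a shared timestamp and label (here `product_id`). The series shapes and values are assumptions for illustration, standing in for a real metrics backend:

```python
# Sketch: joining time series cloud cost data with an observability metric
# (active customers) on timestamp + product label. All data is hypothetical.
cost_series = {        # (hour, product_id) -> cloud cost in USD
    ("2024-05-01T00", "p-42"): 55.0,
    ("2024-05-01T01", "p-42"): 80.0,
}
customer_series = {    # (hour, product_id) -> active customers from metrics
    ("2024-05-01T00", "p-42"): 110,
    ("2024-05-01T01", "p-42"): 200,
}

# Join on the shared (timestamp, label) keys to get cost per customer over time.
unit_cost_series = {
    key: cost_series[key] / customer_series[key]
    for key in cost_series.keys() & customer_series.keys()
}

for key in sorted(unit_cost_series):
    timestamp, product_id = key
    print(f"{timestamp} {product_id}: ${unit_cost_series[key]:.3f} per customer")
```

Note that in this toy series the unit cost *falls* from $0.500 to $0.400 per customer as volume grows, which is exactly the cost-versus-customer-volume relationship the analysis is meant to surface.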
A Modern Take on Calculating Business Profitability
If we believe that every company is a software company, then businesses need a fresh take on calculating profitability. Cloud cost-unit economics provides the view of profit maximization that sheds light on the cloud’s role in your company’s financial performance and provides the financial model required to predict future profitability.