Raygun sponsored this post.
The mission is clear, but getting there is fraught with difficulty: tech leaders are expected not only to deliver on time, but also to ensure quality meets modern demands.
Users demand software that:
- doesn’t crash.
- is fast.
- tells its makers how it is being used and what the user experience is.
- is delivered on a repeatable, reliable cadence (i.e., is agile).
However, even the best teams are still figuring out the right metrics for measuring the quality of their software. With competition fierce and software engineer salaries generally high, organizations often struggle to prioritize resources so that software meets these modern demands.
This is why Raygun, maker of crash, error and performance monitoring software, is hosting a series of events called the Tech Leaders’ Tour. The series brings software leaders together to learn from each other about improving software quality and customer experience.
In our Wellington event, we brought three software leaders together in front of an audience of 60 developer professionals to discuss software performance metrics, technical debt and more.
Our panelists were:
- Sonya Williams, co-founder and chief of product at Sharesies. Sharesies is an online investment platform.
- Zheng Li, director of product at Raygun. Raygun is an error-tracking and performance-monitoring tool.
- Simon Young, chief product and technology officer at Trade Me. Trade Me is an online marketplace based in New Zealand.
- Gabe Smith, general manager of enterprise technology at Xero.
The main topic of conversation was how tech leaders prioritize and balance development time so that software works as users expect, solves their problems and is reliable. Here are the three key takeaways from the panel.
1. Software Quality Is Tough to Quantify
However, that doesn’t mean we shouldn’t try. The majority of the audience measures software quality using the number of bugs or errors introduced over a certain timeframe, which is quite easy to monitor using an error tracking tool.
“We’ve had quite a few directors of engineering and each person has had a different view on what to track (as a measure of software quality),” Li said. “In the end, we decided to track how many bugs have been introduced since a new deployment has gone out and how many have been fixed, what is the cycle time, are we getting better at what we’re doing, more efficient, and are we actually solving bugs as we go?”
2. Get Buy-In for Fixing Technical Debt Early
Technical debt has been a recurring theme for our tech leaders over the past two events. Young used the analogy of a credit card to help communicate the value of fixing technical debt, and suggested that if you are struggling to articulate its value, you do the same.
“(Tech debt) is a tool to get something done, the same way you’d use your credit card. You want to buy the thing, you want it right now, and you’re prepared to pay it off over time,” Young said. “But if you don’t pay off your card after a month, you start accruing interest. If you’re managing that debt well, then you can get good leverage over it, but if you’re just, you know, taking another credit card to pay off your old credit card, you’re in a whole lot of pain and you need to focus on that.”
Smith offered a similar strategy of prioritization: if a problem aligns with the strategic vision, it gets fixed first. He explained that the development team builds a business case based on strategic value, compliance issues and risk. “It’s easy to respond to who shouts the loudest, and that’s not always the right answer,” Smith said.
Li said users should come first when deciding how to allocate engineering time. “For me, it’s bugs introduced per thousand lines of code, bugs remedied per development cycle, and also the number of users affected,” Li said.
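As a rough illustration only (not Raygun’s actual tooling), the metrics Li names could be computed per development cycle from a handful of counts. The `Cycle` record and field names below are assumptions for the sketch:

```python
from dataclasses import dataclass

@dataclass
class Cycle:
    """Hypothetical per-cycle counts, e.g. pulled from an error tracker and VCS."""
    bugs_introduced: int   # new bugs first seen in this cycle
    bugs_fixed: int        # of those, how many were remedied in the same cycle
    lines_changed: int     # lines of code added or modified this cycle
    users_affected: int    # distinct users who hit at least one of the new bugs

def bugs_per_kloc(cycle: Cycle) -> float:
    """Bugs introduced per thousand lines of changed code."""
    return cycle.bugs_introduced / (cycle.lines_changed / 1000)

def fix_rate(cycle: Cycle) -> float:
    """Share of this cycle's new bugs remedied within the cycle."""
    return cycle.bugs_fixed / cycle.bugs_introduced

cycle = Cycle(bugs_introduced=12, bugs_fixed=9, lines_changed=8000, users_affected=340)
print(bugs_per_kloc(cycle))  # 1.5 bugs per KLOC
print(fix_rate(cycle))       # 0.75
```

Tracking these numbers cycle over cycle is what lets a team answer Li’s question of whether they are “getting better at what we’re doing.”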
3. NPS Is a Flawed Metric for Software Quality
We’ve written about why NPS is a flawed engineering metric before, but this customer-loyalty metric seems to be the primary way our audience measures customer satisfaction. Li suggested that if teams do use NPS, they should also measure more proactive metrics like site speed and feature adoption.
At 46:00 in the video, Li said, “When our site is slow, our NPS score dives by around half. So speed is very important.”
“We need to make sure that our Raygun app is maintained at four seconds to load or quicker than that,” Li said. “When we introduce a new project or new feature, we send our customer success teams to go and talk to the customers after we’ve sent out all the marketing material, and we gauge what people think, and if it’s not the right thing, we quickly reverse.”
Smith said that to deliver a great customer experience, it’s also important that team members are happy and always have opportunities to grow their careers. At 44:02 in the video, Smith said: “It’s really important to get really good feedback. You know, whether it’s about celebrating success or even around, you know, maybe you should do that differently, or you should try and improve.”