In the eternal rat race to simplify the enterprise software delivery process, it’s only natural that another wave of tool consolidation appears to be washing across the desks of enterprise IT leaders. The tornado of digital disruption is forcing a disaster response from mature organizations as they frantically seek to create software-driven business models to compete with the tech giants’ oligopoly.
Vendors from across the tooling landscape are proposing solutions that promise a single-platform experience that enables teams to manage the whole value stream from ideation to operation. Yet history speaks for itself: previous attempts to fold the complex workflows that underpin software delivery into one tool that does “everything for everyone” have failed to deliver.
“Software delivery is simply too complex for one tool, one team or even one organization,” writes Dr. Mik Kersten in Project to Product. Instead, the success of elite-performing enterprises indicates the focus should be on laying foundations to harness a best-of-breed toolchain, one that:
- Maximizes the value of the latest cutting-edge tools.
- Supports the ever-growing network of specialized IT teams.
- Enables an adaptive, modular infrastructure that’s responsive to the evolving nature of the business and market.
The One Tool Fallacy
As we listen and learn from the past, we must also cup an ear to the present; the global application development and deployment market is estimated to be worth nearly $350 billion by 2022. The tooling landscape is diverse and vast for a reason. Just as there is no one medical practitioner, process or technology to treat the entire human body, there is no one tool that can meet all the needs of the evolving IT workforce involved in the planning, building and delivery of software. Much of the “why” can be gleaned from the rise and fall of IBM Rational.
In the ‘80s, Rational (later acquired by IBM) served as a sophisticated and effective toolchain that helped IT shops track their software lifecycle. IT had visibility, control and predictability for large software initiatives. But as software began to eat the world, Rational’s waterfall model was too slow to keep up with an economy being transformed by the rising demand for digital experiences. Agile and DevOps emerged to accelerate the Create and Release stages of the value stream, improving the quality, time-to-market and continuous improvement of products (and with them came a Cambrian explosion of tools to support the myriad specialized roles that were emerging).
As software has scaled and directly impacted more areas of the business, we’ve seen a steady increase of stakeholders across disciplines, teams, departments and even organizations. All of whom have their needs, goals, processes and tools related to their sliver of the product value stream. When you dig into these different teams and areas and how they interlink and collaborate, you can see why there can be no one tool to manage such a creative, technical and, crucially, human process.
Understanding Your Value Stream Architecture
The value stream comprises an implicit network of disciplines, teams, tools and processes across four key stages of the process — Ideate, Create, Release and Operate. There are excellent tools in every area — from sales to business analysis, to product and project management, to development and test, to release and support — for supporting specialists and streamlining and optimizing certain areas of the value stream (especially in the release stage).
Yet at the same time, research by Tasktop into the toolchains of over 300 leading organizations across multiple industries found, encouragingly, that 40% of them connect or plan to connect four or more tools in the design and development stages that precede CI/CD. As illustrated by the diagram below of a typical value stream architecture, there is a lot more to software delivery than the release pipeline:
A value stream architecture mapping exercise reveals how business value flows across key tools and teams in the value stream network. In this example, there are six tools, only half of which are in the release stage. What about all the other critical information that is created and managed by teams in the stages before and after release?
If a single tool vendor could provide robust functionality for all practitioners in the process — covering every workflow and all the rich data that relates to their work and a product’s development — it would. But these vendors are successful in their customer domains (sales, product management, project management, Agile planning, test management, help desk incidents, etc.) for a reason. Their expert users need purpose-built tools that reflect their day-to-day work and improve their productivity and effectiveness.
Understandably, the vendors in this landscape still want to make it easier for these cross-functional teams to work together. And as seen with Atlassian Jira, some tools can encompass more of the workflow to become integral cogs in the machine and support other disciplines, especially for some smaller organizations that only have a couple of hundred IT staff and that don’t require specialized functionality such as advanced test management.
The size of their teams and the volume and type of work (unregulated) mean these organizations have a good view and understanding of their value stream. Teams can easily analyze their work as they only really have one relevant data source (Jira) from which to glean valuable performance metrics such as lead time and bottlenecks to continuously improve the process.
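When all work lives in a single system, a metric like lead time really is this simple to compute. The sketch below uses hypothetical issue records (the field names are illustrative, not Jira’s actual API schema) to show the calculation:

```python
from datetime import datetime

# Hypothetical issue records, e.g. exported from a single Agile planning tool;
# "created"/"resolved" field names are illustrative, not a real Jira schema.
issues = [
    {"key": "APP-101", "created": "2021-03-01", "resolved": "2021-03-09"},
    {"key": "APP-102", "created": "2021-03-02", "resolved": "2021-03-04"},
    {"key": "APP-103", "created": "2021-03-05", "resolved": "2021-03-19"},
]

def lead_time_days(issue):
    """Lead time: days from when work was requested to when it was delivered."""
    fmt = "%Y-%m-%d"
    created = datetime.strptime(issue["created"], fmt)
    resolved = datetime.strptime(issue["resolved"], fmt)
    return (resolved - created).days

times = sorted(lead_time_days(i) for i in issues)
avg = sum(times) / len(times)
print(f"Average lead time: {avg:.1f} days")  # outliers at the top hint at bottlenecks
```

The point is not the arithmetic but the single data source: once work is fragmented across several disconnected tools, no one system holds both timestamps, and this calculation becomes impossible without integration.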
For larger enterprises employing 15,000+ IT staff, consolidating all work and data onto one tool won’t work. Beyond the sheer scale of operations, Agile planning tools like Jira are not designed for high-level requirements, portfolio or complex test management that are critical to highly regulated industries like finance, healthcare and government, where traceability is key. What’s more, migrating to just one tool has myriad issues:
- One generic UI not optimized to the practitioner work environment.
- Increased manual waste via non-value-adding work to share information between tools.
- All of production is halted when it goes down.
- Potentially becomes obsolete as new technology inevitably comes to market.
- Causes frustrations for different levels of business that can’t work within their necessary process.
Instead, these organizations should embrace the diverse toolchains that meet the diverse needs of the workforce that builds and maintains their critical product and service portfolios. By connecting and automating the flow of data between systems, you can augment existing investments while creating a single well-oiled machine designed for the unpredictable rigors of the modern marketplace, one that provides these important benefits:
- Cuts manual overhead and improves cross-team collaboration.
- Creates a visible and traceable workflow.
- Accelerates flow across the entire value stream.
- Accommodates organizational change via mergers and acquisitions.
- Supports new versions and new tools without disrupting workflow.
- Generates an end-to-end data stream from which to extract and build technical and business-level metrics that accelerate value delivery.
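At its core, “connecting and automating the flow of data between systems” means translating an artifact from one tool’s schema into another’s while preserving traceability. The sketch below shows the idea with entirely hypothetical schemas; a real integration would call each vendor’s API and handle authentication, retries and conflict resolution:

```python
# Minimal sketch of tool-to-tool flow automation: a defect raised in a
# test management tool becomes a work item in an Agile planning tool.
# Both record shapes are hypothetical, invented for illustration.

def defect_to_work_item(defect):
    """Map a test-tool defect into an Agile-planning work item."""
    return {
        "title": defect["summary"],
        "type": "Bug",
        # Translate between the two tools' differing priority vocabularies.
        "priority": {"P1": "Highest", "P2": "High"}.get(defect["severity"], "Medium"),
        "source_id": defect["id"],  # back-reference for end-to-end traceability
    }

defect = {"id": "DEF-42", "summary": "Login times out", "severity": "P1"}
work_item = defect_to_work_item(defect)
print(work_item)
```

Because the translation keeps a back-reference to the source record, each team continues working in its own purpose-built tool while the flow of work remains visible and traceable across the value stream.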
So the next time someone tells you that one tool offers a single platform experience to manage the entire value stream, don’t be seduced. If it sounds too good to be true, it most probably is.
Feature image via Pixabay.