CI/CD / Contributed

Global Reporting: 5 Reasons Why Tool and Team Alignment Are Wastes of Time

25 Feb 2019 5:00am, by
Tina Dankwart
Tina Dankwart is a Senior Consultant for Tasktop Technologies. She has 14 years of experience building complex solutions, reports and integrations, as well as assisting with in-house professional synchronization tools, for some of the largest application lifecycle management (ALM) customers worldwide. Previously, Tina served as a senior consultant for Mercury/HPE/Micro Focus in the ALM space, building complex technical solutions, integrations and reports. In her free time, she enjoys climbing and hiking.

The challenge of global reporting — tracking activities and outcomes across teams, tools and regions — is not a new one. Even in the sheltered and unrealistic world of “one stack serves all” ALM, in which I spent almost 14 years of my career, this challenge was ever present. You may think work was simple with one stack, hosted as SaaS, where everything is in one place, but let me paint a quick picture that many will recognize as typical of major operations.

Imagine 15,000 end users operating across three continents. Imagine a wealth of different client operating systems, outsourced IT (different ones for different regions, naturally), four different testing companies tasked with test creation and execution, multiple time zones, 400 projects and one connection made every two seconds.

Quickly, your beautiful single stack turns into three “single” stacks. There’s that one office that cannot upgrade its ALM tool because it’s still stuck in the dark ages of Windows 98. Then there is another team that urgently needs the very latest version of the tool to match the latest version of their QA tool for some SAP testing. Lastly, a recent acquisition has brought in their own version of the ALM tool. And they’re all on a different database. Obviously.

Moreover, each one of these pockets of users — ranging anywhere between 100 to 1,000 — will be guided by their own processes, using their own customizations, list values and, of course, local reports. On the surface, there seems no harm in this approach; each team is working the way that best suits them. That is until something goes wrong “somewhere” along the line that damages the quality of the end product.

Now imagine one little consultant, sitting in her cozy office near London somewhere, minding her own business, looking forward to a nice (if probably not very sunny) day, unaware of her impending doom. The CIO of this one large client wants to know: “Please could you tell me why the quality of the product is so poor?”

Now you might think: it’s SaaS and one stack, so surely all data is in one place. How hard can it possibly be to create some seemingly simple reports? Turns out, very. There are 400 projects, hundreds of fields and lists and spellings and processes. Even counting all open defects and categorizing them by product line and severity proved to be a major challenge (just how many defect categories could there possibly be?).
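The “how many defect categories” problem is easy to make concrete. Here is a minimal sketch with hypothetical sample records (the project and severity values are my own illustration, not taken from any specific ALM tool), showing how variant spellings of the same severity defeat a naive count:

```python
from collections import Counter

# Hypothetical open-defect records exported from three projects; every
# severity value below means "critical," but each project customized it.
defects = [
    {"project": "EU-Web",   "severity": "Critical"},
    {"project": "US-Core",  "severity": "Sev 1"},
    {"project": "APAC-SAP", "severity": "1 - Critical"},
    {"project": "EU-Web",   "severity": "critical"},
]

by_severity = Counter(d["severity"] for d in defects)
print(by_severity)
# Four records that mean the same thing land in four different buckets,
# so a per-severity report silently under-counts every category.
```

Multiply those four variants by 400 projects and hundreds of fields, and “count the open defects by severity” stops being a simple query.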

We spent many weeks and months producing reports, always up against the same problems: too many variations on common fields, list values and processes. And after a while, the solution seemed obvious: alignment.

Alignment can take many different forms. Within a stack, you could try to make everyone use the same fields and list values. Some ALM vendors have attempted this through cross-project customization, which was born from the many requests for central control (for the purpose of reporting). This approach has its own drawbacks.

Anyone who has ever tried to retrospectively implement cross-project customization on existing databases will be well aware of the difficulty of aligning the data: old records carry outdated or redundant list values, and editing historical data causes problems with data integrity and auditing, to name just two issues.

Or you may try to make everyone use the same processes and projects, or move everyone onto the same tool. This, of course, goes against all Agile principles and will cause no small amount of pushback from the teams. People like to work the way they like to work, in the tool they like, with their own customizations and, of course, their product of choice.

I spent the next few years attempting to help customers implement these different forms of alignment; here are the most common reasons why alignment is doomed to failure and a waste of time and effort:

  1. It is expensive: Retraining users, new infrastructure and licenses, not to mention the employees or consultants you will need to implement such a huge change, all demand a lot of time and money.
  2. Sometimes it is not possible: There may be good reasons for having different versions or tools (suppliers, compatibilities with operating systems or other tools, to name a couple).
  3. It is disruptive: Users need to get on with what they are meant to be doing. Big changes require data migrations, time for retraining, cooperation from users and team leads.
  4. It doesn’t actually make sense: Agile has taught us “power to the teams.” In many cases it makes a lot of sense for users to have different processes and ways of working. They should be allowed to use the best-of-breed tools and processes of their choice to make them as efficient as possible.
  5. It is not scalable and is inflexible: Alignment is a never-ending process. Once you have completed the alignment, the teams will either find ways to maintain their own customizations or another team will be onboarded, a company will be acquired, and/or new product lines will be added.

I spent many years of my life helping customers align their operations. At one point, we were trying to split some projects which had become too large because the customer had tried to get around the reporting problems by putting 2,000 users into one project. This is when I came across the concept of synchronization. I had only ever considered this as a way of integrating two different tools for the sake of better communication or collaboration, but it soon dawned on me that this might also be the answer to all of my reporting woes.

I created a reporting project and flowed all of the important items into this project (without attachments or anything else unnecessary). I should point out here that it is infinitely preferable to use a separate reporting database rather than a workspace within the tools. Only then can you become truly tool-independent, and a dedicated database gives you greater freedom in your choice of reporting facilities, further encouraging the best-of-breed concept.

Lifecycle tools are experts at whatever they were purchased and implemented for — defect tracking, test management or requirements capture — but not reporting. Reporting is available but often limited and inefficient. If you instead flow artifacts (such as defects, requirements or features) into a separate database, you have the choice of the best reporting tools on the market, rather than being limited to whatever your defect tracking tool (for example) will offer you. This practice also removes the considerable load that complex, inefficient or duplicated end-user reports place on the tools themselves.
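The flow itself can be pictured with a minimal sketch, assuming each source tool can export its artifacts as plain records. The table layout, tool names and field names below are my own illustration, not any particular product’s schema:

```python
import sqlite3

# A tool-agnostic reporting store: one flat table any SQL-speaking
# reporting tool can query, regardless of which ALM/QA tool fed it.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE artifacts (
        source_tool TEXT,   -- which ALM/QA tool the record came from
        source_id   TEXT,   -- the record's id inside that tool
        kind        TEXT,   -- 'defect', 'requirement', 'feature', ...
        status      TEXT,
        severity    TEXT,
        product     TEXT
    )
""")

def flow(tool, records):
    """Flow exported records into the reporting store (no attachments)."""
    conn.executemany(
        "INSERT INTO artifacts VALUES (?, ?, ?, ?, ?, ?)",
        [(tool, r["id"], r["kind"], r["status"], r["severity"], r["product"])
         for r in records],
    )

flow("alm_eu", [{"id": "D-101", "kind": "defect", "status": "Open",
                 "severity": "Critical", "product": "Payments"}])
flow("qa_apac", [{"id": "7", "kind": "defect", "status": "open",
                  "severity": "Sev 1", "product": "Payments"}])

# One query now spans every source tool:
rows = conn.execute(
    "SELECT source_tool, COUNT(*) FROM artifacts "
    "GROUP BY source_tool ORDER BY source_tool"
).fetchall()
print(rows)
```

Because the store is just a database, you can put Power BI, Tableau or plain SQL on top of it — whichever reporting tool is best of breed for you — instead of the built-in reports of any one lifecycle tool.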

Why Synchronization Is the Answer

  1. No cooperation from teams needed: There is no pushback and no disruption. You can report on supplier data without having to give general access to their tools or environment. Teams can use best-of-breed tools and processes.
  2. No more long lead times for new reports: Reports are immediate because the normalized data is already there.
  3. No change needed, no disruption: There is no retraining, changes of process, new tools or infrastructure.
  4. Scalable: New tools and teams can be added without fuss or problems. Data is normalized by creating mappings once.
  5. Flexible and tool agnostic: Old tools can sunset slowly, and new tools integrated seamlessly without breaking reports.
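The “creating mappings once” idea in point four can be sketched like this. The tool names, list values and canonical codes are invented for illustration; the point is that each tool gets exactly one mapping, defined once, and every report downstream sees a single vocabulary:

```python
# Hypothetical one-time mappings from each tool's local list values to
# a canonical severity code. Onboarding a new tool or acquisition means
# adding one more mapping here -- no report has to change.
SEVERITY_MAPS = {
    "alm_eu":  {"Critical": "S1", "High": "S2", "Medium": "S3"},
    "qa_apac": {"Sev 1": "S1", "Sev 2": "S2", "Sev 3": "S3"},
}

def normalize(tool, record):
    """Return a copy of the record with its severity mapped to the
    canonical code; unknown values are flagged rather than dropped."""
    out = dict(record)
    out["severity"] = SEVERITY_MAPS[tool].get(record["severity"], "UNMAPPED")
    return out

print(normalize("qa_apac", {"id": "7", "severity": "Sev 1"}))
# {'id': '7', 'severity': 'S1'}
```

The “UNMAPPED” fallback is deliberate: it surfaces new or misspelled list values in the reports instead of silently losing records, which is how the mappings stay current as teams evolve.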

The need for global reporting isn’t going to go away as organizations seek to measure and continuously improve the business value of their software delivery. If anything, it will become more complicated to achieve as the next shiny new tool appears on the horizon. We need accurate real-time reporting. For an accurate overview of our operation, we need to analyze the data to make better decisions, to improve quality and efficiency and to determine progress. Without it, we are flying blind.

With sophisticated synchronization, you can allow your teams to do what they need, with the tools they want and with the most effective processes while still achieving near real-time reporting across the entire landscape with a minimum amount of cost, effort and disruption. In other words, have your cake and eat it, too.

Feature image via Pixabay.