Matillion Stretches Data Value Chain Beyond Engineers with Low-Code
Data integration tools provider Matillion has moved to simplify data integration for citizen data scientists and analysts with a low-code/no-code approach.
Data preparation is essential to deriving insights from data. Large enterprises often struggle to make raw data useful and ready for AI/machine learning (ML) and analytics at the required pace. Data integration tools simplify this time-consuming step by integrating data from multiple sources, loading it into the chosen destination analytics platform and transforming it to make it analytics-ready.
For decades, the industry has had ETL (extract-transform-load) tools that let specialists do this work. To add value in today’s environment of high data volumes and digital transformation, integration platforms must enable not just data engineers, but also business intelligence developers, database administrators (DBAs) and business analysts to do this work. This lets team members in those roles create value from data, rather than just consuming data sets that have already been prepared.
Case in Point: Matillion
Matillion is a data integration provider dedicated to taking on this challenge. It takes a low-code, cloud-native approach, providing a platform purpose-built for large enterprise customers to implement their data-driven, digitally transformed practices and culture.
Matillion allows users to extract data, load it into cloud data lakes and data warehouses, then transform it on those cloud data platforms to make it ready for analytics. Users do this by designing and implementing data pipelines using Matillion’s low-code/no-code, drag-and-drop user interface and operationalizing those pipelines with Matillion’s built-in job scheduler and monitoring capabilities.
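A drag-and-drop designer ultimately produces a pipeline of ordered steps that a scheduler runs unattended. As a rough, hypothetical sketch of that idea (the function names and scheduler here are illustrative stand-ins, not Matillion’s product or API), a pipeline and its scheduled execution might look like:

```python
import sched
import time

results = []  # collects load counts so a run can be inspected/monitored

# Hypothetical pipeline steps; a real platform would swap these for
# connectors to data sources and a cloud data warehouse.
def extract():
    return [{"id": 1, "amount": 10.0}, {"id": 2, "amount": 24.5}]

def load(rows):
    # Pretend the rows landed in the warehouse; report how many.
    return len(rows)

def transform():
    print("transform ran in-warehouse")

def run_pipeline():
    rows = extract()
    results.append(load(rows))
    transform()

# Operationalize the pipeline: a stdlib stand-in for a built-in job
# scheduler, running the job once after a short delay (a real scheduler
# would use cron-style rules and add monitoring and alerting).
scheduler = sched.scheduler(time.time, time.sleep)
scheduler.enter(0.1, 1, run_pipeline)
scheduler.run()
```

The value of a low-code tool is that users assemble and wire these steps visually rather than writing orchestration code like this by hand.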
Other data integration and transformation players, like Fishtown Analytics (now dbt Labs) with its dbt product, and Fivetran, which integrates with dbt, take a more code-oriented approach to data transformation. And while players like Alteryx employ a low-code strategy similar to Matillion’s, they apply the transformations before the data is loaded into the destination platform, rather than afterward. Matillion, though not alone in doing so, combines the ELT (extract-load-transform, rather than extract-transform-load) paradigm with visual pipeline design.
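To make the ETL-versus-ELT distinction concrete, here is a minimal sketch of the ELT pattern in Python, using an in-memory SQLite database as a stand-in for a cloud data warehouse (the table and column names are illustrative assumptions, not any vendor’s actual schema or API):

```python
import sqlite3

# "Extract": raw order records pulled from a hypothetical source system.
raw_orders = [
    ("2021-09-01", "widget", 3, 9.99),
    ("2021-09-01", "gadget", 1, 24.50),
    ("2021-09-02", "widget", 2, 9.99),
]

# "Load": land the raw data in the destination platform first, untransformed.
# (SQLite stands in for a cloud data warehouse here.)
db = sqlite3.connect(":memory:")
db.execute(
    "CREATE TABLE raw_orders (order_date TEXT, sku TEXT, qty INTEGER, unit_price REAL)"
)
db.executemany("INSERT INTO raw_orders VALUES (?, ?, ?, ?)", raw_orders)

# "Transform": run SQL inside the warehouse to produce an analytics-ready table.
# Under ETL, this aggregation would instead happen *before* loading.
db.execute("""
    CREATE TABLE daily_revenue AS
    SELECT order_date, SUM(qty * unit_price) AS revenue
    FROM raw_orders
    GROUP BY order_date
""")

for row in db.execute("SELECT * FROM daily_revenue ORDER BY order_date"):
    print(row)
```

The point of ELT is that the transformation runs as SQL inside the destination platform, using its compute, and the untouched raw table remains available for new transformations later; under ETL, only the already-summarized result would land in the warehouse.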
Matillion’s alignment with the phenomenon of empowering non-specialists to do data integration work is likely one big reason why the company was able to raise $150M in Series E funding, in a round led by General Atlantic, with participation from Battery Ventures, Sapphire Ventures, Scale Venture Partners, and Lightspeed Venture Partners.
Integration as OS
The New Stack spoke with Matillion founder and CEO Matthew Scullion to get a better understanding of how the company sees the data integration space, where the innovation in it is, and how that will influence where Matillion invests and spends the new funding dollars.
Scullion explained that the money will fund Matillion’s development of what the company sees as a data operating system that takes care of all enterprise data needs. Scullion said, “we’ve already built a very comprehensive data operating system for the enterprise, but we’re not done doing that, and we will continue on our journey…until all needs of the enterprise, in terms of loading, transforming, synchronizing and orchestrating their data in the cloud, are taken care of.”
Scullion added that Matillion feels the Series E round is “a great affirmation of the product market fit and vision we’ve got for this market.”
Living up to Potential
Data is the new hot commodity, but in its raw and fragmented form, its value is low. Enterprises that can integrate and transform data — to make it useful for AI, ML and analytics — are the ones that can reach data-driven insights quickly. Enabling not just data engineers, but also team members with domain expertise and corresponding intimacy with the data, to shape and prepare that data will accelerate an organization’s progress toward this goal.
In other words, democratizing data requires democratizing data integration, too. We expect to see this become a shared goal among all data integration providers, whether they identify their platforms as ETL, ELT, data preparation or data pipeline solutions.