Integrating a Data Warehouse and a Data Lake

The Delta Lake open source project integrates data lakes and data warehouses, a needed combination in this new age of scale-out data requiring reliability and controls.
A data lakehouse integrates the advanced data analytics and low-cost storage of a data lake with the performance and reliability of a data warehouse, said Florian Valeye, a data engineer with Back Market, in this episode of The New Stack Makers, recorded at the Open Source Summit in Bilbao, Spain, earlier this fall.
The data warehouse is an approach that emerged over the past two decades. It consists of structured data models that allow for better performance, and its datasets are small and constrained, Valeye said. In contrast, the data lake consists of unstructured data from multiple sources, and data lakes can reach petabytes or even exabytes in size.
Delta Lake, created by Databricks, breaks down the barriers between data warehouses and data lakes by providing more performance and features in the data lake, Valeye said.
ACID transactions are a staple of the data warehouse, Valeye said. In a relational database or data warehouse, the focus is on model representations and data structure. A data lake, by contrast, is an open way to push data and attach a schema, and data lakes magnify data. Through this “wall breaking,” the lakehouse provides ACID transactions, Read, Process, Interpret (RPI) ingestions, and metadata scalability. Its strength is making that knowledge available for any use, without a barrier between data analysts on one side and the data engineering and data science teams on the other.
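Delta Lake delivers those ACID guarantees on top of ordinary object storage through an ordered transaction log (the `_delta_log` directory) of numbered JSON commit files that readers replay to reconstruct a consistent snapshot. The sketch below is a simplified illustration of that idea in plain Python, not Delta Lake's actual implementation; the `commit` and `snapshot` helpers and the action format are hypothetical stand-ins for the real protocol.

```python
import json
import os
import tempfile


def commit(log_dir: str, version: int, actions: list) -> None:
    """Atomically publish commit file <version>.json.

    Writing to a temp file and hard-linking it into place means a
    commit is all-or-nothing, and two writers racing for the same
    version cannot both succeed (os.link refuses to overwrite).
    """
    os.makedirs(log_dir, exist_ok=True)
    fd, tmp = tempfile.mkstemp(dir=log_dir)
    with os.fdopen(fd, "w") as f:
        for action in actions:
            f.write(json.dumps(action) + "\n")
    target = os.path.join(log_dir, f"{version:020d}.json")
    os.link(tmp, target)  # raises FileExistsError on a conflict
    os.remove(tmp)


def snapshot(log_dir: str) -> list:
    """Replay commits in version order to get the current file list."""
    files = set()
    for name in sorted(os.listdir(log_dir)):
        if not name.endswith(".json"):
            continue
        with open(os.path.join(log_dir, name)) as f:
            for line in f:
                action = json.loads(line)
                if "add" in action:
                    files.add(action["add"]["path"])
                elif "remove" in action:
                    files.discard(action["remove"]["path"])
    return sorted(files)
```

A reader calling `snapshot` never sees a half-finished write: until a commit file is published, its data files are invisible, which is the isolation and atomicity a warehouse takes for granted and a raw data lake lacks.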
Databricks is now working on providing ways for anyone to contribute their connectors through Delta Lake, allowing gateways that can be used with, for example, different databases.
“And that’s why it’s really nice when you are working on a specific cloud provider; you don’t want to be bundled, locked inside it,” Valeye said. “So that’s why having this kind of standard format, you can switch and move from one to another, and don’t feel stuck with one provider and one format.”
Valeye said Back Market sells refurbished devices. To assess device quality, the company developed an algorithm that determines whether a device is viable for sale. Back Market uses Delta Lake to ingest data from APIs and other data sources.
Delta Lake is a platform to connect data scientists and data engineers, said Valeye, who previously worked as a data engineer. Before Delta Lake, deploying models could be complicated because the two groups used different tools and programming languages. Delta Lake closes those gaps by letting everyone work on the same infrastructure.
More Episodes from Open Source Summit EU 2023
WebAssembly’s Status in Computing
Powertools for AWS Lambda Grows with Help of Volunteers
How to Be a Better Ally in Open Source Communities
Open Source Development Threatened in Europe