Analysis depends on data—and the more sources, the richer the insights. However, bringing data together to enable in-depth analysis is an age-old challenge, one that has only become tougher as data volumes and sources grow. The crucial question is this: where should data reside for analysis?
The traditional approach—extracting data from a source system and loading it into an enterprise data warehouse—is a proven best practice, as it lets IT resolve data quality issues and build models optimized for analysis. But it can carry high development and maintenance costs. The conventional alternative—staging multiple data sources on a desktop and performing a local “mashup”—sacrifices governance and prevents collaboration. The ideal strategy offers a continuum of options that allows organizations to first prove the value of a new data source and then formally onboard it through production IT processes.
Join Ian Macdonald (Principal Technologist, Pyramid Analytics), Patrick Ebert (Enterprise Architect, Pyramid Analytics), and Carsten Weidmann (Technical Alliance Manager, Exasol) as they describe a progressive new approach that preserves business agility and governance while preparing data for integration into an Exasol data warehouse.
Ian Macdonald, Principal Technologist at Pyramid Analytics
Patrick Ebert, Enterprise Architect at Pyramid Analytics
Carsten Weidmann, Technical Alliance Manager at Exasol