Scaling Data Pipelines @Magenta Telekom

Nov 4, 2025 · Georg Heiler · 1 min read
Abstract
Making humans and machines collaborate efficiently.
Date
Nov 4, 2025 8:30 PM
Location
Online

Magenta Telekom ingests many terabytes of new data every day, and every downstream consumer wants it immediately. The real bottleneck turned out to be not hardware but humans wrestling with hidden, hard-wired dependencies across hundreds of heterogeneous pipelines, often spread across tool silos.

Our fix was to treat every data asset as a node in a data-dependency graph and every transformation as an edge. Ingestion, transformation, AI, and BI are all part of the same executable graph. With suitable abstractions and dependency injection, less technical users are empowered to contribute business logic that can be operationalized efficiently.

This talk covers:

  • Unified asset graph: ingest → transforms → reports → ML, all in one lineage-aware DAG.
  • Event-based pipelines: events propagate state changes across the enterprise along the graph's edges in near real time.
  • Dependency injection: the right abstractions empower more users along the data value chain to contribute.

Operational challenges are handled by the abstractions, so analysts focus only on the business logic.
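The idea of assets as nodes and transformations as edges, with dependencies injected for the contributor, can be sketched in a few lines of Python. This is a minimal illustrative sketch, not Magenta Telekom's actual framework: asset names (`raw_events`, `cleaned_events`, `revenue_report`) and the registration mechanism are assumptions for the example. Each function's parameter names declare its upstream assets, and a small runner resolves the graph and injects results.

```python
# Minimal sketch of a lineage-aware asset graph with dependency injection.
# Asset names and logic are illustrative, not an actual production pipeline.
import inspect
from graphlib import TopologicalSorter

ASSETS = {}  # asset name -> function that materializes it


def asset(fn):
    """Register a function as an asset; its parameters name its upstream assets."""
    ASSETS[fn.__name__] = fn
    return fn


@asset
def raw_events():
    # Stand-in for an ingestion step.
    return [{"user": "a", "amount": 3}, {"user": "b", "amount": 5}]


@asset
def cleaned_events(raw_events):
    # A transformation: depends on raw_events, which is injected by the runner.
    return [e for e in raw_events if e["amount"] > 0]


@asset
def revenue_report(cleaned_events):
    # A downstream BI asset; the analyst only writes this business logic.
    return sum(e["amount"] for e in cleaned_events)


def materialize():
    """Topologically sort the asset graph and inject upstream results."""
    deps = {name: set(inspect.signature(fn).parameters) for name, fn in ASSETS.items()}
    results = {}
    for name in TopologicalSorter(deps).static_order():
        results[name] = ASSETS[name](**{d: results[d] for d in deps[name]})
    return results


print(materialize()["revenue_report"])  # -> 8
```

The contributor never wires up I/O or ordering: declaring a parameter is declaring an edge in the DAG, and the runner handles scheduling and injection, which is the separation of concerns the talk describes.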

Authors
Georg Heiler, senior data expert. My research interests include large geo-spatial time and network data analytics.