Confluent Developer ft. Tim Berglund, Adi Polak & Viktor Gamov
Placing Apache Kafka at the Heart of a Data Revolution at Saxo Bank
19 Aug 2021
Monolithic applications present challenges for organizations like Saxo Bank, including difficulty transitioning to the cloud, inefficient use of data, and the burden of managing data in a regulated environment. Graham Stirling, the head of data platforms at Saxo Bank and a self-proclaimed recovering architect on the pathway to delivery, shares his experience over the last two and a half years as Saxo Bank placed Apache Kafka® at the heart of the company, something they call a data revolution.

Before adopting Kafka, Saxo Bank encountered scalability problems. It relied on a centralized data engineering team, used the database as an integration point, and treated the data warehouse as the center of the analytical universe. That approach needed to evolve. For a better data strategy, Graham turned his attention to embracing a data mesh architecture:

- Create a self-serve platform that enables domain teams to publish and consume data assets
- Federate ownership of domain data models, with centralized oversight, so that a standard language can emerge while ensuring information efficiency
- Treat data as a product to improve business decisions and processes

Data mesh was first defined by Zhamak Dehghani in 2019 as a data platform architecture paradigm, and it has since become an integral part of Saxo Bank's approach to data in motion. Using a combination of Kafka, GitOps, pipelines, and metadata, Graham intended to free domain teams from the mechanics, such as connector deployment, language bindings, style guide adherence, and the handling of personally identifiable information (PII).

To reduce operational complexity, Graham recognized the importance of using Confluent Schema Registry as a serving layer for metadata. Saxo Bank authored schemas with Avro IDL for composability and standardization, and later made the switch to Buf for strongly typed metadata (a sketch of the schema-as-API idea appears after the list below). A further layer of metadata allows Saxo Bank to define FpML-like coding schemes that specify information classification, reference external standards, and link semantically related concepts.

By embracing the data mesh operating model, Saxo Bank scales data processing in a way that was previously unimaginable, allowing it to generate value sustainably and to use data more efficiently.

Tune in to this episode to learn more about the following:

- Data mesh
- Topic/schema as an API
- Data as a product
- Kafka as a fundamental building block of data strategy
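To make "topic/schema as an API" concrete, here is a minimal, hypothetical sketch of such a contract in Avro IDL, the format the episode says Saxo Bank started with. The protocol, record, and field names, and the @confidentiality annotation, are illustrative assumptions rather than Saxo Bank's actual schemas; Avro IDL does allow arbitrary Java-style annotations like these, which become custom properties on the generated schema that platform tooling can read.

    // Hypothetical Avro IDL contract: the schema itself is the topic's API.
    @namespace("com.example.trades") // illustrative namespace, not Saxo Bank's
    protocol TradeEvents {
      record TradeExecuted {
        string tradeId;
        // Assumed custom annotation marking a field as PII, so that
        // self-serve platform tooling can apply masking or retention rules.
        @confidentiality("PII")
        string counterpartyName;
        @logicalType("timestamp-millis")
        long executedAt;
        double notional;
        string currency; // could reference an external standard such as ISO 4217
      }
    }

Registered with Confluent Schema Registry, a contract like this gives every producer and consumer a single, versioned source of truth for both the data's shape and its classification metadata.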
EPISODE LINKS

- Zhamak Dehghani Kafka Summit 2021 Keynote
- Data Mesh 101 Course
- Data Mesh Principles and Logical Architecture
- Saxo Bank's Best Practices for a Distributed Domain-Driven Architecture

SEASON 2
Hosted by Tim Berglund, Adi Polak and Viktor Gamov
Produced and Edited by Noelle Gallagher, Peter Furia and Nurie Mohamed
Music by Coastal Kites
Artwork by Phil Vo

🎧 Subscribe to Confluent Developer wherever you listen to podcasts.
▶️ Subscribe on YouTube, and hit the 🔔 to catch new episodes.
👍 If you enjoyed this, please leave us a rating.
🎧 Confluent also has a podcast for tech leaders: "Life Is But A Stream" hosted by our friend, Joseph Morais.