Why a data mesh?
Predicated on delivering data as a first-class product, data mesh focuses on making it easy to publish and access important data across your organization. An event-driven data mesh combines the scale and performance of data in motion with product-focused rigor and self-service capabilities, putting data front and center in both operational and analytical use cases.
Underpinned by four major principles (domain ownership, data as a product, self-serve data platform, and federated governance), data mesh pairs a renegotiation of social responsibilities around data with modern event streaming technologies. The result is a network of continually updating event streams that provide both fresh information and a historical record, enabling consumers to choose and use the data as they see fit.
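To make that "fresh plus historical" property concrete, here is a minimal sketch of a Kafka consumer reading a data product's event stream. The topic name (orders.v1), broker address, and consumer group are hypothetical placeholders, not anything from the book; the key point is that setting auto.offset.reset to "earliest" lets a new consumer replay the stream's retained history before switching seamlessly to live events.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class OrderEventsReader {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "analytics-team-orders");
        // "earliest" replays the topic's retained history (the historical
        // record) before continuing with fresh events as they arrive.
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            // Subscribe to a hypothetical data product topic.
            consumer.subscribe(List.of("orders.v1"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("offset=%d key=%s value=%s%n",
                            record.offset(), record.key(), record.value());
                }
            }
        }
    }
}
```

Because the consumer, not the producer, decides where to start reading, the same stream can serve an operational service that only wants new events and an analytical job that wants the full history.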
Author Adam Bellemare explains how an event-driven data mesh built on top of Apache Kafka® provides an optimal way to access important business data and unify the operational and analytical planes. He also walks you through a proof-of-concept self-service platform built on Confluent Cloud that ties the data mesh principles to real-world technical tradeoffs.
Don’t just take it from us: a real-world case study shows how Saxo Bank implemented an event-driven data mesh, covering their challenges, their technology choices, and how they put the data mesh principles into practice.