Cut application delivery time by reusing Kafka data structures between projects! Expecting boundaries and data definitions to remain consistent between source and consuming projects can be a constant source of surprises - a Kafka spiderweb. Duplicate datasets with mutations often bring with them monetary, opportunity and reputation costs, as well as errors, inconsistencies, tech debt, a reactive approach to data problems, audit issues, data ownership confusion and data that is not fit for purpose. Solving this requires moving towards uniform datasets, and that move requires an enterprise Domain-Driven Design approach, combined with Event Storming. Doing this architecture work upfront allows for real-time, ubiquitous, distributed data implementations, as opposed to a Kafka spiderweb. This talk will walk through the design, along with a demo illustrating the approach.
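To make the idea of a shared, uniform data structure concrete, here is a minimal Java sketch of one common way to do it: a domain event defined once as an Avro schema and enforced through a Confluent Schema Registry, so producing and consuming projects serialize against the same definition instead of duplicating it. The `OrderPlaced` event, topic name, and endpoint URLs are hypothetical placeholders, not from the talk itself; in practice the schema would live in a shared, versioned artifact rather than inline.

```java
import java.util.Properties;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class OrderPlacedProducer {
    // Hypothetical domain event schema, the product of upfront DDD/Event
    // Storming work; normally published from a shared schemas repository.
    private static final String ORDER_PLACED_SCHEMA = """
        {
          "type": "record",
          "name": "OrderPlaced",
          "namespace": "com.example.orders",
          "fields": [
            {"name": "orderId", "type": "string"},
            {"name": "amountCents", "type": "long"}
          ]
        }""";

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer",
            "org.apache.kafka.common.serialization.StringSerializer");
        // Confluent's Avro serializer registers and validates the schema
        // against Schema Registry, so every project shares one definition
        // and incompatible mutations are rejected at publish time.
        props.put("value.serializer",
            "io.confluent.kafka.serializers.KafkaAvroSerializer");
        props.put("schema.registry.url", "http://localhost:8081");

        Schema schema = new Schema.Parser().parse(ORDER_PLACED_SCHEMA);
        GenericRecord event = new GenericData.Record(schema);
        event.put("orderId", "o-123");
        event.put("amountCents", 4999L);

        try (KafkaProducer<String, GenericRecord> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("orders.order-placed", "o-123", event));
        }
    }
}
```

With registry-enforced compatibility rules, a consuming project reads the same registered schema rather than re-deriving its own copy, which is what keeps the dataset uniform instead of spawning mutated duplicates.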