Getting actionable insights from events and addressing common business scenarios usually requires bridging several technologies and programming models to achieve the expected business outcome and deliver the flexibility modern applications need.
This talk shows how to concretely wire together Apache Kafka, to collect and distribute real-time events; Apache Flink and stream processing, to reduce the noise and transform raw events into business events; and a rule engine, to analyze complex patterns of events, detect business-meaningful situations in context, and leverage business rules to implement the detection logic. Many Apache Flink users across industries already combine these technologies to make their event-driven solutions more powerful and more flexible. Built on a fictitious but realistic airline scenario, the talk demonstrates how to technically integrate a rule engine such as Red Hat Drools with Kafka and Apache Flink, pinpoints how stream processing and complex event processing are complementary yet different, and highlights where one needs to pay attention when scaling the solution to a production environment.
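To make the shape of the integration concrete, here is a minimal sketch of one way to wire the three layers together: a Flink job consumes raw events from a Kafka topic, filters out noise, and hands each surviving event to a Drools session for rule evaluation. This is not code from the talk; the topic name, consumer group, KieSession name, and the heartbeat-filtering logic are all hypothetical placeholders chosen for illustration.

```java
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.functions.RichFlatMapFunction;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.util.Collector;
import org.kie.api.KieServices;
import org.kie.api.runtime.KieContainer;
import org.kie.api.runtime.KieSession;

public class KafkaFlinkDroolsJob {

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // 1. Kafka collects and distributes the raw, real-time events.
        KafkaSource<String> source = KafkaSource.<String>builder()
                .setBootstrapServers("localhost:9092")
                .setTopics("flight-events")                 // hypothetical topic name
                .setGroupId("cep-demo")
                .setStartingOffsets(OffsetsInitializer.latest())
                .setValueOnlyDeserializer(new SimpleStringSchema())
                .build();

        env.fromSource(source, WatermarkStrategy.noWatermarks(), "flight-events")
                // 2. Flink reduces the noise: drop heartbeats, keep business events.
                .filter(raw -> !raw.contains("\"type\":\"heartbeat\""))
                // 3. The rule engine detects business-meaningful situations.
                .flatMap(new DroolsEvaluator())
                .print();

        env.execute("kafka-flink-drools");
    }

    /**
     * Evaluates each event against a Drools rule session. A KieSession is
     * not serializable, so it is created per task instance in open() rather
     * than in the driver.
     */
    static class DroolsEvaluator extends RichFlatMapFunction<String, String> {
        private transient KieSession session;

        @Override
        public void open(Configuration parameters) {
            KieContainer container = KieServices.Factory.get().getKieClasspathContainer();
            // "FlightDelaysSession" is a hypothetical session defined in kmodule.xml.
            session = container.newKieSession("FlightDelaysSession");
        }

        @Override
        public void flatMap(String event, Collector<String> out) {
            // A real job would deserialize the JSON into a typed fact class
            // that the rules match on; the raw string stands in for brevity.
            session.insert(event);
            int fired = session.fireAllRules();
            if (fired > 0) {
                out.collect("rules fired for event: " + event);
            }
        }

        @Override
        public void close() {
            if (session != null) {
                session.dispose();
            }
        }
    }
}
```

The division of labor mirrors the complementarity the talk describes: Flink handles the high-volume, stateful streaming work, while Drools keeps the detection logic declarative and editable by business users; keeping the rule session per task instance is also where scaling to production deserves attention, since rule working memory grows independently on each parallel subtask.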