Building financial-grade applications involves performing complex calculations over a wide range of data from different domains, with challenges that include stringent accuracy requirements, latency constraints, and the need to share state across distributed services.
During this session, I will cover how, at Morgan Stanley, we built a real-time, microservices-based Liquidity Management platform using event streaming with the Kafka Streams API to handle high volumes of data and to perform calculations on cross-domain events spanning wide time windows over both the past and the future.
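To give a flavour of the kind of topology this involves, here is a minimal, hypothetical Kafka Streams sketch of a windowed aggregation over incoming events. The topic names, store name, account keys and double-valued amounts are illustrative placeholders, not the platform's actual schema, and a simple tumbling window is used where the talk's forward-looking windows would require additional logic.

```java
import java.time.Duration;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.common.utils.Bytes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.Materialized;
import org.apache.kafka.streams.kstream.Produced;
import org.apache.kafka.streams.kstream.TimeWindows;
import org.apache.kafka.streams.state.WindowStore;

public class LiquidityTopology {
    public static void build(StreamsBuilder builder) {
        // "cashflow-events", keyed by account with a double amount, is purely illustrative.
        KStream<String, Double> cashflows =
                builder.stream("cashflow-events", Consumed.with(Serdes.String(), Serdes.Double()));

        cashflows
                .groupByKey()
                // One-day tumbling windows with a grace period for late-arriving events.
                .windowedBy(TimeWindows.ofSizeAndGrace(Duration.ofDays(1), Duration.ofHours(6)))
                // Sum the amounts per account per window into a named, queryable state store.
                .reduce(Double::sum,
                        Materialized.<String, Double, WindowStore<Bytes, byte[]>>as("daily-liquidity"))
                // Drop the window from the key and publish the running totals downstream.
                .toStream((windowedKey, total) -> windowedKey.key())
                .to("liquidity-positions", Produced.with(Serdes.String(), Serdes.Double()));
    }
}
```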
I will demonstrate how we used Kafka Streams and state stores, along with patterns such as Saga, to achieve eventual data consistency, and how we use state-enriched events to decouple services as events flow through multiple business domains. I will cover mechanisms that ensure accuracy and transparency, with idempotency at their heart, along with error-detection and replay strategies.
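The abstract does not spell out the idempotency mechanism, but as a rough illustration of one common Kafka Streams approach, a processor backed by a persistent state store can drop events whose IDs have already been processed, keeping downstream calculations safe under replay. The class and store names below are hypothetical; the store would need to be registered on the topology and connected to this processor.

```java
import org.apache.kafka.streams.processor.api.Processor;
import org.apache.kafka.streams.processor.api.ProcessorContext;
import org.apache.kafka.streams.processor.api.Record;
import org.apache.kafka.streams.state.KeyValueStore;

// Drops events whose key (event ID) has already been seen, using a state store
// named "processed-ids" (an illustrative name).
public class DeduplicationProcessor implements Processor<String, String, String, String> {
    private KeyValueStore<String, Long> processedIds;
    private ProcessorContext<String, String> context;

    @Override
    public void init(ProcessorContext<String, String> context) {
        this.context = context;
        this.processedIds = context.getStateStore("processed-ids");
    }

    @Override
    public void process(Record<String, String> record) {
        // Forward the event only the first time its key is seen, so that
        // replays of the input do not double-count in downstream calculations.
        if (processedIds.get(record.key()) == null) {
            processedIds.put(record.key(), record.timestamp());
            context.forward(record);
        }
    }
}
```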
Finally, I will look at how we used a high-performance in-memory cache to stage the results of cascaded KStream-based calculation engines, which powered our high-speed, ticking, stateful data visualisations.
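The abstract does not name the cache technology, so purely as a sketch under that caveat, the final results could be drained from an output topic (reusing the illustrative "liquidity-positions" topic from the earlier sketch) into an in-memory map that the visualisation layer queries for the latest value per key; a production platform would use a dedicated caching product rather than a plain map.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.KStream;

public class VisualisationCache {
    // Stand-in for the in-memory cache that the visualisation layer reads.
    private static final Map<String, Double> latestPositions = new ConcurrentHashMap<>();

    public static void wire(StreamsBuilder builder) {
        // "liquidity-positions" is an illustrative topic carrying the final
        // results of the upstream calculation stages.
        KStream<String, Double> results =
                builder.stream("liquidity-positions", Consumed.with(Serdes.String(), Serdes.Double()));

        // Each new result overwrites the previous value for its key, giving the
        // UI a continuously ticking view of the latest state per account.
        results.foreach((account, position) -> latestPositions.put(account, position));
    }

    public static Double latestFor(String account) {
        return latestPositions.get(account);
    }
}
```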