
Event Streaming Architectures to Solve Problems for FinServ


Financial institutions generate a constant stream of data: customers opening and closing accounts and making purchases, withdrawals, and deposits. This requires the status and balance of each account to be continuously updated. With traditional IT infrastructure, data was built as stores, not as flows, leading to tangled webs of interdependencies across systems in a financial institution.

To maintain a competitive advantage in today’s landscape, banks and other financial services companies must be built on constant, real-time data flow. To be successful, they need to be able to use data as soon as it’s generated, whether it’s the swipe of a credit card or a website click-through.

Data streaming platforms provide an architecture for software to react and operate as events occur. Simple, reusable patterns can be applied on top of these systems to meet the design demands of modern real-time distributed systems. Data streaming, for example, makes it much easier to plug in new use cases using the same event streams everyone else is using, or a combination of streams, without having to coordinate directly with the teams that own those systems. That provides some essential benefits for financial institutions:

  • Banks can analyze transactions in real time to detect (and prevent) fraudulent activity as soon as it occurs
  • Investment firms can build event-driven trading platforms that support real-time stock trades
  • Financial institutions can seamlessly aggregate data across lines of business and build centralized compliance hubs to ensure regulatory requirements are met
  • Thousands of global retail banks can process, sync, and analyze customer behavior in real time across their websites, mobile apps, and brick-and-mortar locations

Using an Event-Driven Architecture for Financial Services
In the most basic sense, an event-driven architecture uses events to trigger and communicate between decoupled services. It’s a way to exchange and share data through events. An event is a change in state or an update, like a debit made against a customer’s checking account. An event might trigger a cascade of actions, like verifying a customer’s new address or authorizing a debit charge for a five-dollar latte.

Using events to share changes in data means you can avoid shared database slowdowns, join data easily, and use a push system rather than pull, so the information you need comes to you when you need it.
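As a rough sketch of what that looks like in practice, the Java consumer below subscribes to a hypothetical customer-address-changes topic and reacts to each change as it arrives, instead of repeatedly querying a shared database. The topic name, broker address, and consumer group are illustrative, not part of any standard setup.

    import java.time.Duration;
    import java.util.List;
    import java.util.Properties;

    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;

    public class AddressChangeListener {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092"); // illustrative broker address
            props.put("group.id", "address-verification-service"); // hypothetical consumer group
            props.put("key.deserializer",
                    "org.apache.kafka.common.serialization.StringDeserializer");
            props.put("value.deserializer",
                    "org.apache.kafka.common.serialization.StringDeserializer");

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                // Subscribe once; address-change events then arrive as they are produced,
                // rather than this service polling a shared database for updates.
                consumer.subscribe(List.of("customer-address-changes")); // hypothetical topic
                while (true) {
                    ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                    for (ConsumerRecord<String, String> record : records) {
                        // React to each event, e.g. kick off address verification.
                        System.out.printf("Verify new address for customer %s: %s%n",
                                record.key(), record.value());
                    }
                }
            }
        }
    }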

There are some essential building blocks to know when you’re building event streaming architectures for financial services.

  • Event notifications: These are an indication that, essentially, something happened. That could be a new customer address, for example, that then triggers further action in the application.
  • Event-carried state transfer: Often, an event carries the state change itself, not just a notification that something happened. In a customer address change, for example, the event would contain the actual new address, unlike a bare notification (see the sketch after this list).
  • Event sourcing: This brings in the full history of state changes. An account balance, for example, is the result of a series of deposits and withdrawals; getting to the current state of the account means replaying all of those events, along with other state changes like address or name updates. That full history is also essential for running analytics on the data.
  • Command query responsibility segregation (CQRS): With a CQRS pattern, you build separate paths for your writes (commands) and reads (queries), giving you an asynchronous way to write data and then get a response. To make a purchase, for example, the command is using a debit card to buy a coffee. To complete it, the application has to go through the steps of checking the account balance. With CQRS, the application can listen on both channels, writes and reads, and wait for the tagged response to come back confirming the transaction is OK. It’s a powerful pattern to use and reuse in financial services.
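To make the distinction between the first two building blocks concrete, here is a minimal sketch using Java records; the type and field names are illustrative, not a prescribed schema.

    // Hypothetical event shapes for a customer address change.

    // Event notification: just enough to say "something happened"; a consumer that
    // needs details has to look them up elsewhere.
    record AddressChangedNotification(String customerId, long timestamp) {}

    // Event-carried state transfer: the event carries the new state itself, so a
    // consumer can update its own view without calling back to the source system.
    record AddressChangedEvent(
            String customerId,
            long timestamp,
            String street,
            String city,
            String postalCode) {}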

Communicating among services is also part of a data streaming platform, using commands, events, and queries. The order of commands obviously matters in financial services, and Kafka and Confluent guarantee ordering within a partition (events with the same key arrive in the order they were written), which lets banks run their transactional systems on the platform.
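A minimal producer sketch of that idea, assuming a hypothetical debit-commands topic: keying every command by account ID sends an account’s commands to the same partition, which is what gives you the ordering guarantee described above.

    import java.util.Properties;

    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class DebitCommandProducer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092"); // illustrative broker address
            props.put("key.serializer",
                    "org.apache.kafka.common.serialization.StringSerializer");
            props.put("value.serializer",
                    "org.apache.kafka.common.serialization.StringSerializer");
            // Idempotence avoids duplicates on retries while preserving per-partition order.
            props.put("enable.idempotence", "true");

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                String accountId = "acct-42"; // hypothetical account ID, used as the record key
                // Keying every command by account ID routes them to the same partition,
                // so they are consumed in the order this account issued them.
                producer.send(new ProducerRecord<>("debit-commands", accountId,
                        "{\"type\":\"DEBIT\",\"amountCents\":500}"));
                producer.send(new ProducerRecord<>("debit-commands", accountId,
                        "{\"type\":\"DEBIT\",\"amountCents\":350}"));
                producer.flush();
            }
        }
    }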

Finance Use Cases for Data Streaming 
Banking and finance offer many real-time use cases that involve event streams, starting with customer 360 and foreign exchange, where up-to-the-minute data is crucial. In fraud detection, an event trigger that saves seconds or even milliseconds can add significant value. With reconciliations, event streaming makes it possible to see the current state of the process, since data is reconciled continuously instead of batch processed at the end of the day.

On the IT side, Security Information and Event Management (SIEM) use cases are very common: events are constantly coming in, and it’s crucial to be able to run whatever security analytics you need immediately. There are also straightforward ways to offload processing from mainframes or databases onto a data streaming platform like Confluent.

There are a few aspects of Confluent that are particularly useful for those working in financial services with an event-driven approach. Kafka Streams, a Java library, is the stream processing layer of the Confluent platform. One level higher is ksqlDB (formerly KSQL), a SQL-like abstraction built on top of Kafka Streams.

While a topic is a stream of events coming in, a KStream is an abstraction of those events. It contains every event, whereas a KTable is more like a traditional database table, holding the latest value for each key. Using a bank example, a KStream might have all the individual deposits and withdrawals, whereas the KTable would have the current balance. Both of those can be materialized from the same topics within Kafka, but you use them for different things.

Once a stream is within Kafka, you can perform a number of transformations on it. Some are basic, like filters, which let you derive a new stream from an existing one and then write the results out to another KTable or to an Oracle database.
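Putting those pieces together, here is a small Kafka Streams sketch: it builds a KStream of individual transactions and a KTable of running balances from the same hypothetical account-transactions topic, then filters large debits into a new topic (in practice, a sink connector would typically load that topic into a database such as Oracle). Topic names, value types, and the threshold are illustrative.

    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.kstream.Consumed;
    import org.apache.kafka.streams.kstream.Grouped;
    import org.apache.kafka.streams.kstream.KStream;
    import org.apache.kafka.streams.kstream.KTable;
    import org.apache.kafka.streams.kstream.Produced;

    public class AccountTopology {
        public static StreamsBuilder build() {
            StreamsBuilder builder = new StreamsBuilder();

            // KStream: every individual transaction on the hypothetical
            // "account-transactions" topic, keyed by account ID; the value is the
            // signed amount (deposits positive, withdrawals negative).
            KStream<String, Double> transactions = builder.stream(
                    "account-transactions",
                    Consumed.with(Serdes.String(), Serdes.Double()));

            // KTable: the current balance per account, materialized from the very
            // same events by summing them as they arrive.
            KTable<String, Double> balances = transactions
                    .groupByKey(Grouped.with(Serdes.String(), Serdes.Double()))
                    .reduce(Double::sum);

            // A basic filter: keep only large debits and write them to a new topic
            // for downstream consumers (or a sink connector into a database).
            transactions
                    .filter((accountId, amount) -> amount <= -1000.00)
                    .to("large-debits", Produced.with(Serdes.String(), Serdes.Double()));

            return builder;
        }
    }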

Windowing is also a powerful option in streams and tables, letting you set time boundaries for computations. Typically, if you needed a report at the end of the day, you’d run a batch and have to wait for the results. But setting a one-day window means the processing is happening in real time throughout the day as events are coming in. At the end of the day, the answer is available in seconds. Or, build a ten-minute window to aggregate the number of failed logins per user. If someone hits that limit, that could trigger an event that makes a fraud or security app automatically lock out the account. 
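A sketch of that failed-login example, assuming a hypothetical failed-logins topic keyed by user ID: counting events per user in ten-minute windows yields a continuously updated table that a fraud or security application could watch for accounts crossing a threshold.

    import java.time.Duration;

    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.kstream.Consumed;
    import org.apache.kafka.streams.kstream.Grouped;
    import org.apache.kafka.streams.kstream.KStream;
    import org.apache.kafka.streams.kstream.KTable;
    import org.apache.kafka.streams.kstream.TimeWindows;
    import org.apache.kafka.streams.kstream.Windowed;

    public class FailedLoginWindows {
        public static StreamsBuilder build() {
            StreamsBuilder builder = new StreamsBuilder();

            // Hypothetical topic of failed-login events, keyed by user ID.
            KStream<String, String> failedLogins = builder.stream(
                    "failed-logins",
                    Consumed.with(Serdes.String(), Serdes.String()));

            // Count failures per user in ten-minute windows; the counts update in
            // real time as events arrive, instead of waiting for an end-of-day batch.
            KTable<Windowed<String>, Long> failuresPerWindow = failedLogins
                    .groupByKey(Grouped.with(Serdes.String(), Serdes.String()))
                    .windowedBy(TimeWindows.ofSizeWithNoGrace(Duration.ofMinutes(10)))
                    .count();

            return builder;
        }
    }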

Finally, enrichment with streams, tables, or both allows you to see different pieces of data from different places and how they come together. You might have a KTable with all your customers’ latest account information. It’s always going to have the most up-to-date information as events come in and things change. Then, if new information comes in, like an address change, you can join that to your KTable to create an enriched stream.
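A sketch of that kind of enrichment, assuming hypothetical customer-profiles and customer-address-changes topics, both keyed by customer ID: each incoming address change is joined to the customer’s latest profile in the KTable to produce an enriched stream.

    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.kstream.Consumed;
    import org.apache.kafka.streams.kstream.Joined;
    import org.apache.kafka.streams.kstream.KStream;
    import org.apache.kafka.streams.kstream.KTable;

    public class AddressEnrichment {
        public static StreamsBuilder build() {
            StreamsBuilder builder = new StreamsBuilder();

            // KTable of the latest known profile per customer (hypothetical topic,
            // keyed by customer ID, value = profile JSON as a string).
            KTable<String, String> customerProfiles = builder.table(
                    "customer-profiles",
                    Consumed.with(Serdes.String(), Serdes.String()));

            // Stream of address-change events, also keyed by customer ID.
            KStream<String, String> addressChanges = builder.stream(
                    "customer-address-changes",
                    Consumed.with(Serdes.String(), Serdes.String()));

            // Join each address change to the customer's current profile, producing
            // an enriched event downstream services can use without extra lookups.
            KStream<String, String> enriched = addressChanges.join(
                    customerProfiles,
                    (newAddress, profile) -> "{\"profile\":" + profile
                            + ",\"newAddress\":" + newAddress + "}",
                    Joined.with(Serdes.String(), Serdes.String(), Serdes.String()));

            enriched.to("enriched-address-changes");

            return builder;
        }
    }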

Data streaming and event-driven architectures can provide a secure, centralized platform for financial services companies to move faster and continually improve customer experiences. To learn more, please visit Confluent’s Financial Services resource center.

  • Russ Katz’s career began with building some of the very first enterprise “eCommerce” sites and now spans over 25 years on the cutting edge. Now a Customer Success Technical Architect at Confluent, he continues to work with many of the world’s largest organizations, bringing them into the modern data era of event streaming with Apache Kafka.

