Financial institutions generate a constant stream of data: customers opening and closing accounts and making purchases, withdrawals, and deposits. This requires the status and balance of each account to be continuously updated. Traditional IT infrastructure treated data as stores rather than flows, leading to tangled webs of interdependencies across a financial institution's systems.
To maintain a competitive advantage in today’s landscape, banks and other financial services companies are built on constant, real-time data flow. To be successful, they need to be able to utilize data as soon as it’s generated, whether it be the swipe of a credit card or a website click-through.
Data streaming platforms provide an architecture for software to react and operate as events occur. Simple and reusable patterns can be applied over these systems to help meet the design demands of modern real-time distributed systems. Data streaming, for example, makes it much easier to plug in new use cases using the same event streams everyone else is using, or a combination of streams, without having to interface with those other groups. This provides some essential benefits for financial institutions.
Using an Event-Driven Architecture for Financial Services
In the basic sense, event-driven architecture uses events to trigger and communicate between decoupled services. It’s a way to exchange and share data through events. An event is a change in state or an update, like a debit made against a customer’s checking account. An event might trigger a cascade of actions, like verifying a customer’s new address or authorizing a debit charge for a five-dollar latte.
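To make this concrete, here is a minimal sketch in plain Python (not any Kafka API) of an event as an immutable record of a state change, with a decoupled handler reacting to it. The event type, the toy authorization rule, and all names are illustrative assumptions, not part of any real system.

```python
from dataclasses import dataclass

# Hypothetical event record: a debit made against a checking account.
@dataclass(frozen=True)
class DebitEvent:
    account_id: str
    amount: float  # dollars

# Toy authorization rule: decline unusually large debits.
def authorize(event: DebitEvent) -> bool:
    return event.amount <= 100.0

# A decoupled service reacts to the event and may trigger further actions.
def handle(event: DebitEvent) -> str:
    return "authorized" if authorize(event) else "declined"

print(handle(DebitEvent("acct-1", 5.00)))   # the five-dollar latte
```

In a real event-driven system, each consuming service would subscribe to the event stream independently rather than being called directly.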
Using events to share changes in data means you can avoid shared database slowdowns, join data easily, and use a push system rather than pull, so the information you need from the data comes to you when you need it.
There are some essential building blocks to know when you’re building event streaming architectures for financial services.
Communicating among services is also part of a data streaming platform, using commands, events, and queries. The order of commands obviously matters in financial services, so Kafka and Confluent guarantee ordering, which lets banks run their transactional systems on the platform.
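Kafka's ordering guarantee is per partition: events with the same key are routed to the same partition, so all commands for one account are processed in the order they arrived. The sketch below illustrates that routing idea in plain Python; the CRC32 hash is a stand-in (Kafka's default partitioner actually uses murmur2), and the function name is illustrative.

```python
from zlib import crc32

# Sketch of key-based partitioning: same key -> same partition,
# so events for one account keep their relative order.
def partition_for(key: str, num_partitions: int) -> int:
    return crc32(key.encode("utf-8")) % num_partitions

# Every debit for "acct-1" lands in the same partition.
p = partition_for("acct-1", 6)
print(p)
```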
Finance Use Cases for Data Streaming
Banking and finance offer many real-time use cases that involve event streams, starting with customer 360 events and foreign exchanges, where up-to-the-minute data updates are crucial. In fraud detection, an event trigger that saves seconds or even milliseconds can add significant value. With reconciliations, event streaming makes it possible to see the current state of the process, since data is being reconciled continuously, instead of batch processed at the end of the day.
On the IT side, Security Information and Event Management (SIEM) use cases are very common; events are constantly coming in, and it’s crucial to make sure you can perform whatever security analytics you need to immediately. There are also easy ways to transfer processing load off of mainframes or databases and into a data streaming platform like Confluent.
There are a few aspects of Confluent that are particularly useful for those working in financial services with an event-driven approach. Kafka Streams, a Java library, is the stream processing layer on Confluent Platform. One level higher sits KSQL (now ksqlDB), a SQL-like abstraction of Kafka Streams.
While a topic is a stream of events coming in, a KStream is an abstraction of those events. It contains every event, whereas a KTable is more like a traditional database table, holding the latest value per key. Using a bank example, a KStream might have all the debits and withdrawals, whereas the KTable would have the current balance. Both of those can be materialized from the same topics within Kafka, but you use them for different things.
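The stream-versus-table distinction can be sketched in plain Python (this is conceptual, not the Kafka Streams API): the stream keeps every event, while the table folds those events into the latest state per key. Account IDs, amounts, and the opening balance are made-up example values.

```python
# KStream-like: every debit event, in order.
debits_stream = [
    ("acct-1", -20.0),
    ("acct-2", -5.0),
    ("acct-1", -30.0),
]

# KTable-like: current balance per account, materialized from the stream.
opening_balance = 100.0
balances_table = {}
for account, amount in debits_stream:
    balances_table[account] = balances_table.get(account, opening_balance) + amount

print(balances_table)  # {'acct-1': 50.0, 'acct-2': 95.0}
```

The same input events back both views; which one you use depends on whether you need the full history or just the current state.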
Once a stream is within Kafka, you can perform a number of transformations on it. Some are basic, like filters: you can build a new stream containing only the events that match a condition, then write the results out to another topic, a KTable, or an external store such as an Oracle database.
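A filter transformation can be sketched in one line of plain Python (the Kafka Streams equivalent would be `stream.filter(...)`); the events and the size threshold here are illustrative assumptions.

```python
# Incoming debit events: (account, amount).
events = [("acct-1", -20.0), ("acct-2", -500.0), ("acct-3", -4.5)]

# Filter: keep only large debits, e.g. for a review queue.
large_debits = [e for e in events if abs(e[1]) >= 100.0]

print(large_debits)  # [('acct-2', -500.0)]
```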
Windowing is also a powerful option in streams and tables, letting you set time boundaries for computations. Typically, if you needed a report at the end of the day, you’d run a batch and have to wait for the results. But setting a one-day window means the processing is happening in real time throughout the day as events are coming in. At the end of the day, the answer is available in seconds. Or, build a ten-minute window to aggregate the number of failed logins per user. If someone hits that limit, that could trigger an event that makes a fraud or security app automatically lock out the account.
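The ten-minute failed-login window described above can be sketched in plain Python (a tumbling window, not the Kafka Streams windowing API); the users, timestamps, and lockout threshold of three failures are illustrative assumptions.

```python
from collections import defaultdict

WINDOW_SECONDS = 600          # ten-minute tumbling window
LOCKOUT_THRESHOLD = 3         # illustrative limit on failed logins

failed_logins = [             # (user, unix timestamp of failure)
    ("alice", 0), ("alice", 120), ("alice", 590),
    ("alice", 700),           # falls into the next window
    ("bob", 30),
]

# (user, window start) -> count of failures in that window
counts = defaultdict(int)
for user, ts in failed_logins:
    window_start = ts - ts % WINDOW_SECONDS
    counts[(user, window_start)] += 1
    if counts[(user, window_start)] >= LOCKOUT_THRESHOLD:
        # This is the trigger a fraud or security app would act on.
        print(f"lock out {user}")
```

Because counts are maintained as events arrive, the answer is already computed when the window closes, rather than waiting for an end-of-day batch.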
Finally, enrichment with streams, tables, or both allows you to see different pieces of data from different places and how they come together. You might have a KTable with all your customers’ latest account information. It’s always going to have the most up-to-date information as events come in and things change. Then, if new information comes in, like an address change, you can join that to your KTable to create an enriched stream.
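The address-change example above amounts to a stream-table join, sketched here in plain Python (conceptual, not the Kafka Streams join API); the customer records and IDs are made-up example data.

```python
# KTable-like: latest record per customer.
customers = {
    "c-1": {"name": "Ada", "address": "1 Old St"},
}

# Incoming event stream: (customer_id, new address).
address_changes = [("c-1", "9 New Ave")]

enriched = []
for customer_id, new_address in address_changes:
    # Join the event with the current table row to build an enriched event.
    record = dict(customers[customer_id], address=new_address)
    customers[customer_id] = record         # table keeps the latest state
    enriched.append((customer_id, record))  # enriched output stream

print(enriched)
```

The table always reflects the most recent state, so any later event joining against it sees the updated address.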
Data streaming and event-driven architectures can provide a secure, centralized platform for financial services companies to move faster and continually improve customer experiences. To learn more, please visit Confluent's Financial Services resource center.