In a world of instrumented products, assets, infrastructure, and devices, it is impossible to “store and then analyze” all events, so applications that depend on a continuously updated view of the evolving world have to analyze, learn, and predict directly from streaming events. They need to build models on the fly whose predictions are accurate and in sync with the real world, often to support automation. Many insights depend on analyzing the joint evolution of data sources whose behavior is correlated in time or space.

In this talk we present Swim, an Apache 2.0-licensed platform for continuous intelligence applications. Swim builds a fluid model of data sources and their changing relationships in real time: Swim applications analyze, learn, and predict directly from event data. Swim applications integrate with Apache Kafka for event streaming, and developers need nothing more than Java skills. Swim deploys natively or in containers on Kubernetes, with the same code in each instance. Instances link to form an application-layer mesh that facilitates distribution and massive scale without sacrificing consistency.

We will present several continuous intelligence applications in use today that depend on real-time analysis, learning, and prediction to power automation and deliver responses that are in sync with the real world. We will show how easy it is to build, deploy, and run distributed, highly available event streaming applications that analyze data from hundreds of millions of sources, processing petabytes per day. The architecture is intuitively appealing and blazingly fast.
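To make the developer model concrete, here is a minimal sketch of what a stateful Swim agent in Java might look like: a small object that holds the live state for one data source and reacts to every new event as it arrives. The class and lane names (VehicleAgent, "speed") are illustrative assumptions, not code from the talk, and the imports follow the open-source SwimOS Java API (swim.api.agent.AbstractAgent, swim.api.lane.ValueLane).

```java
import swim.api.SwimLane;
import swim.api.agent.AbstractAgent;
import swim.api.lane.ValueLane;

// Hypothetical agent: one instance per vehicle, holding its latest
// observed speed and reacting to each update as it streams in.
public class VehicleAgent extends AbstractAgent {

  // A lane is a named, streaming-observable piece of agent state.
  @SwimLane("speed")
  ValueLane<Double> speed = this.<Double>valueLane()
      .didSet((newSpeed, oldSpeed) -> {
        // React in place to each event, e.g. update a running model
        // or raise an alert, instead of storing and querying later.
        System.out.println("speed updated: " + oldSpeed + " -> " + newSpeed);
      });
}
```

In a deployment along the lines described above, events from a Kafka topic would be forwarded to the agent representing each source, and agents link to one another's lanes so that the application-layer mesh keeps correlated state continuously in sync across instances.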