
Revolutionizing Telemedicine with Data Streaming


Telemedicine has widened access to healthcare across the globe. It has allowed people to access services such as video consultations, virtual diagnosis, and remote condition monitoring, helping them to overcome common challenges to obtaining healthcare like geographical distance from medical centers and long wait times. 

As telemedicine services become more integrated within healthcare systems, however, their effectiveness is highly dependent on their underlying data infrastructures.

This post explores some of the challenges related to data infrastructure in telemedicine services, and explains how data streaming with Confluent Cloud can help to address them. We’ll also consider data formats commonly used in healthcare and demonstrate a high-level example architecture for a telemedicine service. 

Technical challenges 

Implementing a telemedicine service poses a number of technical challenges for data teams: such services must operate within strict regulatory boundaries while securely and reliably managing high volumes of often complex medical data from different sources. To deliver an effective telemedicine service, teams must navigate the following challenges: 

  • Managing security and compliance requirements – Telemedicine platforms must ensure the secure transmission of patient health data and adhere to evolving healthcare regulations such as HIPAA in the United States and GDPR in the European Union. This involves encrypting personal information, preventing unauthorized access to data, enforcing data retention policies, and maintaining data accuracy for audit purposes. These tasks are rendered more difficult by siloed, monolithic data architectures which rely on multitudes of point-to-point connections. 

  • Ensuring scalable, highly available, and reliable data pipelines – The criticality of remote healthcare services means that they must be able to deal with unpredictable throughputs (e.g., caused by spikes in demand) and avoid downtime or disruptions. This necessitates a distributed, elastically scaling data architecture with redundancy and failover mechanisms, which is often lacking with data infrastructure based on “traditional” messaging queues. 

  • Integrating healthcare data – Telemedicine solutions must seamlessly integrate with hospital information systems and other healthcare technologies. Achieving this integration can be technically complex due to the variety of systems and standards in use.

The solution – data streaming for telemedicine

We’ve helped a number of telemedicine providers overcome these challenges by implementing architectures based on data streaming. 

Data streaming involves the continual transfer of “events” between source and destination systems. It allows organizations to simplify their data infrastructures and deliver a diverse range of “data products” from a single platform, avoiding the need to build siloed, monolithic applications.

Apache Kafka® is the leading data streaming technology, used by over 70% of Fortune 500 companies. Many organizations use Kafka as the backbone of their telemedicine services as it enables the scalable integration and processing of healthcare data from multiple systems. 

Confluent Cloud – cloud-native, complete, and everywhere 

Confluent Cloud, based on Apache Kafka and powered by the Kora engine, is a complete, cloud-native data streaming platform which is available “everywhere,” and is used by telemedicine providers as the foundation of their digital healthcare services. Here’s why: 

  • Cloud-native – Confluent Cloud is a fully managed cloud Kafka service that delivers elastic scalability, a 99.99% uptime SLA, and significantly improved p99 latency (especially for high-throughput workloads) over open source Kafka. Confluent Cloud provides telemedicine organizations with a fault-tolerant and reliable data streaming platform.

  • Complete – Confluent Cloud offers a number of features that go beyond Kafka and enable organizations to rapidly drive value with data streaming. These include a library of over 120 pre-built connectors (70 of which are available fully managed), a stream processing framework based on Apache Flink, and a full suite of security and governance tools (e.g., field-level encryption). 

  • Everywhere – Confluent Cloud is deployable as a fully managed service on AWS, Azure, and Google Cloud across 60+ regions, and Confluent Platform is deployable on premises.

Confluent for telemedicine – data structure and example architecture 

Before considering an example architecture for a telemedicine solution on Confluent Cloud, let’s first take a look at the structure of events flowing through the system. As you’ll see, the structure of events is fundamental in achieving interoperability between different systems within the healthcare datasphere. 

HL7 FHIR (Fast Healthcare Interoperability Resources) is the standard protocol used for exchanging digital healthcare information. It provides a framework for the exchange, integration, sharing, and retrieval of healthcare data in a secure and efficient manner. HL7 FHIR has these key features:

  • Resource-oriented: Resources are the building blocks of healthcare information, such as patients, practitioners, medications, and observations. These resources are organized into a hierarchical structure, making it easy to work with specific pieces of data.

  • RESTful APIs: Representational State Transfer (REST) principles are utilized for easy and secure data exchange. This aligns with the requirements of telemedicine, where real-time communication is crucial.

  • Modular and extensible: It’s designed to accommodate evolving healthcare standards, allowing for easy extensions and customizations to meet the unique needs of telemedicine applications.

  • Multiple serialization formats: Resources can be represented in different formats, such as XML, JSON, and Turtle (a format used to represent Resource Description Framework (RDF) data).

As a widely used and lightweight format, JSON is the default serialization format for FHIR resources. To illustrate, here's an example Patient resource:

{
  "resourceType": "Patient",
  "id": "pat1",
  "text": {
    "status": "generated",
    "div": "<div xmlns=\"http://www.w3.org/1999/xhtml\">\n      \n      <p>Patient Dorothy Gale @ Acme Healthcare, Inc. MR = 998877</p>\n    \n    </div>"
  },
  "identifier": [
    {
      "use": "usual",
      "type": {
        "coding": [
          {
            "system": "http://terminology.hl7.org/CodeSystem/v2-0203",
            "code": "MR"
          }
        ]
      },
      "system": "urn:oid:0.7.6.5.4.3.2.1",
      "value": "654321"
    }
  ],
  "active": true,
  "name": [
    {
      "use": "official",
      "family": "Gale",
      "given": [
        "Dorothy"
      ]
    }
  ],
  "gender": "female",  
  "contact": [
    {
      "relationship": [
        {
          "coding": [
            {
              "system": "http://terminology.hl7.org/CodeSystem/v2-0131",
              "code": "E"
            }
          ]
        }
      ],
      "organization": {
        "reference": "Organization/1",
        "display": "Land of Oz Corporation"
      }
    }
  ],
  "managingOrganization": {
    "reference": "Organization/1",
    "display": "ACME Healthcare, Inc"
  },
  "link": [
    {
      "other": {
        "reference": "Patient/pat2"
      },
      "type": "seealso"
    }
  ]
}
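To move resources like this through Kafka, each one can be serialized into a record key and value. The following is a minimal sketch (not from the original post); the topic name and the choice of partitioning key are illustrative assumptions.

```python
import json

def patient_to_record(resource: dict) -> tuple:
    """Key events by the FHIR resource id so that all updates for one
    patient land on the same partition and keep their ordering."""
    if resource.get("resourceType") != "Patient":
        raise ValueError("expected a Patient resource")
    key = resource["id"].encode("utf-8")
    value = json.dumps(resource, separators=(",", ":")).encode("utf-8")
    return key, value

patient = {"resourceType": "Patient", "id": "pat1", "active": True}
key, value = patient_to_record(patient)
# With a Kafka client (e.g., confluent-kafka's Producer), the record could
# then be published: producer.produce("fhir.patient", key=key, value=value)
```

Keying by patient id keeps all events for a given patient ordered within one partition, which downstream consumers can rely on.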

Architecture

Telemedicine services require the integration of data from multiple sources, both internal and external (i.e., third party). Confluent Cloud provides the foundation for streaming, processing, and governing this data, creating “data products” which can be consumed by downstream applications.

In order to integrate healthcare data from multiple sources and systems (e.g., data stores or producers in hospitals, clinics, diagnostic centers, health trackers), this data must conform to the HL7 FHIR protocol. While most modern healthcare infrastructure and software use or are able to produce HL7 FHIR entities, some legacy systems may require a dedicated adaptor to translate their standards into the protocol. 

In the following example architecture, we’ve deployed an Application Gateway to function both as an API endpoint and as a message translator for incompatible systems that need to publish events to the telemedicine ecosystem.

(Example architecture diagram)

In this architecture, all clients send events to the Application Gateway, which will then publish to Confluent Cloud. Before publishing to Confluent Cloud, however, personally identifiable information (PII) is encrypted in the application layer.
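The application-layer encryption step can be sketched as follows. This is a simplified illustration: the PII field list and envelope format are assumptions, and `encrypt_fn` stands in for a real cipher (e.g., AES-GCM) or a managed field-level encryption feature.

```python
import base64
import json
from typing import Callable

# Fields of an FHIR Patient resource treated as PII in this sketch.
PII_FIELDS = ("name", "identifier", "contact")

def encrypt_pii(resource: dict, encrypt_fn: Callable[[bytes], bytes]) -> dict:
    """Encrypt PII fields before publishing, leaving non-sensitive
    fields readable for downstream processing."""
    out = dict(resource)
    for field in PII_FIELDS:
        if field in out:
            plaintext = json.dumps(out[field]).encode("utf-8")
            ciphertext = encrypt_fn(plaintext)
            out[field] = {"encrypted": base64.b64encode(ciphertext).decode("ascii")}
    return out

# Placeholder "cipher" (byte reversal), used only so the sketch is
# runnable -- never use anything like this in production.
demo = encrypt_pii(
    {"resourceType": "Patient", "id": "pat1",
     "name": [{"family": "Gale"}], "gender": "female"},
    encrypt_fn=lambda b: b[::-1],
)
```

Encrypting only the PII fields (rather than the whole event) lets stream processors continue to filter and join on non-sensitive attributes.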

Once events are flowing into Confluent Cloud, stream processing (e.g., with Flink) can be used to join, filter, and enrich streams in real time. This enables streaming telemedicine applications, such as an alerting system that prompts patients to schedule a virtual clinical session with a consultant based on a “risk score,” calculated from electronic health records (EHR) and data transmitted by health trackers. 
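As a toy illustration (plain Python, not Flink code) of the kind of enrichment such a job might perform: join a live tracker reading with an EHR record by patient id and derive a risk score. The field names and the scoring formula are illustrative assumptions.

```python
# Stand-in for the EHR lookup side of the join.
ehr_by_patient = {"pat1": {"chronic_conditions": 2}}

def risk_score(reading: dict) -> float:
    """Weight an elevated heart rate by the patient's chronic conditions."""
    ehr = ehr_by_patient.get(reading["patient_id"], {"chronic_conditions": 0})
    return (reading["heart_rate"] / 60.0) * (1 + ehr["chronic_conditions"])

def should_alert(reading: dict, threshold: float = 2.0) -> bool:
    """Prompt the patient to book a virtual session above the threshold."""
    return risk_score(reading) > threshold

alert = should_alert({"patient_id": "pat1", "heart_rate": 95})
```

In production, the same join-and-score logic would run continuously over the tracker and EHR streams rather than against an in-memory dictionary.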

Processed data is then synced to downstream datastores via fully managed connectors on Confluent Cloud for use in real-time applications or historical analysis. 

To ensure that telemedicine services are not disrupted, data can be replicated across multiple cloud availability regions using Cluster Linking. In the event of data center or cluster failure, consumers can begin to read from a disaster recovery (DR) cluster in another region. Moreover, with a bidirectional cluster link, producers can also write to DR clusters. 
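The consumer-side failover can be sketched as a configuration switch. The cluster addresses, group id, and health signal below are illustrative assumptions; in practice, failover is typically driven by monitoring and operational tooling.

```python
PRIMARY = "pkc-primary.us-east-1.aws.confluent.cloud:9092"
DR = "pkc-dr.us-west-2.aws.confluent.cloud:9092"

def consumer_config(primary_healthy: bool) -> dict:
    """Return a Kafka consumer configuration that points at the DR
    cluster when the primary region is unreachable."""
    return {
        "bootstrap.servers": PRIMARY if primary_healthy else DR,
        "group.id": "telemedicine-alerts",
        # Cluster Linking syncs consumer offsets, so after failover the
        # group can resume close to where it left off on the DR cluster.
        "auto.offset.reset": "latest",
    }
```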

Data streaming for digital healthcare 

Telemedicine services are widening access to high-quality healthcare. They allow people to get personalized medical advice and basic diagnostics regardless of where they live, helping to cut wait times and costs (both to the patient, if applicable, and the healthcare provider).

As telemedicine applications become more integral to wider healthcare systems, however, their underlying data infrastructures must be re-examined. In order to deliver a reliable service, these applications need to be able to integrate sensitive healthcare data in a secure, compliant way while managing unpredictable throughputs and guaranteeing low latency. 

Data streaming with Confluent Cloud meets these requirements. As a complete, cloud-native data streaming platform that’s deployable across all major cloud providers (and on premises), it provides the foundation for future-proofed telemedicine applications. 

If you’d like to experience Confluent Cloud, sign up for a free 30-day trial period today. You’ll gain access to a library of tutorials, helping you to deploy streaming data pipelines in minutes. 

About the author

Stefano Linguerri is a Solutions Engineer at Confluent, where he works with customers in the public sector to help them modernize and implement new data streaming architectures. He previously worked at Red Hat, first in delivery and later in the pre-sales team, where he focused on application modernization and microservices architectures. He has a passion for technology and how it can help customers solve their problems, and believes that better solutions, fast-moving software, and the replacement of custom implementations with industry standards like Confluent can lower risk and drastically reduce time to market.
