
Real-Time Pharmaceutical Authorization

Written by Dave Kline

Acquiring a single prescription medication is a complex end-to-end process that requires the seamless orchestration of data across drug manufacturers, distributors, providers, and pharmacies in the healthcare ecosystem. Whether it's the distribution of Schedule II medications to specific pharmacies or a provider writing prescriptions, adherence to the stringent standards set by regulatory agencies such as the FDA is not just a matter of legality but also of patient safety. Providers are required to stay up to date on medication classifications, dosage regimens, and potential interactions, while pharmacies must have authorization to dispense certain medications.

Today’s healthcare industry is rapidly modernizing – there’s greater IoT connectivity, proliferating data volumes, new digital competitors, and high patient expectations (e.g., 1-hour prescription pickup and instant mobile notifications). Data streaming enables real-time authorization to streamline the ordering and distribution process, reducing operational burden and ensuring that patients receive the appropriate medications on time.

Learn about data streaming for healthcare, with AI and ML to unlock new use cases →

Consider a system where every medication and device is scrutinized in real time, ensuring that only authorized providers and patients have access (e.g., Class II medical devices such as CPAP machines can only be sent to authorized providers). This is the essence of a compliance matrix – a source of truth that maintains information on medication classifications, provider authorizations, and regulatory guidelines. A compliance matrix draws on data from various sources, including: FDA policies and laws such as the Drug Supply Chain Security Act (DSCSA), which requires the track and trace of prescription drug shipments; internal authorization systems like JD Edwards; and provider databases stored on technologies like Oracle and IBM DB2.

A data streaming platform integrates and processes this disparate data as it changes in real time, helping the compliance matrix continuously generate a holistic, up-to-the-second view of medication authorization status and enabling real-time decisions about medication distribution. The benefits of streaming data for real-time authorizations include:

  • Cost savings and preventing medicine waste. Verifying that pharmacies are approved to dispense medications, and proactively flagging any issues in real time before distributors ship, minimizes the risk of rerouting shipments from distribution warehouses or wasting medications when authorization is revoked or never obtained.

  • Ensuring compliance while safeguarding patient safety. Non-compliance may result in regulatory action and delays in patient care – pharmaceutical companies risk hefty fines while pharmacies and providers face the threat of losing their operating licenses. With real-time data for the compliance matrix, these risks are mitigated through proactive monitoring and enforcement of authorization protocols. 

  • More accurate outcomes with greater efficiency. Streaming data can be processed and used in a compliance matrix for faster decision-making based on a real-time view of substances and authorizations without manual information collection and updates – increasing automation, lowering latency, and reducing the chance of error. 

Today’s infrastructure and data challenges

The journey toward real-time authorization is not without its challenges, particularly across large healthcare organizations with siloed data systems. Batch-based data silos make it difficult to keep a compliance matrix up to date and introduce latency at every step of processing in the pharmaceutical supply chain, with inefficiency having a cascading business impact. Moving from batch to real time can bring cost savings in the millions. 

Learn how to transform your data pipelines, transform your business →

Today, when issues such as duplicate orders arise, a pharmacy has to phone a call center and wait for customer service to confirm that there should only be one order. The patient may not receive their medication on time, and further delays and costly manual effort may be needed to reconcile data discrepancies.

The key challenges include: 

  • Batch data processing that can take weeks or months, creating stale data and potential gaps from data loss between batches. 

  • Legacy technologies such as mainframe, SAP, JMS message queues, and expensive on-premises databases. 

  • Point-to-point integrations limiting availability of data, inhibiting scale, and creating compliance concerns.

  • The need to write custom code and APIs, or engage systems integrators, to build customized solutions for accessing data sources like SAP.

  • Siloed data preventing access and hindering the ability to find and share relevant data.  

  • Lack of awareness regarding what data exists and whether the data is reliable. 

  • Duplicate data contributing to a data mess, which creates data coherency issues and increases storage and networking costs.

  • Time spent maintaining platforms instead of building new capabilities and improving patient outcomes.

Data streaming solution with Confluent

With Confluent, healthcare organizations can use real-time data to enable immediate authorization for timely and compliant pharmaceutical distribution for providers. Confluent provides a fully managed data streaming platform to help stream, connect, process, and govern data at scale.

Stream data across on-premises, hybrid, and multicloud environments. Pharmaceutical distributors can deploy hybrid use cases by running Confluent Platform at local distribution warehouses – able to keep processing data offline in the event of a network outage – alongside the highly elastic, resilient, and performant Confluent Cloud, powered by the Kora Engine, to centralize data in any cloud. Cluster Linking mirrors topics between environments to ensure high availability for 24/7 operations.

Connect data across the operational and analytical systems involved in the distribution supply chain by using pre-built, fully managed source and sink connectors. Bring together disparate data – including drug serial numbers, patient medical records (PHI, EHR, EMR), provider information, and telemetry data (e.g., GPS from trucks at local distribution warehouses) – to maintain a compliance matrix that keeps track of the latest medication and provider information. Future-proof your architecture with streaming data pipelines to ensure the right data, in the right format, gets to the right place.

Process data in flight with Flink to create data products – such as curated lists of providers, medications, pharmacies, and shipping partners – for additional business units to leverage. Join multiple tables together to build materialized, real-time views of providers and authorizations. Stream processing unlocks greater automation and efficiency. For example, join prescription drug shipment numbers with GPS data from trucks to calculate how long it took a serial number to travel from a distribution center to its target pharmacy, and use that insight to optimize delivery routes for faster fulfillment for patients.
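To make that concrete, here is a minimal sketch of such a join using Flink's Table API in Java. All table and column names (shipments, pharmacy_receipts, serial_number, shipped_at, received_at, delivery_status_sink) are hypothetical, and the sketch assumes those tables have already been registered over Kafka topics; on Confluent Cloud for Apache Flink, topics surface as tables automatically and the same logic is typically written directly in Flink SQL.

```java
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class DeliveryDurationSketch {
    public static void main(String[] args) {
        // Streaming TableEnvironment; in a real deployment this points at the
        // Flink cluster that processes the Kafka topics described above.
        TableEnvironment tEnv =
                TableEnvironment.create(EnvironmentSettings.inStreamingMode());

        // Hypothetical inputs: `shipments` (serial numbers leaving a warehouse)
        // and `pharmacy_receipts` (the same serial numbers scanned on arrival),
        // both assumed to be registered over Kafka topics beforehand.
        tEnv.executeSql(
                "CREATE TEMPORARY VIEW delivery_status AS "
                + "SELECT s.serial_number, "
                + "       s.origin_warehouse, "
                + "       r.pharmacy_id, "
                + "       TIMESTAMPDIFF(MINUTE, s.shipped_at, r.received_at) AS transit_minutes "
                + "FROM shipments AS s "
                + "JOIN pharmacy_receipts AS r "
                + "  ON s.serial_number = r.serial_number");

        // Continuously materialize the view into a sink table (e.g., another
        // Kafka topic) so downstream apps can optimize delivery routes.
        tEnv.executeSql(
                "INSERT INTO delivery_status_sink SELECT * FROM delivery_status");
    }
}
```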

Govern data by using Stream Governance to ensure compliance and security. Data Portal makes data products discoverable and accessible in a self-service way across the organization, amplifying data value through reuse. It automates authorization, access control, and provisioning, and simplifies the way data is exchanged today while providing an on-ramp for new use cases.

Solution implementation

Here’s an overview of the hybrid deployment architecture for this real-time authorization use case, which uses both Confluent Platform and Confluent Cloud:

(Diagram: hybrid deployment architecture spanning Confluent Platform in the data center and Confluent Cloud)

Confluent’s technical solution for this use case comprises the following:

  • Confluent clusters are deployed in a data center as well as on the cloud provider of choice (AWS, Azure, or GCP).

  • In a data center, provider data and transportation data are stored within IBM DB2, legacy apps & systems, and a MySQL database. Change data capture (CDC) and MySQL source connectors are used to write to topics in Confluent Platform. 

  • At the same time, external medication data is written to topics in Confluent Cloud, while Cluster Linking mirrors provider data into topics in the cloud.

  • Stream processing joins and transforms data to create data products for the compliance matrix to have a real-time view of the latest medications and authorized providers: approved_transactions, eligible_providers, ineligible_providers.

  • Data is shared downstream with MongoDB Atlas, Elasticsearch, distribution trucks and apps, as well as receiving pharmacies. 

(Diagram: stream processing with Flink joining topics to create data products for the compliance matrix)

As shown above, data from an ERP data store and legacy apps and systems are written to topics in Confluent. Stream processing with Flink transforms data streams in real time to create data products. For example, the external_medications and erp_provider streams are joined to create a live list of ineligible_providers. Those streams are subsequently joined with erp_provider_address to approve transactions and create materialized views for delivery_status and eligible_providers. These data products can then be shared downstream with the delivery fleet and various data stores, apps, and systems to fulfill prescription orders.
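As an illustrative sketch of those joins, the snippet below again uses Flink's Table API in Java. The post doesn't specify the exact eligibility rule or schemas, so the column names (authorized_schedule, dea_schedule, license_status, ship_to_address) and the business logic are assumptions for illustration only; the input tables (external_medications, erp_provider, erp_provider_address) and the approved_transactions sink are assumed to already be registered over Kafka topics.

```java
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class ComplianceMatrixSketch {
    public static void main(String[] args) {
        TableEnvironment tEnv =
                TableEnvironment.create(EnvironmentSettings.inStreamingMode());

        // Hypothetical rule: pair each provider with the medications whose DEA
        // schedule they are authorized for; providers whose license has lapsed
        // land on the ineligible list.
        tEnv.executeSql(
                "CREATE TEMPORARY VIEW ineligible_providers AS "
                + "SELECT p.provider_id, m.medication_id "
                + "FROM erp_provider AS p "
                + "JOIN external_medications AS m "
                + "  ON p.authorized_schedule = m.dea_schedule "
                + "WHERE p.license_status <> 'ACTIVE'");

        // Eligible providers hold an active license; enrich them with a
        // shipping address from erp_provider_address before approval.
        tEnv.executeSql(
                "CREATE TEMPORARY VIEW eligible_providers AS "
                + "SELECT p.provider_id, m.medication_id, a.ship_to_address "
                + "FROM erp_provider AS p "
                + "JOIN external_medications AS m "
                + "  ON p.authorized_schedule = m.dea_schedule "
                + "JOIN erp_provider_address AS a "
                + "  ON a.provider_id = p.provider_id "
                + "WHERE p.license_status = 'ACTIVE'");

        // Materialize approved transactions for downstream fulfillment systems
        // (assumes an `approved_transactions` sink table already exists).
        tEnv.executeSql(
                "INSERT INTO approved_transactions SELECT * FROM eligible_providers");
    }
}
```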

Learn how to use stream processing with Flink on Confluent →

Conclusion

By democratizing access to streaming data pertaining to medication distribution to pharmacies, Confluent helps healthcare organizations make real-time authorization a reality and significantly cut down on time, errors, and unnecessary expenditures. 

This not only streamlines the authorization process but also ensures compliance with laws such as DSCSA. It also enhances the ability to detect and prevent fraud, such as when the same medication appears at the same provider twice. With real-time data feeding anomaly detection algorithms, organizations can swiftly identify suspicious patterns and flag potential instances of Medicare/Medicaid fraud. This proactive approach protects patient safety and safeguards against financial losses.
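A simple version of that duplicate check can be expressed as a windowed aggregation in Flink. The sketch below (Java, Flink Table API) flags any serial number reported by the same provider more than once within a day; the topic name, schema, and broker address are hypothetical, and the CREATE TABLE uses the open source Kafka connector syntax (on Confluent Cloud for Apache Flink, topics appear as tables automatically, so that step would differ).

```java
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class DuplicateDispenseAlertSketch {
    public static void main(String[] args) {
        TableEnvironment tEnv =
                TableEnvironment.create(EnvironmentSettings.inStreamingMode());

        // Hypothetical event stream of dispensing/receiving scans read from a
        // Kafka topic; the watermark makes event_time usable for windowing.
        tEnv.executeSql(
                "CREATE TABLE dispense_events ("
                + "  serial_number STRING, "
                + "  provider_id   STRING, "
                + "  event_time    TIMESTAMP(3), "
                + "  WATERMARK FOR event_time AS event_time - INTERVAL '5' MINUTE"
                + ") WITH ("
                + "  'connector' = 'kafka', "
                + "  'topic' = 'dispense_events', "
                + "  'properties.bootstrap.servers' = 'broker:9092', "
                + "  'format' = 'json', "
                + "  'scan.startup.mode' = 'earliest-offset'"
                + ")");

        // Flag any serial number reported by the same provider more than once
        // within a one-day tumbling window: a candidate fraud signal.
        tEnv.executeSql(
                "SELECT window_start, window_end, serial_number, provider_id, "
                + "       COUNT(*) AS times_seen "
                + "FROM TABLE(TUMBLE(TABLE dispense_events, DESCRIPTOR(event_time), INTERVAL '1' DAY)) "
                + "GROUP BY window_start, window_end, serial_number, provider_id "
                + "HAVING COUNT(*) > 1")
            .print();
    }
}
```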

Embracing a hybrid cloud architecture not only modernizes and future-proofs healthcare data infrastructure, but also unlocks new use cases. Streaming data can power real-time analytics that help distributors expand product offerings to additional providers and optimize shipping volumes and lead times.

Furthermore, the integration of new AI technologies with data streaming opens up more possibilities for innovation. From providing real-time support to healthcare providers and pharmacists through AI chatbots to automatically changing medication distribution routes due to live traffic conditions, the potential for enhancing fulfillment and operational efficiency is boundless. AI can even assist in identifying potential negative drug interactions, ensuring patient safety remains a top priority. 

Leveraging Confluent for real-time authorization streamlines medication distribution and paves the way for a more transparent and patient-centric future. By embracing the power of streaming data, the healthcare industry can drive new opportunities for growth, cost reduction, and improved patient outcomes.

Dave Kline is a Staff Solutions Engineer at Confluent. With over a decade of experience putting open-source technology into practice, Dave specializes in crafting and implementing data streaming solutions that significantly impact Fortune 10 customers in healthcare, energy, and retail.

