The main goal of the present case study is to assess the feasibility of applying the Apache Kafka paradigm to the exchange of healthcare information. Initially, a simple HL7 message, generated in accordance with the documentation, was used as the source message. That message should then reach its destination using the FHIR standard. Apache Kafka, deployed from a Confluent Docker image, was implemented as the communication middleware and handler. To map between HL7 versions, a Python program converts the original message into a JSON object, and the Kafka API then handles the socket interface between the structures. Regarding the load test, the results were very positive: 6000 messages were integrated in under one minute with an offset of 500 messages. The system is also capable of maintaining message order and recovering from errors. It is therefore possible to use Kafka to share JSON objects under the FHIR standard, provided that the topic, consumer, and producer configuration are defined in advance.
Introduction
In healthcare, eHealth initiatives have transformed how data are stored, shared, and reused (Digital health, 2019). Interoperability is the capability of two or more programs or systems to exchange information over a common medium and, above all, to understand and reuse that data to produce information (F. Rodrigues, 2013).
Working in favor of interoperability for more than 30 years, HL7 International and its proposed standards are among the most used worldwide (HL7.org, 2011). Healthcare institutions use the HL7 V2, V3, or Fast Healthcare Interoperability Resources (FHIR) standards to perform all kinds of communication within and across organizations. Usage examples include requesting patients' medical exams, retrieving administrative information, or publishing reports from previously ordered exams. V2 is almost 30 years old and is still the most used worldwide. FHIR, the most recent standard, intends to increase semantic interoperability and reduce ambiguity. It builds on the best features of the previous versions, namely the Reference Information Model (RIM), a real game-changer when released with V3 of the standard. More focused on content, the FHIR standard opens the door to new and innovative ways to exchange information (technical interoperability) through the introduction of RESTful web service models (Sousa, R. et al., 2021).
In contrast, V2 interfaces share information over synchronous network interfaces, the old Transmission Control Protocol/Internet Protocol (TCP/IP) layer. The resulting architecture is simple: a sender, a receiver, and a communication medium. The sender waits for a formal confirmation from the receiver, the so-called ACK or formal acknowledgment, indicating whether the message was delivered successfully. Besides socket limitations and difficult initial configuration, the existing structures are prone to errors and communication limitations, and recovery from an error is very complex and requires manual intervention.
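The synchronous V2 exchange described above can be sketched as follows. This is a minimal illustration of the MLLP-style framing and acknowledgment logic commonly used for V2 over TCP/IP; the message content and helper names are hypothetical, and a real interface engine would handle full segment parsing and error (NAK) cases.

```python
# Minimal sketch of the synchronous HL7 V2 exchange (MLLP-style framing).
# Message content and function names are illustrative, not a real interface.
START_BLOCK = b"\x0b"   # MLLP start-of-block byte
END_BLOCK = b"\x1c\r"   # MLLP end-of-block byte + carriage return

def frame(message: str) -> bytes:
    """Wrap an HL7 V2 message in MLLP framing before writing it to a socket."""
    return START_BLOCK + message.encode("utf-8") + END_BLOCK

def build_ack(message: str) -> str:
    """Build a minimal ACK (MSA|AA = application accept) echoing the
    message control ID taken from MSH-10 of the received message."""
    msh_fields = message.split("\r")[0].split("|")
    control_id = msh_fields[9] if len(msh_fields) > 9 else ""
    return ("MSH|^~\\&|RECEIVER|FAC|SENDER|FAC|||ACK|1|P|2.4\r"
            "MSA|AA|" + control_id)

msg = "MSH|^~\\&|SENDER|FAC|RECEIVER|FAC|20240101||ADT^A01|MSG001|P|2.4"
print(build_ack(msg).split("\r")[1])  # -> MSA|AA|MSG001
```

The sender blocks until this ACK arrives, which is exactly the one-to-one, synchronous coupling the case study seeks to replace.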
The concept of a publish/subscribe messaging system has been proliferating in the big data universe due to its capacity to handle enormous streams of data in real time. It describes a form of communication between modules or components that are not configured in a one-to-one paradigm. Labeled information produced by the producer is shared over a common and scalable medium; the broker then delivers it to the consumer and informs the producer of the delivery status. One of the most notable Application Programming Interfaces (APIs) for real-time data streaming is, without a doubt, Apache Kafka (Sousa, R. et al., 2021).
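To make the paradigm concrete, the following toy in-memory broker illustrates the decoupling that publish/subscribe provides: producers append to a named topic and consumers subscribe to it, with no direct knowledge of each other. This is only a didactic sketch; real Kafka adds partitioning, persisted offsets, consumer groups, and replication.

```python
from collections import defaultdict

class ToyBroker:
    """Didactic in-memory publish/subscribe broker (not Kafka itself)."""
    def __init__(self):
        self.topics = defaultdict(list)       # topic -> ordered message log
        self.subscribers = defaultdict(list)  # topic -> consumer callbacks

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, message):
        self.topics[topic].append(message)    # append-only log keeps ordering
        for deliver in self.subscribers[topic]:
            deliver(message)                  # broker pushes to each consumer
        return len(self.topics[topic]) - 1    # offset-like position in the log

received = []
broker = ToyBroker()
broker.subscribe("hl7-adt", received.append)          # consumer side
offset = broker.publish("hl7-adt", '{"resourceType": "Patient"}')  # producer side
print(offset, received)  # -> 0 ['{"resourceType": "Patient"}']
```

Note that the producer only learns an offset (the delivery status), never the identity of the consumers, which is the abstraction the case study exploits.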
This case study intends to verify whether it is possible to use the Apache Kafka broker and its adjacent concepts in the exchange of clinical information under the HL7 FHIR standard; that constitutes the main goal and the first research question. Given that the original message is in HL7 V2 format and that FHIR uses the JavaScript Object Notation (JSON) format to exchange information, a mapping technique is necessary. This raises a second research question: how can we bridge the HL7 V2 and FHIR standards while maintaining semantic interoperability? Dealing with clinical information also requires a strong and stable communication protocol and architecture, which is another objective of the present case study: to create an architecture that modernizes the way information is currently exchanged in most healthcare facilities while maintaining the existing levels of efficiency. A final research question follows: how can a system like this be implemented with FHIR when the industry still uses V2 and will not change soon?
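The V2-to-FHIR mapping question can be sketched in Python, the language the study uses for this conversion. The function below is a hypothetical, partial mapping of a PID segment to a FHIR Patient resource; the field positions follow the HL7 V2 PID segment, but the segment content and the set of mapped fields are illustrative only.

```python
import json

def v2_pid_to_fhir_patient(v2_message: str) -> dict:
    """Map the PID segment of an HL7 V2 message to a partial FHIR Patient.
    Illustrative sketch: only identifier, name, and birth date are mapped."""
    segments = {s.split("|")[0]: s.split("|") for s in v2_message.split("\r")}
    pid = segments["PID"]
    family, _, given = pid[5].partition("^")     # PID-5: family^given
    dob = pid[7]                                 # PID-7: YYYYMMDD
    return {
        "resourceType": "Patient",
        "identifier": [{"value": pid[3]}],       # PID-3: patient identifier
        "name": [{"family": family, "given": [given]}],
        "birthDate": f"{dob[:4]}-{dob[4:6]}-{dob[6:8]}",  # FHIR date format
    }

msg = ("MSH|^~\\&|SENDER|FAC|RECEIVER|FAC|20240101||ADT^A01|MSG001|P|2.4\r"
       "PID|1||12345||Doe^John||19800101|M")
print(json.dumps(v2_pid_to_fhir_patient(msg)))
```

The resulting JSON object is what would be published to a Kafka topic; preserving semantics across the mapping (not just the syntax) is the harder part of the research question.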
The innovation of the present case study is to introduce a big data technology (Apache Kafka) to an area that is long overdue for an update: leaving plain-text sockets and Representational State Transfer (REST) endpoints behind and embracing an innovative way to share information, a more abstract technology (leaving the one-to-one configuration) that is more focused on semantic interoperability and less on the syntactic one.
The present document is organized as follows. The introduction is followed by a background section explaining key concepts such as big data, interoperability, and HL7, and presenting the state of the art of the field. The case presentation section states the main research questions and subjects the case study should address. The results and discussion section then contains the core of the article, followed by a brief conclusion.