Healthcare organizations are seeking real-time information through a single, simplified view of patient data. However, organizations integrating disparate health data management systems can face interoperability issues and challenges with centralized infrastructure.
WhamTech CTO and Senior Vice President Gavin Robertson told HITInfrastructure.com that multiple data sources, interoperability, and EHRs are the main challenges with integrating health IT systems.
Data integration is typically the first challenge healthcare organizations face due to the incompatibility of multiple health IT infrastructure systems.
“Multiple disparate data sources are one of healthcare’s biggest problems. Even with Epic or Cerner, organizations usually just buy modules, they don’t buy the entire solution, leaving them with an integration issue,” said Robertson. “Even organizations like Kaiser Permanente who use a single Epic-like approach, still have a lot of internal systems. Organizations have network providers to tie in and external sources that they want to integrate with their internal clinical data.”
Interoperability and data quality are also challenges for healthcare organizations seeking a single-view approach.
“The typical approach is application to application which is normally how HIEs and HL7 messaging systems work,” Robertson explained. “Clinicians need access to all relevant patient data and they need to get away from application to application transfer.”
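The application-to-application pattern Robertson mentions typically involves systems exchanging pipe-delimited HL7 v2 messages point to point, with each receiver parsing fields by position. As a rough, hypothetical sketch (the sample message, application names, and identifiers below are invented), the receiving side has to know the wire format in detail:

```python
# Minimal sketch of parsing an HL7 v2 message with plain string handling.
# Segments are separated by carriage returns and fields by '|'; the sample
# message and all names/identifiers in it are invented for illustration.

SAMPLE_ADT = "\r".join([
    "MSH|^~\\&|SENDING_APP|SENDING_FAC|RECEIVING_APP|RECEIVING_FAC|"
    "202401011200||ADT^A01|MSG00001|P|2.3",
    "PID|1||12345^^^HOSP^MR||DOE^JANE||19800101|F",
])

def parse_segments(message: str) -> dict:
    """Index each segment's fields by its three-letter segment ID."""
    segments = {}
    for raw in message.split("\r"):
        fields = raw.split("|")
        segments[fields[0]] = fields
    return segments

segments = parse_segments(SAMPLE_ADT)
pid = segments["PID"]
# PID-5 is the patient name field (family^given)
family, given = pid[5].split("^")[:2]
print(family, given)  # DOE JANE
```

Every pair of systems exchanging messages like this needs its own mapping logic, which is why Robertson argues for moving away from point-to-point transfer.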
Many interoperability issues stem from data quality, according to Robertson. Replacing humans with IT systems can sometimes lead to minor errors going undetected.
A clinician recording a patient’s visit on a clipboard is more likely to recognize a minor mistake than a clinician entering data into an EHR. People can compensate for typos and misplaced data, but machines cannot.
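To illustrate the point: an exact-match lookup treats a single-character typo as an entirely different value, while a human reader would reconcile the two records. A small sketch using Python's standard-library difflib, as one possible way a system could at least flag near-matches (the patient names are invented):

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Ratio in [0, 1] of how closely two strings match."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

# A human reviewing a chart would read these as the same patient;
# an exact-match lookup treats them as different records.
recorded = "Jonh Smith"   # typo at intake
on_file  = "John Smith"

print(recorded == on_file)                       # False: exact match misses it
print(round(similarity(recorded, on_file), 2))   # high ratio flags a likely match
```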
EHRs can also cause trouble for clinicians when systems are not integrated into a single patient view.
“Clinicians are overwhelmed with data so you put an EHR in place,” said Robertson. “Instead of having the latest information on a clipboard, they now have access to years of information. However, it’s taking twice as long for a doctor to interact with a patient than it did ten years ago.”
There is a fear that clinicians will become too overwhelmed with unorganized data and will be unable to properly diagnose a patient.
Many healthcare organizations are struggling because they have been piecemealing their data management tools for years without addressing fundamental data issues.
The fundamental data issues include data discovery, data profiling, HIPAA classifications, data security (e.g., encryption), data quality standards, access to systems, integration, and interoperability.
Robertson added that organizations tend to buy data management solutions with only certain components in mind.
“Healthcare organizations tend to buy master data management systems or data governance systems, then they’re left with how to piece all of the systems together,” Robertson explained. “By having these separate systems, you introduce another layer of integration requirements and latency. You’re constantly copying data around to address some of the basic data management issues.”
Organizations are also not addressing fundamental data management issues at the source.
“Organizations need to address fundamental data management issues before any information leaves a source system regardless of where it ends up,” Robertson advised. “This solves a lot of issues that get introduced later on when entities try to address them retrospectively instead of upfront.”
“Part of what’s been missing is that healthcare organizations are perfectly willing to make multiple copies of operational data, addressing some of the fundamental data issues piecemeal instead of trying to deal with it upfront and as a whole.”
For example, IDC has estimated that most healthcare organizations keep between 10 and 120 copies of operational data. The more copies that exist, the more complex data management becomes to deploy.
Data management only becomes more difficult as new technology, such as the Internet of Things (IoT), is introduced into health IT infrastructure. IoT technology makes real-time patient monitoring possible as long as the data can be managed properly.
The IoT significantly impacts the future of patient monitoring because it provides additional information and monitors patients in real time. Clinicians can interact with patients in real time using the data collected by IoT devices, improving patient care.
“The problem is that the IoT device sits on someone’s wrist somewhere or on a monitor on someone’s chest and it becomes another disparate data source for organizations to integrate with the rest of the IT infrastructure,” said Robertson. “It’s a critical part of any dataflow into a patient record.”
“If organizations want a holistic view of a patient, they need to account for the IoT device data in real-time,” he continued. “Clinicians want to impact the patient’s ongoing healthcare. They don’t want to wait until two weeks later to finalize the data, they want to deal with it there and then.”
Real-time data allows clinicians to collect, analyze, and act on a patient’s condition during the initial interaction. Real-time environments lower costs because they avoid bulk processing and overnight loading into data warehouses.
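A minimal sketch of the difference: instead of queuing device readings for an overnight batch load, a real-time pipeline evaluates each reading as it arrives. The class, window size, and alert threshold below are illustrative assumptions, not clinical guidance:

```python
import statistics
from collections import deque

# Hypothetical sketch: evaluate each IoT vital-sign reading as it arrives,
# rather than queuing everything for an overnight batch load into a
# data warehouse. Threshold and window size are invented for illustration.

class HeartRateMonitor:
    def __init__(self, window=10, limit=120):
        self.readings = deque(maxlen=window)  # rolling window of recent readings
        self.limit = limit

    def ingest(self, bpm):
        """Process one reading in real time; return an alert string if needed."""
        self.readings.append(bpm)
        if bpm > self.limit:
            return f"ALERT: heart rate {bpm} bpm exceeds {self.limit}"
        return None

    def rolling_average(self):
        return statistics.mean(self.readings)

monitor = HeartRateMonitor()
for bpm in [72, 75, 78, 130, 74]:
    alert = monitor.ingest(bpm)
    if alert:
        print(alert)  # fires immediately on the out-of-range reading
```

The out-of-range reading triggers an alert the moment it is ingested, which is the "deal with it there and then" behavior Robertson describes.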
Real-time environments also help with data governance, making sure the information entered is correct. If organizations can address data governance upfront then it solves a lot of problems concerning data quality.
But how do healthcare organizations move towards utilizing IoT devices and being able to integrate real-time data with patient records?
Robertson suggested healthcare organizations consider a decentralized virtualization approach to data management to deal with interoperability issues and future infrastructure digitization.
“A lot of legacy systems are centralized which introduces a problem,” said Robertson. “In many cases organizations cannot centralize information, which leads to federation.”
Robertson recommended that organizations consider a solution that distributes query processing to adaptors that connect to backend data sources.
“Virtualization means that as adaptors are built, data management fundamentals are addressed at the edge next to the data source,” Robertson explained. “Anytime a user needs data from that data source, it’s already addressed the data management fundamentals. It’s like a distributed parallel processing network that leaves data in sources.”
“Only when the information is needed at a companywide, department, or individual level is it retrieved from the source systems and delivered to the end user with all the data management fundamentals addressed.”
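The approach Robertson describes might be sketched as follows: each adaptor sits next to its source, applies fundamentals such as normalization and masking at the edge, and a federated query fans out to the adaptors only when a user needs the data. The source names, record fields, and masking rule below are invented for illustration, not WhamTech's actual implementation:

```python
# Hypothetical sketch of a data virtualization pattern: adaptors address
# data-management fundamentals next to each source, and data leaves the
# sources only when a federated query requests it. All names and fields
# here are invented.

class SourceAdaptor:
    """Wraps one backend source; data stays in place until queried."""

    def __init__(self, name, records):
        self.name = name
        self._records = records  # stands in for a live source system

    def query(self, patient_id):
        # Fundamentals addressed at the edge, before data leaves the source:
        return [self._normalize(r) for r in self._records
                if r["patient_id"] == patient_id]

    def _normalize(self, record):
        out = dict(record, source=self.name)
        out["name"] = out["name"].strip().title()   # data quality cleanup
        out["ssn"] = "***-**-" + out["ssn"][-4:]    # HIPAA-style masking
        return out

def federated_query(adaptors, patient_id):
    """Distribute the query across adaptors; merge into one patient view."""
    results = []
    for adaptor in adaptors:
        results.extend(adaptor.query(patient_id))
    return results

ehr = SourceAdaptor("ehr", [
    {"patient_id": "p1", "name": "  jane doe ", "ssn": "123-45-6789"}])
labs = SourceAdaptor("labs", [
    {"patient_id": "p1", "name": "JANE DOE", "ssn": "123-45-6789"}])

for rec in federated_query([ehr, labs], "p1"):
    print(rec["source"], rec["name"], rec["ssn"])
```

Data stays in the source systems and only query results move, which avoids the repeated copying of operational data that Robertson criticizes.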
A virtualized approach to data management naturally lends itself to a real-time environment, and healthcare organizations need to move toward real time to take full advantage of IoT technology. Real-time patient care supported by virtualized, single-view data management will save organizations money and help clinicians make more accurate diagnoses.