In this article, originally published on GovDataDownload, Kim Garriott, Chief Innovation Officer, Healthcare at NetApp, discusses data interoperability and its impact on healthcare. Garriott argues that data standardization is essential to building trust among systems, and that the COVID-19 pandemic has created a significant opportunity to advance both standardization and interoperability.
COVID-19 vaccine rollouts continue around the world, and the GovDataDownload team is looking closely at how data powers those efforts. Data agility, accessibility, and scalability all contribute to an effective vaccine deployment approach. This massive opportunity for data management technologies prompted a discussion of the lessons public health has learned during the pandemic, particularly the crucial role of data standardization and interoperability.
We connected with Kim Garriott, Chief Innovation Officer, Healthcare at NetApp, to learn more about how data interoperability affects public health, the challenges experts are facing with it right now, and why it is so critical to data standardization initiatives and the broader effort to promote trust in data.
According to Garriott, data standardization plays a pivotal part in helping data achieve its true potential. A lack of standardization fuels the ongoing challenge of data mistrust, a major obstacle to applying data in public health. It also directly undermines the artificial intelligence and machine learning algorithms on which health data experts are leaning heavily to effect change in their respective areas of research.
“AI will fulfill the promise to transform our work, but we can’t forget that it has to be taught,” Garriott explained. “This is why we must adopt and consistently apply both data and data interoperability standards. Ultimately, this will also lead us to having a greater trust in our data.”
It stands to reason that a larger pool of data to work from, made available through increased interoperability, would better inform AI and ML algorithms. In turn, that would give health experts a more robust set of information, creating a more holistic picture of the data captured. Garriott also pointed out that data standardization, coupled with the power of AI, would increase the value of legacy data by integrating it more effectively with new data, further building out that holistic picture.
“The data that already exists in our healthcare systems has strong and compelling intrinsic value, including financial value,” Garriott remarked. “So as data experts in that field, we are tasked with figuring out how to standardize it and normalize it so it can actually be used. On top of that, we need to find a cost-effective way to do that.” Garriott believes an effective way to alleviate this challenge is to standardize ontology, that is, the formal naming and definition of data elements. This approach would create a single plane from which data scientists can work and would build a bridge between new and legacy data.
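To make the idea concrete, here is a minimal sketch of ontology-based normalization in Python. It is an illustration, not NetApp's or Garriott's actual method: the legacy field names and the mapping table are hypothetical, and a production effort would map elements to a published vocabulary such as LOINC or SNOMED CT rather than a hand-built table.

```python
# Hypothetical legacy records from two systems that name the same data
# elements differently -- the classification gap described above.
legacy_records = [
    {"pt_dob": "1980-04-02", "gluc_mgdl": 95},    # system A
    {"birth_date": "1980-04-02", "glucose": 95},  # system B
]

# Toy ontology: one canonical term (with units) per data element.
FIELD_MAP = {
    "pt_dob": "birth_date",
    "birth_date": "birth_date",
    "gluc_mgdl": "glucose_mg_dl",
    "glucose": "glucose_mg_dl",
}

def normalize(record: dict) -> dict:
    """Rename every recognized field to its canonical ontology term."""
    return {FIELD_MAP[k]: v for k, v in record.items() if k in FIELD_MAP}

# Both records now share one schema -- the "single plane" for data
# scientists and downstream AI/ML code to work from.
print([normalize(r) for r in legacy_records])
```

The point is less the toy mapping than the design choice: once every field resolves to one canonical term, legacy and new data can be queried, joined, and fed to models as a single corpus.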
While Garriott admits this is a tall order, the classification chasm between legacy and new data is only going to grow. Garriott warned that AI and ML cannot be seen as a panacea for integrating the two, particularly when those algorithms are not conditioned to work from the same parameters across different data sets.
It’s understood that the challenges of making sense of these mountains of data exist in both the public and private healthcare sectors. But Garriott thinks the public sector should look to the private sector for its approach to data capture and its methods of sharing data. Data sharing is an issue that can simultaneously stem and sow data mistrust.
“One of the challenges that we have across public health is that there is a lack of trust of the information that is being shared, if it’s being shared at all,” she said. “We do not have strong mechanisms in place to share data between our own agencies within public health, let alone have the appropriate APIs or interfaces with state and private sector to effectively gather and utilize data.”
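For illustration, the sketch below shows what a standards-based interface of the kind Garriott describes can look like: reading a patient record over HL7 FHIR's REST API, a widely used healthcare interoperability standard (an assumption for this example; the interview does not name a specific standard). The base URL and patient ID are placeholders; any conformant FHIR R4 server exposes the same resource shape.

```python
import requests

FHIR_BASE = "https://example.org/fhir"  # placeholder endpoint

def get_patient(patient_id: str) -> dict:
    """Fetch a Patient resource as standard FHIR JSON."""
    resp = requests.get(
        f"{FHIR_BASE}/Patient/{patient_id}",
        headers={"Accept": "application/fhir+json"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()

# Every conformant server returns the same resource structure, so agencies
# can exchange this data without bespoke per-system parsing.
patient = get_patient("example-id")  # hypothetical record ID
print(patient["resourceType"], patient.get("birthDate"))
```

Because the resource structure is fixed by the standard rather than by each agency, the same client code works against any participating system, which is precisely what bespoke, per-agency interfaces fail to provide.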
This is where data standardization and interoperability, powered by a comprehensive data sharing platform, come into play. Garriott explained that establishing such a platform, along with a set of standard best practices for operating it, will promote both stronger trust in the data and a more transparent, powerful data set at the disposal of medical research professionals.
“It’s a really exciting time as people are working to solve for these problems,” Garriott concluded. “They just need to have the right tools in their toolbelt to get the job done. At the most basic level, one of those tools includes interoperability via data standardization.”
This article was originally published on GovDataDownload on February 17, 2021.