Intersection of Interoperability, Data Quality and Medical Record Remediation
The Velocity Interoperability Blog and Velocity Interoperability Podcast are sponsored by Velocity Health Informatics. Velocity provides both data quality and data integration as-a-service offerings to ensure that healthcare providers access the correct patient record with the right data for each patient they serve. See the introductory blog post.
TODAY'S GUEST Hal Gilreath, Executive Vice President of Client Relations for Velocity Health Informatics
Hal joined us to set the stage for the Velocity Interoperability Podcast and to initiate the discussion at the Intersection of Interoperability, Data Quality and Medical Record Remediation. Specifically, we discuss the following with Hal:
- What are some of the data quality problems healthcare providers are having?
- What causes these problems?
- A recent guest told me that providers’ EHRs typically contain 8-12% duplicates, is that what you are finding?
- I understand that Velocity has two main offerings to help providers with these problems, one helps with the data quality of medical records and the other with the quality of their integrated (clinical, claims, socio-demographic?) data. Will you describe the medical record remediation offering first?
- You’re one of the few firms that is providing both of these services. Why do you see this as important?
- Data quality is an ongoing effort, and I know that's why you've established a platform approach to monitor it as an ongoing service. What are some examples of the results your customers are seeing from your services?
- What's next for Velocity? What are you working on that you plan to bring to your customers as we close 2016 and head into 2017?
Visit Velocity on the Web and follow them on Twitter, LinkedIn and Facebook!
About Velocity
Velocity Health Informatics Inc., previously Health eGRC, was founded in 2011. We are a woman-owned company headquartered in Kansas City, Missouri. Our solutions and services are designed to meet the needs of Accountable Care Organizations, Hospital Systems, Health Information Organizations and Insurers. Velocity is one of the few firms that provides both medical record remediation and data integration/data quality as-a-service offerings for healthcare providers that are overwhelmed by the volume of medical record duplicates and errors and the issues those errors create upstream and downstream in their data integration.
About Hal Gilreath
Hal Gilreath is a partner at Velocity Health Informatics, where he is responsible for business development, partner relationships, marketing and client services. He leads Velocity's outbound communications and works with healthcare organizations' leadership and partners to solve complex clinical and business issues. His focus is on delivering high-value services that improve patient care, quality and costs through patient record remediation, data quality and integration. His operational experience includes leading internal and external organizations ranging from small strategy teams to large-scale operations and implementations. Velocity services are focused on delivering high-quality data and accurate records to support new care models and strategies, interoperability, physician integration, optimization through healthcare analytics, and community care model and population health integration.
Hal was formerly Vice President and an executive team member at Sandlot Solutions, where he led the largest team at Sandlot in deploying HIE, analytics and notification tools for clients. Prior to Sandlot, Hal led the Americas Cisco Healthcare Consulting Practice, where his team delivered strategic consulting engagements focused on extending care through telehealth, new business model creation and improving the patient experience. Prior to that role he was a healthcare lead for Cisco Healthcare Solutions, where he improved the delivery of care through innovative solutions. Earlier, Hal was an executive at Winn-Dixie and a healthcare consulting practice leader in the TSC PACS and digital imaging practice, the PwC eHealth practice, and the First Consulting Group Advanced Technology Services practice. Hal is a former Naval Officer who flew EA-3Bs for the U.S. Navy as a mission commander and was a program manager for Naval Aviation training programs. He has published articles in leading healthcare and technology journals and has spoken at national healthcare industry meetings.
Hal holds a B.S. in Operations Research/General Engineering from the U.S. Naval Academy and an M.S. in Health Systems/Industrial Engineering from Georgia Tech.
Subscribe NOW to intrepidHEALTHCARE on iTunes!
Transcript
Joe Lavelle 0.54 Welcome to the Velocity Interoperability Podcast, brought to us by the gurus at Velocity. I am your host, Joe Lavelle, and I am really looking forward to another thought-provoking discussion where we further investigate data quality, interoperability and medical record remediation.
We’re going to get right to it today. We are joined by Hal Gilreath, Executive Vice President of Client Relations at Velocity Health Informatics. Hal welcome to the Velocity Interoperability Podcast!
Hal Gilreath 1.18 Thanks Joe, glad to be here today.
Joe Lavelle 1.21 Thanks for making time to be with us today! Before we start our discussion, could you take a few seconds and tell the audience about yourself and your background?
Hal Gilreath 1.26 Sure, Joe. I came into healthcare almost by chance, but I got into healthcare on the provider side, mostly working at First Consulting Group in the management consulting arena. We saw a lot of mergers and acquisitions and a lot of initial EMR, PACS, and other digital system implementations, and that was really where I got immersed in it, where utilizing healthcare clinical data and claims data came to the fore. That experience, along with my experience in the healthcare group at Cisco Systems and others, really guided me to the conclusion that we need to get clinical data to providers at the point of care, with the correct data and the correct records. That's what led me to Velocity, and looking back, all that experience has really helped me understand the importance of that, especially as we move into new healthcare paradigms with value-based care and ACOs and the pressures that are going to come in the next few years. What's imperative is to get the right information to providers.
Joe Lavelle 2.32 Will you take a couple minutes to provide our audience with a 10,000 foot overview of Velocity?
Hal Gilreath 2.38 Sure, Joe. Velocity Health Informatics was really founded upon what we call the intersection, the key point I was just talking about: we need to get the right record with the right data to providers at the point of care, when they need it. It sounds really simple, but there are firms that do interfaces and integration, and we do that. There are firms that do MDM, or master data management, and data quality, and we have built that into our solution. And then we also deliver records, and we not only do record remediation, eliminating duplicate records and other problems with the records, but we do it with machine learning tools that automate that work. When you bring it all together, we deliver it from a platform we call the Health Information as a Service platform, which allows us to deliver the right record with the right, high-quality data to a provider.
Joe Lavelle 3.34 While we are on data quality, what are some of the data quality problems that healthcare providers have?
Hal Gilreath 3.39 I think there are a couple of them, Joe. Obviously you have to develop the interfaces, so you have to connect the lab system and the other ancillaries to the EMR, and the EMRs to other systems. But what happens is that an interface is basically a transport mechanism; it can be up and running, but you can have coding problems, you can have data correction problems, you can have systems go down and come back up, and the interfaces and other systems out there today rarely check the content of the data coming across the transport. So while the data is getting into those systems, or into analytical applications, you will find that the reports are wrong and the data going into those systems is wrong, but it is still data coming in. What we focus on at Velocity is fixing those problems and alerting providers on how to fix them.
Joe Lavelle 4.38 What are some of the causes of data quality problems?
Hal Gilreath 4.44 So when you look at the record side of it, there are really a couple of problems. One is that you can go into a healthcare environment and find poor registration processes or poor record governance processes. So, in your case, they may register you as Joe Lavelle or Joseph Lavelle or J Lavelle, or spell your name wrong, and all of a sudden there are six or seven records for you. Then when a provider searches for you and pulls up a record, they may or may not have a complete view of all your medical history. You may be missing allergies, you may be missing immunizations, which leads to patient safety issues, and you may be missing other information that inhibits or delays billing, so obviously there is a direct revenue problem. So you have to look at it from a process perspective, and then you have to look at how the systems themselves are configured, so that they are uniform across the entire environment, because you could end up with duplicate records within one system or duplicate records across multiple systems. In the end, a health system is either putting its patients at risk or impacting its financials.
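As a rough illustration of how registration variants like these become separate records that later have to be linked, here is a minimal Python sketch that flags near-identical name spellings as potential duplicates. The normalization, similarity threshold, and sample names are illustrative assumptions, not Velocity's actual matching logic; a real master patient index match would also weigh date of birth, address, and other identifiers.

```python
# Hypothetical illustration only: Velocity's real matching logic is not described
# in the episode. This sketch shows how simple normalization plus a similarity
# score can flag registration variants of the same person as potential duplicates.
from difflib import SequenceMatcher

def normalize(name: str) -> str:
    """Lowercase and strip punctuation/extra spaces so trivial variants compare equal."""
    return " ".join(name.lower().replace(".", "").split())

def similarity(a: str, b: str) -> float:
    """Crude string similarity between two normalized names (0.0 to 1.0)."""
    return SequenceMatcher(None, normalize(a), normalize(b)).ratio()

registrations = ["Joe Lavelle", "Joseph Lavelle", "J. Lavelle", "Joe Lavele"]

# Pairwise comparison; production matching would block on other demographics too.
for i in range(len(registrations)):
    for j in range(i + 1, len(registrations)):
        score = similarity(registrations[i], registrations[j])
        if score > 0.8:  # arbitrary threshold for this illustration
            print(f"Possible duplicate: {registrations[i]!r} ~ {registrations[j]!r} ({score:.2f})")
```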
Joe Lavelle 5.54 A recent guest told me that providers’ EHRs typically contain 8-12% duplicates, is that what you are finding?
Hal Gilreath 6.07 Yes, we find that to be a pretty relevant benchmark, if you want to call it that. I think AHIMA, the medical records and HIM industry body, publishes that as a standard metric. When we go into a client, we find that sometimes they are at 12% or higher, and sometimes they are a little better. Sometimes our clients will also come in and say, well, we have a 0.5% duplicate rate, which is a warning signal to me that their thresholds are set so that they are either matching records that shouldn't be matched or not matching records that should be matched. You will hear the terms false positives and false negatives, and both can lead to significant problems. What if your record and my record get matched together? That's a false positive, and that type of thing happens periodically. So if you're in the 8 to 12% range, you're probably about average. What we have seen with our clients is that we can take them through a governance process, assess their registration process, and then actually remediate those records, and our goal is to get them down to a 2 to 3% duplicate rate. You're always going to have some; that's just human nature. But when you're talking about the difference in volume from, say, 3% to 8%, that's 5%, and if you have 10 million records, that adds up to a lot of records. When you look at the industry-standard cost of a duplicate being anywhere from $80 to $150, and you do 5% of 10 million times $80 to $150, all of a sudden that adds up to big money, and that's just the cost of fixing and remediating them. That's not the cost if it impacts your revenue cycle or your patient care.
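To make the arithmetic concrete, here is a back-of-the-envelope version of the calculation Hal walks through, using the figures from his example (10 million records, a duplicate rate dropping from roughly 8% to 3%, and an $80 to $150 industry-standard cost per duplicate):

```python
# Back-of-the-envelope version of the arithmetic in Hal's example.
total_records = 10_000_000
duplicate_rate_reduction = 0.08 - 0.03                      # from ~8% down to ~3%, i.e. 5 points
cost_per_duplicate_low, cost_per_duplicate_high = 80, 150   # cited industry-standard range

duplicates_avoided = total_records * duplicate_rate_reduction    # 500,000 records
low_estimate = duplicates_avoided * cost_per_duplicate_low       # $40,000,000
high_estimate = duplicates_avoided * cost_per_duplicate_high     # $75,000,000

print(f"Duplicates avoided: {duplicates_avoided:,.0f}")
print(f"Remediation cost avoided: ${low_estimate:,.0f} to ${high_estimate:,.0f}")
```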
Joe Lavelle 7.57 I understand that Velocity has two main offerings to help providers with these problems: one helps with the data quality of medical records and the other with the quality of their integrated data. Will you describe the medical record remediation offering first?
Hal Gilreath 8.13 So the Health Information as a Service platform is a platform we can ingest clinical data, claims data, and records data into. We use it as a cloud-based offering, and obviously we keep it very secure, and through that platform we fix those records. There is a data quality and integration side, and then there is a record remediation side. Let's talk a little bit about fixing the records first. We go out there and run a file, usually on our client's master patient index. That file comes over, and we have credentialed HIM staff who do an analysis of a sample from our client's environment. They segregate the duplicates and other record issues into ones we can fix with our machine learning tools and the more complex ones, which typically require manual remediation, and that's where the experience and credentials of our staff come in. So we fix those records, and now you have the right record for the right patient.
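Conceptually, the triage Hal describes, separating candidate duplicates that automated tools can merge from complex cases routed to credentialed HIM staff, might look like the sketch below. The thresholds, record identifiers, and scores are assumptions for illustration only, not Velocity's actual rules or data.

```python
# Hypothetical sketch of the described triage: candidate duplicate pairs are split
# into ones confident enough for automated merging and ones routed to HIM staff
# for manual review. Thresholds and fields are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class MatchCandidate:
    record_a: str
    record_b: str
    score: float   # produced upstream by whatever matching/ML model is in use

AUTO_MERGE_THRESHOLD = 0.95     # assumed value, not Velocity's actual setting
MANUAL_REVIEW_THRESHOLD = 0.70  # below this, treat the records as distinct patients

def triage(candidates):
    auto_merge, manual_review = [], []
    for c in candidates:
        if c.score >= AUTO_MERGE_THRESHOLD:
            auto_merge.append(c)        # safe for automated remediation
        elif c.score >= MANUAL_REVIEW_THRESHOLD:
            manual_review.append(c)     # complex case: queue for credentialed HIM staff
    return auto_merge, manual_review

auto, manual = triage([
    MatchCandidate("MRN-1001", "MRN-2044", 0.98),
    MatchCandidate("MRN-1001", "MRN-3310", 0.81),
])
print(len(auto), "auto-merge;", len(manual), "for manual review")
```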
And then on what I call the other side of our platform, we build the interfaces. We use agile development processes, and if, say, we have 40 or 50 or 60 interfaces to do, the agile process lets us keep what I call the factory going. If the lab system or the PACS system or the EMR system or the practice management system is down for a day or two, or we can't use it for that time period, we can reallocate our staff, who are cross-trained on all of these different types of interfaces. What that means to our clients is that they get their interfaces faster and in a much more cost-effective manner. Now that we have the data flowing in and going to the right records, we implement data quality monitoring on those interfaces, so we are not only checking that the interfaces are running, but that the right data is coming across them. So if you have a bad code, like a LOINC code or an ICD-10 code, it will highlight that, we can set up a notification for you, and it will pull that message out and queue it up into a work list, allowing your interface analyst to go back to the source system and fix the problem, whether it is a coding problem, a configuration problem, or a process problem.
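Content-level monitoring of an interface feed, as opposed to only checking that the transport is up, might look conceptually like this minimal sketch. The code set, message shape, and work-list mechanics are illustrative assumptions rather than Velocity's implementation; the point is that a message with an unrecognized code is flagged and queued for an analyst instead of flowing through silently.

```python
# Illustrative sketch (not Velocity's implementation) of content-level monitoring
# on an interface feed: each message's result code is checked against a known code
# set, and failures are queued to a work list for an interface analyst instead of
# being silently "fixed" in the integration engine.
VALID_LOINC_CODES = {"2345-7", "718-7", "4548-4"}   # tiny sample set for illustration

work_list = []   # stands in for the analyst work queue / notification channel

def check_message(message: dict) -> bool:
    """Return True if the message's result code looks valid; otherwise queue it."""
    code = message.get("loinc_code")
    if code in VALID_LOINC_CODES:
        return True
    work_list.append({"message_id": message.get("id"),
                      "problem": f"unknown LOINC code {code!r}"})
    return False

feed = [
    {"id": "msg-001", "loinc_code": "2345-7"},
    {"id": "msg-002", "loinc_code": "9999-9"},   # bad code: should be flagged
]
for msg in feed:
    check_message(msg)

print(work_list)   # one queued entry for msg-002 with the unknown code
```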
A lot of interfaces try to take bad data and fix it in the integration engine, and we just think that's fundamentally the wrong place to fix it; you need to fix it at the source system. I had a previous experience where we had about 30 interfaces running from a client and we didn't have data quality monitoring turned on or implemented. We got all this data in, and about a week later the client came back and said, we just found a glitch in our system and we have been sending you bad data for about a week; can you just go into your systems, delete all that data, and we will redo it? We thought okay, and it was really thousands of hours later before we were able to readjust that data. So it is imperative to put this data quality monitoring in place if you want to conduct patient care in the most cost-effective manner and make sure your financials are strong, and to do it with a dashboard. The monitoring doesn't just send notifications out; we also have a visualization tool, really basic dashboards, that let you monitor from the organization level to the facility level to the system level and then all the way down to the segment level. As an executive you can see if your volume of traffic is correct, then drill down and check whether there is a problem with one facility, all the way down to whether there is a problem with a system, and then assign it to be fixed on a system basis or as an individual interface problem. I think those types of multi-segment views allow you to identify where the problem lies in the most cost-effective manner.
Joe Lavelle 13.23 You're one of the few firms providing both a medical record data quality solution and a data integration offering. Why do you see this as important?
Hal Gilreath 13.32 Well, I think you highlighted it at the very beginning of the podcast, Joe: the intersection of all of those is critical. If you have the right record, that's one component, but if you have the right record with the wrong data, that's not going to deliver what you need, either for billing or for patient care. On the other side of it, if I deliver the right data to the wrong record, I am not treating my patients correctly. So you really need to bring that intersection together, and do it in a very cost-effective and timely manner.
Joe Lavelle 14.05 What are some examples of the results your customers are seeing from your services?
Hal Gilreath 14.18 Well, I think you hit the nail on the head there, Joe, in the sense that just because you implement an interface or implement record remediation, it still takes constant vigilance, because while we would like to think that once you configure these different systems they will stay that way, data corrections and record corrections happen all the time, for various reasons. So the platform allows you to run that data and those records through it and monitor them on an ongoing basis. Our clients see that when we have that discussion with them. They say, well, our analytical reports are running fine, we are managing the financials for our ACO, and then they see all these glitches and anomalies in the data in the reports. They find out that the reports are running fine, but there was a data quality problem behind them: something had changed, and all of a sudden bad data was coming in. That tells you, okay, wow, we had no means to know that except that we caught it at the output, in the reports, not through monitoring of the data quality streams or the record streams. Once our clients see this and start to use it, they realize that going upstream and fixing it at the source system is imperative to being able to manage your business effectively.
Joe Lavelle 15.41 What's next for Velocity? What are you working on that you plan to bring to your customers as we close 2016 and head into 2017?
Hal Gilreath 15.49 I think there are a couple of things, Joe. One is how do we bring more automation into this so we can do it in a more timely fashion? It's great that we have credentialed staff who are experienced in everything, but we really need to focus their time and effort on the more complex issues, whether that's a data quality issue like fixing a configuration or fixing a record. For those records where you basically have to call up and verify who the patient is, software and automation tools can't do that today, but we are implementing machine learning tools and integrating them into our stack. We are also adding partner applications to our stack around referral management and care coordination, because we are going to deliver the data and the records to those applications. I would envision that our analytics partner ecosystem will grow in that regard, so we will be able to go into a client and the client will know that their analytics vendor is getting the right data and the right records. I can also see us working internally on how we get further upstream and solve the problem in a more timely fashion, before you get to a duplicate or a bad data quality problem. So that type of development is underway. We are also looking at how we develop APIs, making the data quality algorithms and some of the machine learning tools something we can plug into a client's environment so they can use them themselves and build their own reports. I think that's the type of development we are going to have in the future, and I look forward to it, because ultimately what we are doing is constantly improving and constantly bringing greater value to our clients, so they will see fewer data quality and record problems and be able to improve not only the quality of care but the cost of their care.
Joe Lavelle 17.41 Hal, it was so great to chat with you today. Thanks for stopping by and sharing your wisdom with our audience!
Hal Gilreath 17.48 I appreciate it, Joe. It's been great, and thank you very much.
Joe Lavelle 17.55 It’s our pleasure as well.
Before we wrap this conversation, we'd like to thank the great folks at Velocity for sponsoring the show once again. Please go to http://www.velocityhealthinformatics.com/ to find out more about the innovative ways they are solving the data quality and interoperability needs of their clients. And now, on behalf of our guest, Hal Gilreath, I am Joe Lavelle, and we'll be back soon with another informative episode of the Velocity Interoperability Podcast. See you then.