Photo: Shannon Hastings
Shannon Hastings is the chief technology officer at Project Ronin, where he helps lead the organization in delivering systems for cancer patients and clinicians, including remote patient monitoring.
Ronin uses artificial intelligence, machine learning, natural language processing and electronic health records in its work to manage cancer patients. Its dashboards carry explainers on each data point that describe what an AI algorithm is, and internal alerts notify users if AI algorithms go off track.
Healthcare IT News sat down with Hastings to discuss how Ronin’s RPM for cancer patients works, the roles AI and NLP play in RPM, and how EHRs figure into this type of care delivery.
Q. In your work, what kind of care is facilitated by remote patient monitoring for cancer patients?
A. What we see is that cancer and its treatments can be very hard on patients, across the variety of regimens those patients go through, whether that is standard, more traditional care or clinical trial care.
Remote patient monitoring enables the physician to understand how the patient is doing with their treatment between scheduled appointments. So if a patient is having an adverse event and potentially heading to the emergency room, the clinician can intervene and alter care based on patient-reported symptoms or remotely monitored metrics.
Remote monitoring can come in many flavors, and what we’re seeing in cancer care is patient-reported outcomes, where the patients themselves report how they’re feeling. For example, “I’ve been nauseous today, I have a headache, my headaches are getting worse.”
The patient reports those symptoms out of band, meaning they’re not in the office and not meeting with the physician; this happens during the weeks of their treatment while they’re home. The physician can get insights into how that patient is feeling and reacting to treatment without having to have an appointment.
The physician can now intervene to get in front of adverse events, or the general symptomatic problems that can come with these difficult treatments, before they become more serious issues that might require emergency care.
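To make that concrete, here is a minimal sketch of what one out-of-band, patient-reported symptom entry might look like as data. The field names and severity scale are illustrative assumptions, not Ronin’s actual schema.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class SymptomReport:
    """One out-of-band, patient-reported symptom entry (illustrative schema only)."""
    patient_id: str
    reported_at: datetime
    symptom: str          # e.g. "nausea", "headache"
    severity: int         # e.g. a 0-10 patient-rated scale (assumed)
    worsening: bool       # patient indicates the symptom is getting worse

# Example: the kind of entry a patient might submit from home between visits
report = SymptomReport(
    patient_id="patient-123",
    reported_at=datetime.now(),
    symptom="headache",
    severity=6,
    worsening=True,
)
```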
Q. How do you use artificial intelligence to improve remote patient monitoring for cancer patients?
A. Artificial intelligence has really come a long way in the last 10 years, driven by companies like Uber and Tesla, for example, and you’re seeing the proof points of its value.
Healthcare generally moves a little slower, but you’re now starting to see artificial intelligence, mainly machine learning and natural language processing capabilities, used not necessarily to advise a clinician on what they might or should do, but to bring insights out of the data they’re capturing on patients.
For example, we see patients who have different demographics, lab values, medication regimens along with different reported symptoms, and now we can create models that look across those many factors and provide the physician with insights.
For example, this patient has this diagnosis, has been on this medication, has these comorbidities and has reported these symptoms.

This patient is like this other cohort of patients who have ended up in the emergency room. So now we can use artificial intelligence to provide the physician with insights that aid them in the care of the patient, giving them predictive capabilities they could arrive at on their own, except that the amount of time and data they would have to dig through to do so is unreasonable in their day-to-day job.
Using AI to bring these predictive insights in front of the clinician as a decision support tool helps them make decisions that are effectively backed by more data.
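As a rough illustration of the kind of cohort-based risk prediction Hastings describes, the sketch below trains a simple classifier on tabular patient features and surfaces a risk score. The feature names, the synthetic label, and the model choice are assumptions for illustration, not a description of Ronin’s models.

```python
# Illustrative sketch: predicting emergency-department risk from tabular
# patient features (demographics, labs, medications, reported symptoms).
# Feature names, labels and model choice are assumptions for illustration only.
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500
data = pd.DataFrame({
    "age": rng.integers(30, 85, n),
    "hemoglobin": rng.normal(12.0, 2.0, n),          # example lab value
    "on_regimen_a": rng.integers(0, 2, n),           # example medication flag
    "reported_nausea_severity": rng.integers(0, 11, n),
})
# Synthetic label standing in for "visited the emergency department within 30 days"
ed_visit = (data["reported_nausea_severity"]
            + (data["hemoglobin"] < 10) * 5
            + rng.normal(0, 2, n)) > 8

X_train, X_test, y_train, y_test = train_test_split(data, ed_visit, random_state=0)
model = GradientBoostingClassifier().fit(X_train, y_train)

# Risk scores like these could surface as decision-support insights for a clinician
risk = model.predict_proba(X_test)[:, 1]
print(f"Mean predicted ED risk in the test set: {risk.mean():.2f}")
```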
Q. Where does natural language processing fit into the whole process?
A. Health data is extremely complex. It’s structured in some sense but very poorly structured in another sense, and it’s evolved over time.
One thing we still find, no matter how much effort we put into standardizing the structure of the data, is that EHR vendors still have differences in how the data is shared. Also, the percentage of data in the EHR that is actually plain text is astonishing; we have seen publications that put that number above 80%.
Using natural language processing to extract features out of that data, and even to structure it so it can be used in machine learning models, is critical to obtaining information from it. Natural language processing is really a necessary step in bringing structure and meaning to the data we have in the medical record.
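As a toy illustration of turning free-text notes into structured features, the sketch below pulls simple symptom mentions out of a note with pattern matching. Production clinical NLP relies on trained models and much richer vocabularies; the terms and note text here are illustrative assumptions.

```python
# Toy illustration only: extracting structured symptom features from free-text
# notes with simple pattern matching. The vocabulary below is illustrative.
import re

SYMPTOM_TERMS = {
    "nausea": r"\bnause(a|ous|ated)\b",
    "headache": r"\bheadaches?\b",
    "fatigue": r"\bfatigued?\b",
}

def extract_symptom_features(note: str) -> dict:
    """Return a {symptom: present} feature dict usable by a downstream model."""
    text = note.lower()
    return {name: bool(re.search(pattern, text))
            for name, pattern in SYMPTOM_TERMS.items()}

note = "Patient reports being nauseous today; headaches are getting worse."
print(extract_symptom_features(note))
# {'nausea': True, 'headache': True, 'fatigue': False}
```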
Q. How does all this technology work with electronic health records?
A. There are some real challenges in integrating with the EHR in a universal way, and what we’ve seen in the last 10 years is a really big push by EHR and third-party vendors toward embracing modern, easier-to-integrate approaches built on standards, combined with a marketplace style of delivering the experience.
The EHR doesn’t have every feature or support every flow that a clinician might want to use and so these third-party, value-added vendors are bringing technology solutions in front of the physician. A few standards have evolved to help in this integration, primarily FHIR and SMART on FHIR. These enable a much simpler way to integrate and interact with an EHR.
From an integration perspective, you used to have to work with IT to establish VPN connections and HL7 feeds, which made the integration process very slow and expensive. Now, with FHIR and this marketplace concept, it’s simply a matter of exposing standard APIs that anyone with an engineering background can understand and integrate with, providing value-added services much more quickly with a much lower bar of entry.
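To show how simple a standards-based read can be, the sketch below queries a FHIR server’s Observation endpoint over plain HTTPS for a patient’s vital signs. The base URL, patient ID, and token are placeholders, not any specific EHR vendor’s configuration; in practice authorization would typically come through SMART on FHIR (OAuth2).

```python
# Sketch of a standards-based FHIR read: fetch a patient's vital-sign
# Observations over a plain REST API. The base URL, patient ID and token
# are placeholders; real deployments obtain tokens via SMART on FHIR.
import requests

FHIR_BASE = "https://ehr.example.com/fhir"     # placeholder FHIR endpoint
ACCESS_TOKEN = "example-token"                 # placeholder bearer token

def fetch_vital_signs(patient_id: str) -> list[dict]:
    response = requests.get(
        f"{FHIR_BASE}/Observation",
        params={"patient": patient_id, "category": "vital-signs"},
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}",
                 "Accept": "application/fhir+json"},
        timeout=30,
    )
    response.raise_for_status()
    bundle = response.json()                   # a FHIR Bundle resource
    return [entry["resource"] for entry in bundle.get("entry", [])]

# observations = fetch_vital_signs("patient-123")
```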
From an engineering perspective, and in a standard way, we can go to EHR vendor A and provide our third-party capabilities, then go over to EHR vendor B, a completely different vendor, and do the same thing, and the lift to support both of those EHR vendors is much lower than it used to be.
We now can embed our user experience right into the EHR, regardless of vendor, and these third-party experiences are such that the clinician does not have to have two or three screens open, which has been a standard complaint across the board for the last decade.
EHRs are embracing third-party, value-added capabilities embedded right in the EHR, right next to the workflow the physician is used to. That has opened up a lot of capabilities and enabled bringing AI, in the forms of machine learning and natural language processing, directly into the hands of the physician.
Twitter: @SiwickiHealthIT
Email the writer: [email protected]
Healthcare IT News is a HIMSS Media publication.