Erin,
This is a great question. I think we are still so new to these types of situations that many people/organizations haven't quite figured out how to address this. Kudos for being proactive on this.
I think one thing you can do is to frame the use in a positive way. Patients may feel uneasy if they perceive AI documentation as surveillance rather than a tool designed to improve their care. I would frame AI not as a recorder but as an assistant that helps providers focus more on the patient rather than typing on a computer. Maybe you could try signage that says something like "Your provider uses (or may use) a secure AI assistant to help document your conversation accurately. This allows them to focus more on you and less on the computer. Let us know if you have any questions." You could post that on a poster/card in exam rooms or maybe by check-in. You could even have a short explainer video on screens in the waiting room if you wanted.
To help ease patient concerns, maybe you could build in an opt-out process during check-in or maybe on the patient portal.
You might also encourage any provider who is using this option to have a short discussion at the start of the visit to show transparency, build trust, and help patients understand why it is being used. Something like, "I use an AI tool to help with note-taking so I can focus on you rather than the computer. No one outside of our office will have access to your information. Let me know if you have any questions."
You may also want to develop some talking points for your staff in case they get questions. Assure patients that the tool lets providers focus on them rather than on taking notes, and that the AI assistant helps make the notes complete and accurate. Also help them understand that the program keeps this information secure and meets strict healthcare privacy regulations (after you have verified this is true). Finally, let them know that it isn't replacing providers' expertise or decision-making; it is just taking notes.
Hope that helps. I'd certainly love to hear from others if anyone is doing this differently.
------------------------------
Tracy Mehan
Director of Research Translation & Communication
Nationwide Children's Hospital
Columbus, OH United States
tracy.mehan@Nationwidechildrens.org
------------------------------
Original Message:
Sent: 02-26-2025 06:56 AM
From: Erin Moore
Subject: AI Virtual Assistants
Hello! A number of providers in our health center use Nuance DAX and artificial intelligence to record and document patient visits. A number of patients have expressed concern about having their patient visits recorded. I'm wondering what other people are doing to inform patients about the use of AI without causing unnecessary alarm. We don't want to add more for our Patient Services Representatives and MAs to have to discuss with each patient, but it seems like signage in the exam rooms might feel like notification of surveillance.
Thank you!
------------------------------
Erin Moore
Content Writer
Open Door Health Services
IN United States
eemoore@opendoorhs.org
------------------------------