Healthcare Communications, AI Policies, and You 

Health Literacy Specialists Should Be Part of AI Policy Planning

By Tracy Mehan, AI Subject Matter Expert 


I was recently asked to give a talk about what healthcare organizations should consider when developing policies around employee use of programs that utilize artificial intelligence (AI). To prepare, I researched who is typically involved in creating these policies and what factors organizations weigh as they develop them.

What I found is that, typically, the people in the room where decisions are being made are members of the C-suite and IT. That’s it. Who does this leave out? The frontline doctors, nurses, marketing departments, DEI chairs, education and communication staff, accessibility advocates, and health literacy specialists who work with our patients every day.

We need ALL of these unique perspectives at the table to ensure the programs we use are being ethically developed and that any content generated by AI technology can be understood and used by the people our organizations serve.

At the end of the talk, I encouraged everyone to take a more active role in discussions around AI use and policies within their organizations.

Health Literacy Must Have a Seat at the Table

Shortly after giving my presentation, I decided to follow my own advice and find out what was happening at my institution. It turns out there is an AI committee, and it is open to new members, so I asked to be included.

At the first meeting I attended, we heard from a speaker from a prominent software company that many healthcare institutions, including my own, use. The speaker had been invited to present on exciting new advances the company’s platform would be offering in the coming months — all powered by AI. Let me tell you — there’s some exciting stuff coming down the pike that has the potential to completely change our workflows!

The speaker went on to explain how the company is pilot-testing an add-on feature to their software that uses AI to answer questions that come in through patient portals. The software drafts an answer to the patient’s question and then notifies the healthcare provider, who is required to review the answer before it is sent to the patient. This dramatically cuts down on response time, because the doctor or nurse only has to review an answer rather than draft one from scratch every time.

The company trained the AI program by working with doctors and nurses in 3 clinics who reviewed answers and rated their accuracy. This feedback was then used to refine and improve the program. The company is now testing the program in more than 75 locations.

At this point, I raised my hand and asked if the company had included any health literacy specialists in the testing to make sure the answers were not only technically accurate, but also written in a way that enabled patients to understand and act on them. The presenter looked dumbfounded and admitted, “You are the first person to ask me that question. No one on our team or at any of our sites has ever asked to include a health literacy specialist. It is not something we have ever thought about or knew to consider.”

Wow!

How to Represent Health Literacy in AI Policy

When I shared this story with a professional acquaintance who is on the development side of AI programs, she said she wasn’t surprised and guessed that most people on the tech side of AI aren’t even aware that health literacy specialists exist. 

This is why we need health literacy specialists at the table. We must raise our hands, and our voices, to ensure the AI programs our organizations use are designed and tested with both medical accuracy AND health literacy in mind.

Let’s start advocating for being part of these conversations. I encourage all of us to find opportunities to get involved today.

Here are 4 ways you can lend your expertise to your organization’s AI adoption process:

  1. Start learning about AI today by reading the research, attending presentations, and following thought leaders on social media.
  2. Talk about AI. Ask everyone what they think about it, how they are using it, and what they know about it.
  3. Ask if your organization has an AI committee and, if so, how you can get involved in it. If you’re not sure whom to ask, start with the IT Department. 
  4. If there’s no AI committee at your institution, consider starting one. Make sure to include people from all parts of your organization who could be impacted — not just IT and data security staff. Look back to the start of this blog for some ideas about whom to invite.

About the Author

Tracy Mehan


Tracy Mehan is the director of Research Translation and Communication for the Center for Injury Research and Policy, bringing close to 30 years of experience in communications, health education, marketing, and research to her role. Her professional interests center on the effective communication of injury prevention messages and teaching others how to translate their research for non-scientific audiences. In addition, she often serves as an injury prevention expert for mainstream media. Tracy serves as a subject matter expert on AI for IHA.




#ArtificialIntelligence
#IHA Article
#IHABlog