Stroud Austin M, Minteer Sarah A, Zhu Xuan, Ridgeway Jennifer L, Miller Jennifer E, Barry Barbara A
Biomedical Ethics Program, Mayo Clinic, Rochester, Minnesota, United States of America.
Physical Medicine and Rehabilitation Research, Mayo Clinic, Rochester, Minnesota, United States of America.
PLOS Digit Health. 2025 Apr 21;4(4):e0000826. doi: 10.1371/journal.pdig.0000826. eCollection 2025 Apr.
As health systems incorporate artificial intelligence (AI) into various aspects of patient care, there is growing interest in understanding how to ensure transparent and trustworthy implementation. However, little attention has been given to what information patients need about these technologies to promote transparency of their use. We conducted three asynchronous online focus groups with 42 patients across the United States discussing perspectives on their information needs for trust and uptake of AI, focusing on its use in cardiovascular care. Data were analyzed using a rapid content analysis approach. Our results suggest that patients have a set of core information needs relevant to calibrating trust, including specific factors pertaining to the AI tool, its oversight, and the healthcare experience. Patients also shared perspectives on information delivery, disclosure, consent, and physician use of AI. Identifying patient information needs is a critical starting point for calibrating trust in healthcare AI systems and designing strategies for information delivery. These findings highlight the importance of patient-centered engagement when developing AI model documentation and when communicating and providing information about these technologies in clinical encounters.