Hartsock Iryna, Araujo Cyrillo, Folio Les, Rasool Ghulam
Department of Machine Learning, Moffitt Cancer Center and Research Institute, Tampa, FL, USA.
Department of Diagnostic Imaging and Interventional Radiology, Moffitt Cancer Center and Research Institute, Tampa, FL, USA.
J Imaging Inform Med. 2025 Apr 21. doi: 10.1007/s10278-025-01510-w.
Radiology reports are often lengthy and unstructured, posing challenges for referring physicians to quickly identify critical imaging findings while increasing the risk of missed information. This retrospective study aimed to enhance radiology reports by making them concise and well-structured, with findings organized by relevant organs. To achieve this, we utilized private large language models (LLMs) deployed locally within our institution's firewall, ensuring data security and minimizing computational costs. Using a dataset of 814 radiology reports from seven board-certified body radiologists at [-blinded for review-], we tested five prompting strategies within the LangChain framework. After evaluating several models, the Mixtral LLM demonstrated superior adherence to formatting requirements compared to alternatives such as Llama. The optimal strategy involved condensing reports first and then applying structured formatting based on specific instructions, reducing verbosity while improving clarity. Across all radiologists and reports, the Mixtral LLM reduced redundant word counts by more than 53%. These findings highlight the potential of locally deployed, open-source LLMs to streamline radiology reporting. By generating concise, well-structured reports, these models enhance information retrieval and better meet the needs of referring physicians, ultimately improving clinical workflows.
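The two-step strategy the abstract describes (condense first, then apply organ-based structuring) can be sketched as a pair of sequential prompts. This is a minimal illustration only: the prompt wording, function name, and template variables below are assumptions, not the authors' actual prompts or their LangChain implementation.

```python
# Hypothetical sketch of a two-step prompting pipeline for report
# restructuring: step 1 condenses the report, step 2 groups findings
# by organ. Prompt text here is illustrative, not from the paper.

CONDENSE_PROMPT = (
    "Condense the following radiology report, removing redundant "
    "phrasing while preserving every clinical finding:\n\n{report}"
)

STRUCTURE_PROMPT = (
    "Reformat the condensed report below so that findings are "
    "grouped under a heading for each relevant organ:\n\n{condensed}"
)


def build_pipeline_prompts(report: str, condensed: str) -> tuple[str, str]:
    """Return the two prompts sent sequentially to a locally hosted LLM.

    In practice, the output of the first LLM call (the condensed
    report) would be fed into the second prompt.
    """
    return (
        CONDENSE_PROMPT.format(report=report),
        STRUCTURE_PROMPT.format(condensed=condensed),
    )
```

In a LangChain setting, each prompt would wrap a call to the locally deployed model (e.g., Mixtral), with the first call's output piped into the second template.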