Hori Kenta, Uchida Yusuke, Kan Tsukasa, Minami Maya, Naito Chisako, Kuroda Tomohiro, Takahashi Hideya, Ando Masahiko, Kawamura Takashi, Kume Naoto, Okamoto Kazuya, Takemura Tadamasa, Yoshihara Hiroyuki
Annu Int Conf IEEE Eng Med Biol Soc. 2013;2013:4646-9. doi: 10.1109/EMBC.2013.6610583.
The aim of this research is to develop an information support system for tele-auscultation. In auscultation, a doctor needs to understand the conditions under which the stethoscope is applied, in addition to hearing the auscultatory sounds. The proposed system includes an intuitive navigation system for stethoscope operation, in addition to a conventional audio streaming system for auscultatory sounds and a conventional video conferencing system for telecommunication. Mixed reality technology is applied for intuitive navigation of the stethoscope: information such as position, contact condition, and breathing is overlaid on a view of the patient's chest. The contact condition of the stethoscope is measured by e-textile contact sensors, and breathing is measured by a band-type breath sensor. In a simulated tele-auscultation experiment, the stethoscope with the contact sensors and the breath sensor was evaluated. The results show that the presentation of the contact condition was not understandable enough to guide stethoscope handling, whereas the time series of breath phases allowed the remote doctor to understand the patient's breathing condition.
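The abstract describes overlaying three kinds of information (stethoscope position, contact condition from e-textile sensors, and breath phase from a band-type sensor) on the view of the patient's chest. The sketch below is only an illustration of that data flow under assumed rules; the sensor names, thresholds, and the inhale/exhale classification are hypothetical and are not taken from the paper.

```python
from dataclasses import dataclass

@dataclass
class StethoscopeState:
    """Snapshot of the information the abstract says is overlaid on the chest view."""
    position: tuple      # assumed (x, y) of the chest piece in the camera image, in pixels
    contact_ok: bool     # whether the e-textile pads report full contact (assumed rule)
    breath_phase: str    # "inhale" or "exhale" inferred from the band sensor (assumed rule)

def read_contact(pad_values, threshold=0.5):
    """Hypothetical rule: contact is good only if every e-textile pad exceeds a threshold."""
    return all(v >= threshold for v in pad_values)

def classify_breath(band_strain, previous_strain):
    """Hypothetical rule: rising band strain means inhalation, falling means exhalation."""
    return "inhale" if band_strain > previous_strain else "exhale"

def overlay_text(state: StethoscopeState) -> str:
    """Compose the annotation that would be drawn over the patient's chest view."""
    contact = "good contact" if state.contact_ok else "adjust stethoscope"
    return f"pos={state.position} | {contact} | breath: {state.breath_phase}"

if __name__ == "__main__":
    # Simulated sensor frames (position, e-textile pad values, band strain) in place of hardware.
    frames = [
        ((120, 200), [0.8, 0.9, 0.7], 0.2),
        ((122, 201), [0.8, 0.3, 0.7], 0.4),
        ((121, 203), [0.9, 0.9, 0.8], 0.1),
    ]
    prev_strain = 0.0
    for pos, pads, strain in frames:
        state = StethoscopeState(pos, read_contact(pads), classify_breath(strain, prev_strain))
        print(overlay_text(state))
        prev_strain = strain
```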