Dobie RA.
Laryngoscope. 1983 Jul;93(7):906-27. doi: 10.1288/00005537-198307000-00014.
Hearing conservation in industry relies heavily on monitoring audiometry to detect early noise-induced hearing loss in workers who are exposed to potentially damaging noise, with or without hearing protectors. The "real-world" reliability and validity of these measurements, as well as otoscopic observations in industry, have not been extensively investigated. In addition, there is considerable controversy over the selection of a definition of "significant threshold shift" in industrial audiometry. These and related issues were considered in a series of three studies utilizing data from an active hearing conservation program. Test-retest variability in industry is much higher than has been reported for clinical settings; this variability is reduced by pure-tone averaging. Workers referred for otologic evaluation were found to have hearing levels which were, on the average, about 5 dB better than indicated by plant audiometry, even without excluding 4% of referred workers who had unilateral deafness and showed "shadow curves" on the plant audiograms. Otoscopic data obtained by the plant audiometrists were uncorrelated with the results of otoscopy by consultant otologists. Techniques borrowed from decision theory and signal detection theory were used to evaluate possible criteria for significant threshold shift. Criteria based on pure-tone averaging were superior to those based on a certain amount of threshold shift for any frequency tested. It is proposed that a significant threshold shift be defined as a 10 dB or greater change for the worse for either the 0.5, 1, 2 kHz pure-tone average or the 3, 4, 6 kHz pure-tone average, in either ear, and that such shifts be validated by prompt retesting. Even with this criterion, a substantial number of shifts (most shifts, in some situations) will be either spurious or attributable to disorders other than noise-induced hearing loss, such as presbycusis. 
Otologic referral in cases of large or repeated shifts may prevent unjustified administrative actions, to the advantage of both workers and management. A practical consequence of the use of monitoring audiometry may be a de facto lowering of the permissible exposure level to 85 dBA TWA.
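The proposed criterion — a worsening of 10 dB or more in either the 0.5/1/2 kHz or the 3/4/6 kHz pure-tone average, in either ear — can be expressed as a short decision rule. The sketch below is illustrative only; the function and data-structure names are assumptions, not part of the original paper.

```python
def pta(audiogram, freqs):
    """Pure-tone average (dB HL) across the given frequencies (kHz)."""
    return sum(audiogram[f] for f in freqs) / len(freqs)

def significant_threshold_shift(baseline, current, criterion_db=10.0):
    """Flag a significant shift per the proposed criterion: a change for
    the worse of >= criterion_db in either the 0.5/1/2 kHz PTA or the
    3/4/6 kHz PTA, in either ear. Audiograms are dicts mapping ear ->
    {frequency_kHz: threshold_dB_HL}; higher thresholds mean worse hearing."""
    for ear in ("left", "right"):
        for freqs in ((0.5, 1, 2), (3, 4, 6)):
            shift = pta(current[ear], freqs) - pta(baseline[ear], freqs)
            if shift >= criterion_db:
                return True
    return False
```

Per the abstract, a shift flagged this way should be validated by prompt retesting before any administrative action, since many flagged shifts will be spurious or due to disorders other than noise-induced hearing loss.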