John Lekha Mary, Anoop K K
Department of Physics, Cochin University of Science and Technology Kochi-682022 India
RSC Adv. 2023 Oct 9;13(42):29613-29624. doi: 10.1039/d3ra06226k. eCollection 2023 Oct 4.
Optical emission spectroscopic techniques such as laser-induced breakdown spectroscopy (LIBS) require an optimal plasma state for accurate quantitative elemental analysis. Three fundamental assumptions must hold for the analytical results to be accurate: local thermodynamic equilibrium (LTE), an optically thin plasma, and stoichiometric ablation. However, real plasmas often fail to satisfy these conditions, particularly optical thinness, leading to reabsorption of the emitted radiation, known as self-absorption. To study the self-absorption effect, we simulated an optically thick emission spectrum under typical laser-produced plasma conditions. The simulation involves four stages: estimating the ratio of the number densities of the various ionisation states in the plasma using the Saha-Boltzmann equation, the peak intensity of each spectral line using the Boltzmann equation, and the full width at half maximum (FWHM) of each spectral line from Stark broadening, and finally generating the full spectrum by assigning a Lorentzian profile to each spectral line. Self-absorption is then applied to the simulated spectrum. We investigated the dependence of the self-absorption coefficient on the plasma temperature, optical path length, and element concentration in the sample. Self-absorption decreases with increasing plasma temperature, whereas it increases with increasing optical path length and analyte concentration. We also investigated the role of self-absorption in quantitative analysis by calibration-free LIBS, with and without resonance lines, for a binary alloy (Mg 50% & Ca 50%). Excluding the resonance lines reduced the error in the composition estimate drastically, from 27% to 2%.
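The four simulation stages and the self-absorption step described above can be sketched in a few lines of Python. This is an illustrative toy, not the authors' code: the line wavelength (285.2 nm), Stark FWHM, ionisation energy, and all scaling constants below are assumed example values, and self-absorption is modelled by radiative transfer through a homogeneous slab, I(λ) = (ε/κ)(1 − e^(−κL)), which reduces to the optically thin εL when κL ≪ 1.

```python
import numpy as np

KB_EV = 8.617333e-5  # Boltzmann constant in eV/K

def saha_ratio(T, n_e, E_ion, Z_I=1.0, Z_II=1.0):
    """Stage 1: Saha-Boltzmann ratio n_II/n_I at temperature T (K),
    electron density n_e (cm^-3), ionisation energy E_ion (eV)."""
    # 2 * (2*pi*m_e*k*T / h^2)^(3/2) evaluates to ~4.83e15 * T^(3/2) cm^-3
    return (4.83e15 * T**1.5 / n_e) * (2.0 * Z_II / Z_I) * np.exp(-E_ion / (KB_EV * T))

def boltzmann_peak(n_s, g_k, A_ki, E_k, T, Z):
    """Stage 2: relative peak intensity of one line (arbitrary units),
    from upper-level energy E_k (eV), degeneracy g_k, and rate A_ki."""
    return n_s * g_k * A_ki * np.exp(-E_k / (KB_EV * T)) / Z

def lorentzian(lam, lam0, fwhm):
    """Stages 3-4: area-normalised Lorentzian with a Stark-broadened FWHM."""
    hwhm = 0.5 * fwhm
    return (hwhm / np.pi) / ((lam - lam0) ** 2 + hwhm ** 2)

def self_absorbed(eps, kappa, L):
    """Self-absorption step: homogeneous-slab radiative transfer over path L.
    I(lam) = (eps/kappa) * (1 - exp(-kappa*L)); thin limit -> eps*L."""
    tau = kappa * L
    return np.where(tau > 1e-12,
                    (eps / np.maximum(kappa, 1e-300)) * (1.0 - np.exp(-tau)),
                    eps * L)

# Demo: one hypothetical line at 285.2 nm with an assumed 0.05 nm Stark FWHM.
lam = np.linspace(285.0, 285.4, 2001)
profile = lorentzian(lam, 285.2, 0.05)
emissivity = profile        # line shape only; absolute scale is arbitrary here
absorption = 1.0 * profile  # absorption coefficient proportional to the profile

I_thin = emissivity * 0.1                                # optically thin, L = 0.1
I_thick_1 = self_absorbed(emissivity, absorption, 0.1)   # same path, with reabsorption
I_thick_2 = self_absorbed(emissivity, absorption, 0.2)   # doubled optical path

SA_1 = I_thick_1.max() / I_thin.max()
SA_2 = I_thick_2.max() / (emissivity * 0.2).max()
print(f"SA coefficient at L=0.1: {SA_1:.3f}, at L=0.2: {SA_2:.3f}")
```

Running the demo gives SA_2 < SA_1 < 1: the self-absorption coefficient (observed peak over optically thin peak) falls further below unity as the optical path lengthens, consistent with the trend reported above.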