Lv Siting, Mao Yuanyang, Liu Youfu, Huang Yigui, Guo Dakang, Cheng Lei, Tang Zhuoheng, Peng Shaohai, Xiao Deqin
College of Mathematics and Informatics, South China Agricultural University, Guangzhou 510642, China; Key Laboratory of Smart Agricultural Technology in Tropical South China, Ministry of Agriculture and Rural Affairs, Guangzhou 510642, China; Guangdong Engineering Research Center of Agricultural Big Data, Guangzhou 510642, China.
Poult Sci. 2025 Feb;104(2):104782. doi: 10.1016/j.psj.2025.104782. Epub 2025 Jan 7.
Accurate individual egg-laying detection is crucial for culling low-yielding breeder ducks and improving production efficiency. However, existing methods are often expensive and require strict environmental conditions. This study proposes a data processing method based on wearable sensors and a joint time-frequency representation (TFR), aimed at accurately identifying egg-laying in ducks. First, the sensors continuously monitor the ducks' activity and collect the corresponding X-axis acceleration data. Next, a sliding window combined with the Short-Time Fourier Transform (STFT) converts the continuous data into spectrograms over consecutive windows. SqueezeNet is then used to detect spectrograms containing key features of the egg-laying process, marking these as egg-laying state windows. Finally, Kalman filtering is used to continuously predict the detected egg-laying status, allowing precise determination of the egg-laying period. The best detection performance was achieved by applying 10-fold cross-validation to a dataset of 59,135 spectrograms, using a window size of 50 min and a step size of 3 min. This configuration yielded an accuracy of 95.73% for detecting egg-laying status, with an inference time of only 2.1511 ms per window. The accuracy for identifying the egg-laying period reached 92.19%, with a precision of 93.57% and a recall of 91.95%. Additionally, we explored the scalability of the joint time-frequency representation to reduce the computational complexity of the model.
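To make the pipeline described in the abstract concrete, the following is a minimal sketch of the sliding-window STFT step and a one-dimensional Kalman smoothing of per-window scores. Only the 50 min window and 3 min step come from the abstract; the sampling rate (FS_HZ), the STFT settings (nperseg, noverlap), the random-walk Kalman model, the placeholder scoring function, and the decision threshold are illustrative assumptions, not details from the paper. In the actual study, a trained SqueezeNet classifier would replace the placeholder scorer.

```python
import numpy as np
from scipy.signal import stft

# Assumed parameters: only the 50 min window and 3 min step are from the abstract.
FS_HZ = 25                        # hypothetical accelerometer sampling rate
WINDOW_MIN, STEP_MIN = 50, 3      # window / step sizes reported in the abstract
WIN_SAMPLES = WINDOW_MIN * 60 * FS_HZ
STEP_SAMPLES = STEP_MIN * 60 * FS_HZ

def sliding_spectrograms(accel_x: np.ndarray):
    """Slice the X-axis acceleration stream into overlapping windows and
    convert each window into an STFT magnitude spectrogram."""
    for start in range(0, len(accel_x) - WIN_SAMPLES + 1, STEP_SAMPLES):
        segment = accel_x[start:start + WIN_SAMPLES]
        _, _, zxx = stft(segment, fs=FS_HZ, nperseg=256, noverlap=128)
        yield start, np.abs(zxx)  # spectrogram for one window

def kalman_smooth(scores: np.ndarray, q: float = 1e-3, r: float = 1e-1):
    """Generic 1-D random-walk Kalman filter over per-window egg-laying scores
    (the paper's exact formulation may differ)."""
    x, p = scores[0], 1.0
    smoothed = []
    for z in scores:
        p += q                    # predict: state uncertainty grows
        k = p / (p + r)           # Kalman gain
        x += k * (z - x)          # update with the new classifier score
        p *= (1 - k)
        smoothed.append(x)
    return np.array(smoothed)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    accel_x = rng.normal(size=6 * 60 * 60 * FS_HZ)  # 6 h of synthetic data
    # Placeholder scorer; in the paper a trained SqueezeNet would score each spectrogram.
    scores = np.array([spec.mean() for _, spec in sliding_spectrograms(accel_x)])
    track = kalman_smooth(scores)
    laying = track > track.mean()  # hypothetical threshold for "egg-laying" windows
    print(f"{laying.sum()} of {len(laying)} windows flagged as egg-laying")
```

Because the 3 min step is much smaller than the 50 min window, consecutive spectrograms overlap heavily and the smoothed score track varies gradually, which is what allows the start and end of the egg-laying period to be located more precisely than from independent per-window labels.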