Sustr P, Spinka M, Cloutier S, Newberry RC
Ethology Group, Research Institute of Animal Production, Přátelství 815, 104 01 Prague-Uhříněves, Czech Republic.
Behav Res Methods Instrum Comput. 2001 Aug;33(3):364-70. doi: 10.3758/bf03195390.
In an experiment investigating the impact of preweaning social experience on later social behavior in pigs, we were interested in the mutual spatial positions of pigs during paired social interactions. To obtain these data, we applied a different colored mark to the head and back of each of 2 pigs per group and videotaped the pigs' interactions. We used the EthoVision tracking system to provide x,y coordinates of the four colored marks every 0.2 sec. This paper describes the structure and functioning of a FoxPro program designed to clean the raw data and use them to identify the mutual body positions of the 2 animals at 0.2-sec intervals. The data were cleaned by identifying invalid data points and replacing them with interpolated values. An algorithm was then applied to extract three variables from the coordinates: (1) whether the two pigs were in body contact; (2) the mutual orientation (parallel, antiparallel, or perpendicular) of the two pigs; and (3) whether the pig in the "active" position made snout contact in front of, or behind, the ear base of the other pig. Using these variables, we were able to identify five interaction types: Pig A attacks, Pig B attacks, undecided head-to-head position, "clinch" resting position, or no contact. To assess the reliability of the automatic system, a randomly chosen 5-min videotaped interaction was scored for mutual positions both visually (by 2 independent observers) and automatically. Good agreement, as assessed using Cohen's kappa coefficients, was found between the data from the 2 observers and between each observer's data and the data from the automated system.
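The two core processing steps described above — gap interpolation of invalid tracking points and classification of mutual orientation from the head/back marker coordinates — can be sketched as follows. This is a minimal illustration in Python rather than the authors' FoxPro program; the function names, the linear-interpolation scheme, and the 30° angular tolerance are assumptions, not details taken from the paper.

```python
import math

def fill_gaps(xs):
    """Replace None entries (invalid tracking points) with values
    linearly interpolated between the nearest valid neighbors.
    Leading/trailing gaps without two valid neighbors are left as-is."""
    xs = list(xs)
    n = len(xs)
    i = 0
    while i < n:
        if xs[i] is None:
            j = i
            while j < n and xs[j] is None:
                j += 1                     # j = first valid index after the gap
            if i > 0 and j < n:
                for k in range(i, j):
                    t = (k - (i - 1)) / (j - (i - 1))
                    xs[k] = xs[i - 1] + t * (xs[j] - xs[i - 1])
            i = j
        else:
            i += 1
    return xs

def orientation_angle(head, back):
    """Heading angle (radians) of the vector from back marker to head marker."""
    return math.atan2(head[1] - back[1], head[0] - back[0])

def mutual_orientation(head_a, back_a, head_b, back_b, tol_deg=30.0):
    """Classify the mutual orientation of two pigs as parallel,
    antiparallel, or perpendicular (tol_deg is an assumed tolerance)."""
    diff = math.degrees(orientation_angle(head_a, back_a)
                        - orientation_angle(head_b, back_b))
    diff = abs((diff + 180.0) % 360.0 - 180.0)   # fold into [0, 180]
    if diff <= tol_deg:
        return "parallel"
    if diff >= 180.0 - tol_deg:
        return "antiparallel"
    return "perpendicular"
```

For example, two pigs both heading in the +x direction would be classified as parallel, while one heading +x and one heading +y would be perpendicular; the same per-sample classification, combined with contact and snout-position tests, would feed the five interaction types listed in the abstract.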