Institute of Digital Games, University of Malta, Msida, Malta.
Sci Data. 2024 Nov 29;11(1):1306. doi: 10.1038/s41597-024-04022-4.
As online video and streaming platforms continue to grow, affective computing research has shifted towards more complex studies involving multiple modalities. However, there is still a lack of readily available datasets with high-quality audiovisual stimuli. In this paper, we present GameVibe, a novel affect corpus that consists of multimodal audiovisual stimuli, including in-game behavioural observations and third-person affect traces for viewer engagement. The corpus consists of videos from a diverse set of publicly available gameplay sessions across 30 games, curated to ensure high-quality stimuli with good audiovisual and gameplay diversity. Furthermore, we analyse annotator reliability in terms of inter-annotator agreement.
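The abstract does not specify which agreement measure the authors use, and continuous affect traces typically call for specialised statistics. Purely as an illustrative sketch of the idea behind inter-annotator agreement, the snippet below computes Cohen's kappa for two hypothetical annotators assigning categorical engagement labels; the label names and data are invented for the example.

```python
from collections import Counter

def cohens_kappa(a, b):
    """Cohen's kappa for two annotators' categorical labels.

    a, b: equal-length sequences of labels for the same items.
    Returns a value in [-1, 1]; 1 means perfect agreement,
    0 means agreement at chance level.
    """
    assert len(a) == len(b) and len(a) > 0
    n = len(a)
    # Observed agreement: fraction of items both annotators label identically.
    p_o = sum(x == y for x, y in zip(a, b)) / n
    # Expected chance agreement from each annotator's marginal label counts.
    ca, cb = Counter(a), Counter(b)
    p_e = sum(ca[k] * cb[k] for k in ca) / (n * n)
    if p_e == 1.0:
        return 1.0
    return (p_o - p_e) / (1 - p_e)

# Toy example: two annotators labelling viewer engagement as high/low.
r1 = ["high", "high", "low", "low", "high", "low"]
r2 = ["high", "low", "low", "low", "high", "high"]
print(round(cohens_kappa(r1, r2), 3))  # → 0.333
```

For time-continuous traces such as engagement signals, measures like Krippendorff's alpha or Signed Differential Agreement are more common choices than kappa; this sketch only conveys the general observed-versus-chance-agreement principle.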