School of Computer Science, Carnegie Mellon University, Pittsburgh, PA, 15213, USA.
Department of Information Science, Cornell University, Ithaca, NY, 14853, USA.
Sci Rep. 2023 Oct 20;13(1):17957. doi: 10.1038/s41598-023-44883-0.
Machines powered by artificial intelligence increasingly permeate social networks with control over resources. However, machine allocation behavior may offer little benefit to human welfare in networks when it ignores the specific network mechanisms of social exchange. Here, we perform an online experiment involving simple networks of humans (496 participants in 120 networks) playing a resource-sharing game, to which we sometimes add artificial agents (bots). The experiment examines two opposite policies of machine allocation behavior: reciprocal bots, which share all resources reciprocally, and stingy bots, which share no resources at all. We also manipulate the bots' network position. We show that reciprocal bots do little to change the unequal resource distribution among people. In contrast, stingy bots balance structural power and improve collective welfare in human groups when placed in a specific network position, even though they bestow no wealth on people. Our findings highlight the need to incorporate the human nature of reciprocity and relational interdependence when designing machine behavior in sharing networks. Conscientious machines do not always serve human welfare; their effect depends on the network structure in which they interact.