Data from: Delegation to artificial agents fosters prosocial behaviors in the collective risk dilemma
Fernandez Domingos, Elias et al. (2022), Data from: Delegation to artificial agents fosters prosocial behaviors in the collective risk dilemma, Dryad, Dataset, https://doi.org/10.5061/dryad.0rxwdbs1z
Home assistant chatbots, self-driving cars, drones and automated negotiation systems are just a few examples of autonomous (artificial) agents that have pervaded our society. These agents enable the automation of many tasks, saving time and (human) effort. However, their presence in social settings raises the need for a better understanding of their effect on social interactions, and of how they may be used to enhance cooperation towards the public good instead of hindering it. To this end, we present an experimental study of human delegation to autonomous agents and of hybrid human-agent interactions, centered on a non-linear public goods dilemma with uncertain returns in which participants face a collective risk. Our aim is to understand experimentally whether the presence of autonomous agents has a positive or negative impact on social behaviour, equality and cooperation in such a dilemma. Our results show that cooperation and group success increase when participants delegate their actions to an artificial agent that plays on their behalf. Yet, this positive effect is less pronounced when humans interact in hybrid human-agent groups, where we mostly observe that humans in successful hybrid groups make higher contributions earlier in the game. We also show that participants wrongly believe that artificial agents will contribute less to the collective effort. In general, our results suggest that delegation to autonomous agents can act as a commitment device, preventing both the temptation to deviate to an alternate (less collectively beneficial) course of action and responses based on betrayal aversion.
The data were collected through behavioral economic experiments performed in a laboratory. Participants remain anonymous throughout the experiment and interact with each other through a tablet computer. The experiment was divided into a control and 3 treatments. The control (treatment 1 - humans) uses a setting similar to Milinski et al. 2008, in which participants play a collective risk dilemma (CRD) with a 90% risk and 10 rounds. Participants interact in groups of 6 subjects, to which they are randomly assigned before the experiment starts. Jointly, they are required to contribute from their endowment (40 EMUs) to a public account so as to reach a total of 120 EMUs by the end of the experiment. If they reach or surpass this value, participants keep whatever they did not contribute; otherwise, their final payoff will be 0 EMUs with 90% probability.
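The payoff rule above can be summarized in a short sketch (function name and structure are illustrative, not part of the dataset; the numbers are the values stated above):

```python
import random

def crd_payoff(contributions, endowment=40, threshold=120, risk=0.9, rng=random):
    """Final payoffs for one CRD group under the rules described above.

    contributions: total amount (in EMUs) each of the 6 participants
    contributed to the public account over the 10 rounds.
    If the group total reaches the threshold, each participant keeps the
    uncontributed part of their endowment; otherwise, with probability
    `risk`, every participant's payoff is 0.
    """
    if sum(contributions) >= threshold or rng.random() >= risk:
        return [endowment - c for c in contributions]
    return [0 for _ in contributions]
```

For example, if each of the 6 participants contributes 20 EMUs in total, the group reaches 120 EMUs and everyone keeps the remaining 20 EMUs; a group that falls short keeps its remainders only with 10% probability.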
Treatment 2 (delegate) evaluates how the outcome of the dilemma changes if participants have to select one out of 5 possible agents, which will act on behalf of the participant during the experiment. Participants are shown the full algorithm that describes the behavior of each agent.
Treatment 3 (customize) is similar to the delegate treatment, except that in this case participants can customize the behavior of their agent by setting 5 parameters.
Treatment 4 (nudge) reproduces the setting of the control treatment, except that participants interact in hybrid groups formed by 3 human participants and 3 artificial agents. The artificial agents are sampled from the successful groups of the customize treatment. However, participants are only told that the agents have been programmed by humans to act on their behalf during the experiment.
Please refer to the file "README.md" for information on the meaning of each feature (column) in the dataset.
National Endowment for Science Technology and the Arts, Award: NESTA 2018 Collective Intelligence Grant
Fonds Wetenschappelijk Onderzoek, Award: G.1S639.17N
Fonds Wetenschappelijk Onderzoek, Award: G.0391.13N
Fonds Wetenschappelijk Onderzoek, Award: G054919N
Fonds De La Recherche Scientifique - FNRS, Award: 31257234
Fonds De La Recherche Scientifique - FNRS, Award: 40005955
FuturICT2.0, Award: FLAG-ERA JCT 2016
Service Public de Wallonie Recherche, Award: 2010235–ariac
Fundação para a Ciência e a Tecnologia, Award: UIDB/50021/2020
Fundação para a Ciência e a Tecnologia, Award: PTDC/CCI-INF/7366/2020
Fundação para a Ciência e a Tecnologia, Award: PTDC/MAT-APL/6804/2020
TAILOR, Award: 952215
Centro Singular de Investigación de Galicia, Award: accreditation 2019-2022
European Regional Development Fund