Volume 18 Number 3 (Aug. 2023)
JSW 2023 Vol.18(3): 117-129
doi: 10.17706/jsw.18.3.117-129

The Effect of Performance-Based Compensation on Crowdsourced Human-Robot Interaction Experiments

Zahra Rezaei Khavas*, Monish Reddy Kotturu, Russell Purkins, S. Reza Ahmadzadeh, Paul Robinette

School of Electrical and Computer Engineering and Computer Science, University of Massachusetts Lowell, Lowell, Massachusetts, United States.


Abstract—Social scientists have long been interested in the relationship between financial incentives and performance, and this subject has gained new relevance with the advent of web-based “crowdsourcing” models of production. In recent decades, recruiting participants from crowdsourcing platforms has become increasingly popular among human-robot trust researchers. A persistent concern in human-robot interaction research, especially in crowdsourced experiments, is the large number of outliers produced when participants lack attention and focus or become bored and disengaged. Financial incentives offer one way to address this problem. In this study, we examine the effects of performance-based compensation on data quality, participants' performance and accuracy on the assigned task, and the overall results in the context of a human-robot trust experiment. We designed an online human-robot collaborative search task and recruited 120 participants from Amazon Mechanical Turk (AMT). We measured participants' attention, performance, and trust in the robotic teammate under two conditions: constant payment and performance-based payment. We found that financial incentives can increase data quality and help prevent random, aimless behavior by participants. Financial incentives also significantly improved participants' performance and led them to put more effort into the assigned task. However, they did not affect the experiment results unless a measure was directly associated with the compensation value.
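
As a rough illustration of the two compensation schemes compared above, the sketch below shows one way a crowdsourcing backend could assign participants to conditions and compute payouts for the constant-payment and performance-based conditions. The base pay, bonus rate, and the use of targets found as the performance measure are hypothetical placeholders for illustration only; the paper defines the actual task and payment scheme.

    import random

    # Hypothetical payment parameters -- illustrative only, not values reported in the paper.
    BASE_PAY = 2.00           # flat payment (USD) that every participant receives
    BONUS_PER_TARGET = 0.10   # extra pay per target found (performance-based condition only)

    def assign_condition(rng: random.Random) -> str:
        """Randomly assign a participant to one of the two payment conditions."""
        return rng.choice(["constant", "performance_based"])

    def compute_payment(condition: str, targets_found: int) -> float:
        """Return one participant's payout.

        The constant condition pays the same flat amount regardless of performance;
        the performance-based condition adds a bonus that scales with the number of
        targets the participant found in the collaborative search task.
        """
        if condition == "performance_based":
            return BASE_PAY + BONUS_PER_TARGET * targets_found
        return BASE_PAY

    if __name__ == "__main__":
        rng = random.Random(0)
        for participant_id, targets_found in [(1, 8), (2, 8), (3, 15)]:
            condition = assign_condition(rng)
            pay = compute_payment(condition, targets_found)
            print(f"participant {participant_id}: {condition:17s} -> ${pay:.2f}")

Under a scheme like this, pay in the constant condition is independent of performance, while pay in the performance-based condition rises with task performance, which is the contrast the experiment manipulates.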

Index Terms—Amazon Mechanical Turk, crowdsourcing, financial incentive, human-robot interaction, human-robot trust, MTurk


Cite: Zahra Rezaei Khavas, Monish Reddy Kotturu, Russell Purkins, S. Reza Ahmadzadeh, Paul Robinette, "The Effect of Performance-Based Compensation on Crowdsourced Human-Robot Interaction Experiments," Journal of Software, vol. 18, no. 3, pp. 117-129, 2023.

Copyright © 2023 by the authors. This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

General Information

  • ISSN: 1796-217X (Online)

  • Abbreviated Title: J. Softw.

  • Frequency: Quarterly

  • APC: 500 USD

  • DOI: 10.17706/JSW

  • Editor-in-Chief: Prof. Antanas Verikas

  • Executive Editor: Ms. Cecilia Xie

  • Abstracting/Indexing: DBLP, EBSCO, CNKI, Google Scholar, ProQuest, INSPEC (IET), ULRICH's Periodicals Directory, WorldCat, etc.

  • E-mail: jsweditorialoffice@gmail.com
