Examining the replicability of online experiments selected by a decision market.


Date

2024-11-19

Journal Title

Nature Human Behaviour
Publisher

Nature Research

Rights

(c) 2024 The Author(s)
CC BY 4.0

Abstract

Here we test the feasibility of using decision markets to select studies for replication and provide evidence about the replicability of online experiments. Social scientists (n = 162) traded on the outcome of close replications of 41 systematically selected Amazon Mechanical Turk (MTurk) social science experiments published in PNAS in 2015–2018, knowing that the 12 studies with the lowest and the 12 with the highest final market prices would be selected for replication, along with 2 randomly selected studies. The replication rate, based on the statistical significance indicator, was 83% for the top-12 group and 33% for the bottom-12 group. Overall, 54% of the studies were successfully replicated, with replication effect size estimates averaging 45% of the original effect size estimates. The replication rate varied between 54% and 62% for alternative replication indicators. The observed replicability of MTurk experiments is comparable to that of previous systematic replication projects involving laboratory experiments.

Citation

Holzmeister F, Johannesson M, Camerer CF, Chen Y, Ho T-H, Hoogeveen S, Huber J, Imai N, Imai T, Jin L, Kirchler M, Ly A, Mandl B, Manfredi D, Nave G, Nosek BA, Pfeiffer T, Sarafoglou A, Schwaiger R, Wagenmakers E-J, Waldén V, Dreber A. (2024). Examining the replicability of online experiments selected by a decision market. Nat Hum Behav.

Creative Commons license

Except where otherwise noted, this item's license is described as (c) 2024 The Author(s), CC BY 4.0.