Authors: Holzmeister F, Johannesson M, Camerer CF, Chen Y, Ho T-H, Hoogeveen S, Huber J, Imai N, Imai T, Jin L, Kirchler M, Ly A, Mandl B, Manfredi D, Nave G, Nosek BA, Pfeiffer T, Sarafoglou A, Schwaiger R, Wagenmakers E-J, Waldén V, Dreber A

Date issued: 2024-11-19 (record added to repository: 2024-11-27)

Citation: Holzmeister F, Johannesson M, Camerer CF, Chen Y, Ho T-H, Hoogeveen S, Huber J, Imai N, Imai T, Jin L, Kirchler M, Ly A, Mandl B, Manfredi D, Nave G, Nosek BA, Pfeiffer T, Sarafoglou A, Schwaiger R, Wagenmakers E-J, Waldén V, Dreber A. (2024). Examining the replicability of online experiments selected by a decision market. Nat Hum Behav.

Handle: https://mro.massey.ac.nz/handle/10179/72098

Abstract: Here we test the feasibility of using decision markets to select studies for replication and provide evidence about the replicability of online experiments. Social scientists (n = 162) traded on the outcomes of close replications of 41 systematically selected MTurk social science experiments published in PNAS in 2015-2018, knowing that the 12 studies with the lowest and the 12 with the highest final market prices would be selected for replication, along with 2 randomly selected studies. The replication rate, based on the statistical significance indicator, was 83% for the top-12 group and 33% for the bottom-12 group. Overall, 54% of the studies were successfully replicated, with replication effect size estimates averaging 45% of the original effect size estimates. The replication rate varied between 54% and 62% across alternative replication indicators. The observed replicability of MTurk experiments is comparable to that of previous systematic replication projects involving laboratory experiments.

Rights: © 2024 The Author(s)

License: CC BY 4.0 (https://creativecommons.org/licenses/by/4.0/)

Title: Examining the replicability of online experiments selected by a decision market

Type: Journal article

DOI: 10.1038/s41562-024-02062-9

ISSN: 2397-3374

PubMed: https://www.ncbi.nlm.nih.gov/pubmed/39562799