
dc.contributor.author: Landy, Justin F
dc.contributor.author: Jia, Miaolei
dc.contributor.author: Ding, Isabel L
dc.contributor.author: Viganola, Domenico
dc.contributor.author: Tierney, Warren
dc.contributor.author: Dreber, Anna
dc.contributor.author: Johannesson, Magnus
dc.contributor.author: Pfeiffer, Thomas
dc.contributor.author: Ebersole, Charles R
dc.contributor.author: Gronau, Quentin F
dc.contributor.author: Pfuhl, Gerit
dc.contributor.author: Ly, Alexander
dc.contributor.author: van den Bergh, Don
dc.contributor.author: Marsman, Maarten
dc.contributor.author: Derks, Koen
dc.contributor.author: Wagenmakers, Eric-Jan
dc.contributor.author: Proctor, Andrew
dc.contributor.author: Bartels, Daniel M.
dc.contributor.author: Bauman, Christopher W.
dc.contributor.author: Brady, William J.
dc.contributor.author: Cheung, Felix
dc.contributor.author: Cimpian, Andrei
dc.contributor.author: Dohle, Simone
dc.contributor.author: Donnellan, M. Brent
dc.contributor.author: Hahn, Adam
dc.contributor.author: Hall, Michael P.
dc.contributor.author: Jiménez-Leal, William
dc.contributor.author: Johnson, David J.
dc.contributor.author: Lucas, Richard E.
dc.contributor.author: Monin, Benoît
dc.contributor.author: Montealegre, Andres
dc.contributor.author: Mullen, Elizabeth
dc.contributor.author: Pang, Jun
dc.contributor.author: Ray, Jennifer
dc.contributor.author: Reinero, Diego A.
dc.contributor.author: Reynolds, Jesse
dc.contributor.author: Sowden, Walter
dc.contributor.author: Storage, Daniel
dc.contributor.author: Su, Runkun
dc.contributor.author: Tworek, Christina M.
dc.contributor.author: Van Bavel, Jay J.
dc.contributor.author: Walco, Daniel
dc.contributor.author: Wills, Julian
dc.contributor.author: Xu, Xiaobing
dc.contributor.author: Yam, Kai Chi
dc.contributor.author: Yang, Xiaoyu
dc.contributor.author: Cunningham, William A.
dc.contributor.author: Schweinsberg, Martin
dc.contributor.author: Urwitz, Molly
dc.contributor.author: Uhlmann, Eric L.
dc.date.accessioned: 2021-06-09T07:33:39Z
dc.date.available: 2021-06-09T07:33:39Z
dc.date.issued: 2020-01-16
dc.description.abstract: To what extent are research results influenced by subjective decisions that scientists make as they design studies? Fifteen research teams independently designed studies to answer five original research questions related to moral judgments, negotiations, and implicit cognition. Participants from 2 separate large samples (total N = 15,000) were then randomly assigned to complete 1 version of each study. Effect sizes varied dramatically across different sets of materials designed to test the same hypothesis: Materials from different teams rendered statistically significant effects in opposite directions for 4 of 5 hypotheses, with the narrowest range in estimates being d = −0.37 to 0.26. Meta-analysis and a Bayesian perspective on the results revealed overall support for 2 hypotheses and a lack of support for 3 hypotheses. Overall, practically none of the variability in effect sizes was attributable to the skill of the research team in designing materials, whereas considerable variability was attributable to the hypothesis being tested. In a forecasting survey, predictions of other scientists were significantly correlated with study results, both across and within hypotheses. Crowdsourced testing of research hypotheses helps reveal the true consistency of empirical support for a scientific claim.
dc.description: © American Psychological Association, 2020. This paper is not the copy of record and may not exactly replicate the authoritative document published in the APA journal. Please do not copy or cite without the authors' permission. The final article is available, upon publication, at: <a href="https://doi.apa.org/doi/10.1037/bul0000220">https://doi.apa.org/doi/10.1037/bul0000220</a>
dc.identifier.citation: Landy, Jia, Ding, Viganola, Tierney, Dreber, Johannesson, Pfeiffer, Ebersole, Gronau, Pfuhl. Crowdsourcing Hypothesis Tests: Making Transparent How Design Choices Shape Research Results. Psychological Bulletin. 2020
dc.identifier.cristinID: FRIDAID 1867985
dc.identifier.doi: 10.1037/bul0000220
dc.identifier.issn: 0033-2909
dc.identifier.issn: 1939-1455
dc.identifier.uri: https://hdl.handle.net/10037/21375
dc.language.iso: eng
dc.publisher: American Psychological Association
dc.relation.journal: Psychological Bulletin
dc.rights.accessRights: openAccess
dc.rights.holder: © 2021 American Psychological Association.
dc.subject: VDP::Social science: 200::Psychology: 260
dc.subject: VDP::Samfunnsvitenskap: 200::Psykologi: 260
dc.title: Crowdsourcing Hypothesis Tests: Making Transparent How Design Choices Shape Research Results
dc.type.version: acceptedVersion
dc.type: Journal article
dc.type: Tidsskriftartikkel
dc.type: Peer reviewed

