Original Research by Members of the CloudResearch Team
Books:
Moss, A., Hartman, R., Litman, L., & Robinson, J. (2023). Research in the Cloud: A Guide to Online Behavioral Science. Forthcoming.
A lab manual for students and researchers learning to conduct research online. Contact us to receive an advance copy.
Litman, L., & Robinson, J. (2020). Conducting Online Research on Amazon Mechanical Turk and Beyond. SAGE Publications. https://us.sagepub.com/en-us/nam/conducting-online-research-on-amazon-mechanical-turk-and-beyond/book257367
A guide to the world of online research and how to optimally carry out research projects with online samples.
Pre-Prints and Peer-Reviewed Publications:
Frequently Cited and Noteworthy Papers
Hartman, R., Moss, A. J., Jaffe, S. N., Rosenzweig, C., Litman, L., & Robinson, J. (2023). Introducing Connect by CloudResearch: Advancing Online Participant Recruitment in the Digital Age. https://osf.io/preprints/psyarxiv/ksgyr/
A white paper introducing Connect, CloudResearch’s innovative platform designed to revolutionize online participant recruitment in social and behavioral science research.
Litman, L., Rosen, Z., Hartman, R., Rosenzweig, C., Weinberger-Litman, S. L., Moss, A. J., & Robinson, J. (2023). Did people really drink bleach to prevent COVID-19? A guide for protecting survey data against problematic respondents. PLoS ONE, 18(7). https://doi.org/10.1371/journal.pone.0287837
An investigation into how problematic survey respondents were responsible for 100% of reported incidents of household cleaner ingestion, highlighting implications for online survey research practices.
Moss, A. J., Hauser, D. J., Rosenzweig, C., Jaffe, S., Robinson, J., & Litman, L. (2023). Using Market-Research Panels for Behavioral Science: An Overview and Tutorial. Advances in Methods and Practices in Psychological Science, 6(2). https://doi.org/10.1177/25152459221140388
An overview of market-research panels and considerations for using such panels for behavioral research.
Hauser, D. J., Moss, A. J., Rosenzweig, C., Jaffe, S. N., Robinson, J., & Litman, L. (2022). Evaluating CloudResearch's Approved Group as a solution for problematic data quality on MTurk. Behavior Research Methods, 1-12. https://link.springer.com/article/10.3758/s13428-022-01999-x
A pre-registered study comparing CloudResearch's Approved list and Blocked list to a Standard MTurk sample, demonstrating superior data quality among Approved list participants.
Chandler, J., Rosenzweig, C., Moss, A. J., Robinson, J., & Litman, L. (2019). Online panels in social science research: Expanding sampling methods beyond Mechanical Turk. Behavior Research Methods, 51(5), 2022-2038. https://link.springer.com/article/10.3758/s13428-019-01273-7
A study examining data quality and participant representativeness on Prime Panels as a participant recruitment platform.
Robinson, J., Rosenzweig, C., Moss, A. J., & Litman, L. (2019). Tapped out or barely tapped? Recommendations for how to harness the vast and largely unused potential of the Mechanical Turk participant pool. PLoS ONE, 14(12), e0226394. https://doi.org/10.1371/journal.pone.0226394
An analysis of the size of the MTurk participant pool, with suggestions for sampling strategies that reach less experienced, high-quality participants.
Litman, L., Robinson, J., & Abberbock, T. (2017). TurkPrime.com: A versatile crowdsourcing data acquisition platform for the behavioral sciences. Behavior Research Methods, 49(2), 433-442. https://link.springer.com/article/10.3758/s13428-016-0727-z
A description of the purposes and features of the CloudResearch MTurk Toolkit.
Litman, L., Robinson, J., & Rosenzweig, C. (2015). The relationship between motivation, monetary compensation, and data quality among US- and India-based workers on Mechanical Turk. Behavior Research Methods, 47(2), 519-528. https://link.springer.com/article/10.3758/s13428-014-0483-x
An examination of the impact of compensation on data quality on MTurk.
CloudResearch Blogs and Other Resources:
How to Use CloudResearch Products
- Introducing Connect by CloudResearch: Advancing Online Participant Recruitment in the Digital Age
- What is CloudResearch? A Rundown of CloudResearch's Products and the Best Use-Case for Each One
- Connect by CloudResearch: A New and Versatile Platform for Online Recruitment
- How to Cite CloudResearch in Your Research Papers
- Make Your Research Less WEIRD; Sample with Prime Panels International
- Including and Excluding Participants From Studies Run Through CloudResearch: What, When and How
- Running Longitudinal Studies on CloudResearch
- An MTurk Alternative: Data Quality Analysis of MTurk vs Online Panels
- Comparing Data Quality on CloudResearch to Prolific: A Reply to Prolific's Recent False Claims
- How to Gather Demographically-Representative Samples in Online Studies
General Guides for Online Research
- A Guide to Sampling Methodology
- A Guide to Statistical Significance
- A Guide to Market Segmentation
- A Guide to Data Quality
- Examples of Good (and Bad) Attention Check Questions in Surveys
- What Are Survey Validity and Reliability?
- Survey Screening Questions: Good & Bad Examples
- How to Conduct an Online Pricing Survey
- Five Common Mistakes and Misconceptions About Polls
- Clean Data: the Key to Accurate Conclusions