  • CloudResearch is Retiring the “Block Low-Quality Participants” Option

    Aaron Moss, PhD

    Starting November 1, 2021, CloudResearch will be retiring the Block Low-Quality Participants feature we created in 2020 and continuing with our CloudResearch Approved Participants option. In this blog, we explain why we’re making this change and what it means for researchers.

  • The Truth About Online Data Quality: 30% of Survey Respondents Live on Pluto

    Aaron Moss, PhD

Have you ever been faced with having to toss 20-40 percent of your data? This can be nerve-racking. There are several threats to data quality and the validity of online research. We're here to shed some light on these threats and what to do about them.

  • Pandemic, Paranoia, and the Participant Pool

    Praveen Suthaharan

A few months into 2020, the COVID-19 pandemic introduced global uncertainty. As we went into lockdown, we saw an opportunity to extend our study: what if we examined the development of paranoia as the pandemic unfolded? In a paper recently published in Nature Human Behaviour, we tracked paranoia and belief-updating as the pandemic progressed, prior to lockdown, during lockdown, and into reopening.

  • How to Cite CloudResearch in Your Journal Articles

    CloudResearch

Starting with the methods section of your academic paper can make the writing process easier. We're here to help you accurately cite research samples and highlight the types of samples you can gather from CloudResearch within your paper.

  • Age ISN’T Just a Number: Age Verification in Online Studies

    Rachel Hartman

Oftentimes, researchers want to target their surveys toward particular demographics: e.g., people of a certain race, gender, or age. If they state the criteria they're looking for in the survey title, they risk people lying about their demographics in order to qualify for the study, particularly if the study offers high compensation. While most people tend to be honest, if the incentive and opportunity are there, they will sometimes succumb to temptation. That's where CloudResearch qualifications, or the alternative two-step screener, come in. But these methods aren't foolproof, and researchers may want the peace of mind of knowing the participants they've recruited aren't imposters.

  • Survey Screening Questions: Good & Bad Examples

    Aaron Moss, PhD

Although the idea is simple, writing effective screening questions is more complicated than it seems. Screening questions need to be short and easy for respondents to answer. They should avoid binary response options but also should not include too many response options. You should order the response options carefully, and the overall screener shouldn't contain too many questions. Above all, you need to ensure you don't lead people to guess which attributes you're looking for. To balance all these objectives, and to demonstrate examples of good and bad survey screening questions, we wrote this blog. Let's dig in.
