When a research participant submits a survey, you, as the researcher, have a few options. You can accept the submission, in which case the participant is paid and their approval rating increases. Or you can reject the submission, in which case the participant is not paid and their reputation suffers (i.e., their approval rating goes down). On Connect, there is also an option to reject participants but still pay them; this is sometimes required by Institutional Review Boards (IRBs).
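For researchers who review submissions programmatically on MTurk, this accept-or-reject decision maps onto two API calls. The sketch below uses Python and boto3, assuming AWS credentials are already configured; the assignment IDs and feedback strings are hypothetical placeholders, and Connect's reject-but-pay option is handled through Connect's own interface rather than this API.

```python
# A minimal sketch of accepting or rejecting MTurk submissions with boto3.
# Assumes AWS credentials are configured; the assignment IDs and feedback
# messages are hypothetical placeholders.
import boto3

mturk = boto3.client("mturk", region_name="us-east-1")

# Approving a submission: the participant is paid and their
# approval rating increases.
mturk.approve_assignment(
    AssignmentId="EXAMPLE_APPROVED_ASSIGNMENT_ID",
    RequesterFeedback="Thank you for your thoughtful responses.",
)

# Rejecting a submission: the participant is not paid and their
# approval rating goes down, so rejections carry real consequences.
mturk.reject_assignment(
    AssignmentId="EXAMPLE_REJECTED_ASSIGNMENT_ID",
    RequesterFeedback="Responses failed multiple attention checks.",
)
```

If an IRB requires that participants be paid even when their data will be excluded, one common approach on MTurk is to approve the assignment (so the participant is paid) and simply drop the data from analysis.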
The coronavirus pandemic has created massive social upheaval. Beyond its health and economic consequences, it has altered the day-to-day behaviors of billions of people. And the end is not currently in sight.
Of late, researchers have reported a decline in data quality on Mechanical Turk (MTurk). To combat the issue, we recently developed several data quality solutions, which are described in detail in our previous blog post.
In the world of human subjects research, Institutional Review Boards (IRBs) often conduct a cost-benefit analysis to assess whether a study is ethical. A standard applied universally in these assessments is whether the risks participants will be exposed to exceed those people encounter in everyday life.
Amazon Mechanical Turk (MTurk) is a microtask platform that has been used for social science research for nearly a decade. During this time,
Early March marked a historic turning point in the United States. Despite pockets of outbreaks on the West Coast and a clear acceleration of infections across Western Europe, most Americans still regarded the novel coronavirus as a distant threat. The threat abruptly became proximal when, on March 2, the first community-acquired case of COVID-19 was confirmed in New York State.