Session A1: Survey Methods Interventions 1

Presentations
Providing Appreciative Feedback to Optimizing Respondents – Is Positive Feedback in Web Surveys Effective in Preventing Non-differentiation and Speeding?
Technical University of Darmstadt, Germany

Relevance & Research Question
Interactive feedback to non-differentiating or speeding respondents has proven effective in reducing satisficing behavior in Web surveys (Couper et al. 2017; Kunz & Fuchs 2019). In this study, we tested the effectiveness of appreciative dynamic feedback to respondents who already provide well-differentiated answers and who already take sufficient time on a grid question. This feedback was expected to elevate overall response quality by motivating optimizing respondents to keep response quality high.

Methods & Data
About N=1,900 respondents from an online access panel in Germany participated in a general population survey on “Democracy and Politics in Germany”. Two 12-item grid questions were selected for randomized field experiments. Respondents were assigned to a control group with no feedback, to experimental group 1 receiving feedback when providing non-differentiated (experiment 1) or fast (experiment 2) answers, or to experimental group 2 receiving appreciative feedback when providing well-differentiated answers (experiment 1) or when taking sufficient time to answer (experiment 2). Interventions were implemented as dynamic feedback appearing as embedded text bubbles on the question page up to four times and disappearing automatically.

Results
Results suggest that appreciative feedback to optimizing respondents has only limited positive effects on response quality. By contrast, we see indications of deteriorating effects when praising optimizing respondents for their efforts. We speculate that appreciative feedback to optimizing respondents is perceived as an indication that they process the question more carefully than necessary.
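The trigger conditions behind such dynamic feedback can be sketched as a simple check on a grid response: low answer variance indicates non-differentiation (straightlining), and a short completion time indicates speeding. The function name and thresholds below are illustrative assumptions, not the study's actual implementation:

```python
import statistics

def flag_response(grid_answers, response_seconds, min_sd=0.5, min_seconds=15):
    """Flag a grid response as non-differentiated and/or speeded.

    min_sd and min_seconds are hypothetical cutoffs chosen for
    illustration; the study's real trigger rules are not specified here.
    """
    sd = statistics.pstdev(grid_answers)  # 0.0 means pure straightlining
    return {
        "non_differentiated": sd < min_sd,
        "speeding": response_seconds < min_seconds,
    }

# A straightlined, fast 12-item grid response trips both flags:
print(flag_response([3] * 12, 8))
# {'non_differentiated': True, 'speeding': True}
```

In a live survey, a check like this would run client-side as the respondent completes the grid, with the appreciative (or corrective) text bubble shown depending on which group the respondent was assigned to.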
Comparing various types of attention checks in web-based questionnaires: Experimental evidence from the German Internet Panel and the Swedish Citizen Panel
1GESIS - Leibniz Institute for the Social Sciences, Germany; 2SOM Institute, University of Gothenburg, Sweden

Evaluating methods to prevent and detect inattentive respondents in web surveys
1Institute for Employment Research (IAB), Germany; 2LMU Munich; 3University of Mannheim; 4NYU

Relevance & Research Question
Inattentive respondents pose a substantial threat to data quality in web surveys. In this study, we evaluate methods for preventing and detecting inattentive responding and investigate its impact on substantive research.

Methods & Data
We use data from two large-scale non-probability surveys fielded in the US. Our analysis consists of four parts: First, we experimentally test the effect of asking respondents to commit to providing high-quality responses at the beginning of the survey on various data quality measures (attention checks, item nonresponse, break-offs, straightlining, speeding). Second, we conducted an additional experiment to compare the proportion of flagged respondents for two versions of an attention check item (instructing them to select a specific response vs. leaving the item blank). Third, we propose a timestamp-based cluster analysis approach that identifies clusters of respondents who exhibit different speeding behaviors, in particular likely inattentive respondents. Fourth, we investigate the impact of inattentive respondents on univariate, regression, and experimental analyses.

Results
First, our findings show that the commitment pledge had no effect on the data quality measures. As indicated by the timestamp data, many respondents likely did not even read the commitment pledge text. Second, instructing respondents to leave the item blank instead of providing a specific response significantly increased the rate of flagged respondents (by 16.8 percentage points).
Third, the timestamp-based clustering approach efficiently identified clusters of likely inattentive respondents and outperformed a related method, while providing additional insights into speeding behavior throughout the questionnaire. Fourth, we show that inattentive respondents can have substantial impacts on substantive analyses.

Added Value
The results of our study may guide researchers who want to prevent or detect inattentive responding in their data. Our findings show that attention checks should be used with caution. We show that paradata-based detection techniques provide a viable alternative while placing no additional burden on respondents.
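The general idea of a timestamp-based cluster analysis can be illustrated with a minimal one-dimensional k-means over respondents' mean per-item response times: a distinctly fast cluster suggests likely inattentive responding. This is a sketch under stated assumptions (cluster count, input data, and the function itself are illustrative), not the authors' actual method:

```python
import random

def kmeans_1d(values, k=2, iters=50, seed=0):
    """Tiny 1-D k-means: an illustrative stand-in for a
    timestamp-based cluster analysis of response speeds."""
    rng = random.Random(seed)
    centers = rng.sample(values, k)  # random initial cluster centers
    for _ in range(iters):
        # Assign each value to its nearest center
        clusters = [[] for _ in range(k)]
        for v in values:
            nearest = min(range(k), key=lambda i: abs(v - centers[i]))
            clusters[nearest].append(v)
        # Recompute centers as cluster means (keep old center if empty)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return sorted(centers)

# Hypothetical mean seconds-per-item for eight respondents; the
# low-valued cluster flags candidates for inattentive responding.
times = [1.2, 1.5, 1.1, 6.0, 5.5, 7.2, 6.8, 1.3]
print(kmeans_1d(times))  # two well-separated centers, fast vs. slow
```

Clustering on timestamps rather than on answers alone is what makes this a paradata-based technique: it imposes no extra task on respondents, in contrast to an attention check item.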