Conference Agenda

Overview and details of the sessions of this conference.

Session Overview
Session
10.1: Reluctant Respondents and Item Nonresponse
Time: Wednesday, 02/Apr/2025, 12:00pm - 1:15pm

Session Chair: Indira Sen, University of Mannheim, Germany
Location: Hörsaal A


Presentations

Encouraging revision of ‘Don’t know’ responses: Comparing delayed and dynamic feedback in Web surveys

Anke Metzler, Stella Czak, Hannah Schwärzel, Marek Fuchs

Technical University of Darmstadt, Germany

Relevance & Research Question

In Web surveys, the absence of interviewers increases the risk of item nonresponse. Conventional wisdom therefore suggests refraining from don't know (DK) options, as they may encourage respondents to satisfice (Krosnick & Fabrigar, 1997). This, however, may lead to situations in which respondents who cannot generate a valid answer randomly select one of the substantive response categories.

Previous studies indicate that interactive feedback can effectively improve response quality (Zhang, 2013; Al Baghal & Lynn, 2015). Interactive feedback can be provided either (1) after the questionnaire page is submitted (delayed feedback) or (2) immediately after respondents choose DK, before they submit the page (dynamic feedback).
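
In implementation terms, the two timings differ only in which browser event triggers the follow-up. The sketch below illustrates this distinction in TypeScript; all element IDs, the showFeedback helper, and the follow-up text are hypothetical, as the abstract does not describe the actual implementation.

```typescript
// Minimal sketch of the two feedback timings. All element IDs, the
// showFeedback helper, and the follow-up text are hypothetical; in the
// actual study each respondent would see only one of the two variants.

const FOLLOW_UP = "You selected 'Don't know'. Here is a clarification of the question ...";

function showFeedback(message: string): void {
  // In a real survey system this would render a prompt next to the question.
  const box = document.getElementById("feedback-box");
  if (box) box.textContent = message;
}

const dkOption = document.getElementById("dk-option") as HTMLInputElement | null;
const form = document.getElementById("survey-form") as HTMLFormElement | null;

if (dkOption && form) {
  // (2) Dynamic feedback: triggered immediately when DK is selected,
  // while the respondent is still on the questionnaire page.
  dkOption.addEventListener("change", () => {
    if (dkOption.checked) showFeedback(FOLLOW_UP);
  });

  // (1) Delayed feedback: triggered only when the page is submitted with
  // DK selected; the first submission is held back so the respondent can
  // revise the answer, the second one goes through.
  let alreadyPrompted = false;
  form.addEventListener("submit", (event) => {
    if (dkOption.checked && !alreadyPrompted) {
      event.preventDefault();
      alreadyPrompted = true;
      showFeedback(FOLLOW_UP);
    }
  });
}
```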

In this study, we apply interactive feedback to difficult single-choice questions that offer an explicit DK option. If respondents select DK, we follow up with either delayed or dynamic feedback to clarify the question content. We assume that dynamic feedback is more effective in reducing DK, since the feedback is provided while respondents are still engaged in the response process.

Methods & Data

In a Web survey conducted with a German online access panel in November 2024 (n = 2,000), we implemented a between-subjects experiment. In two single-choice questions, the effectiveness of dynamic feedback (EG1) and delayed feedback (EG2) was evaluated against a control group receiving no feedback (CG).
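
A minimal sketch of such a between-subjects assignment is shown below; the three group labels come from the abstract, while the equal-probability randomization is an assumption, since the abstract does not describe the assignment logic.

```typescript
// Sketch of a between-subjects assignment to the three reported conditions.
// Equal-probability randomization is an assumption; the abstract only
// names the groups (EG1: dynamic, EG2: delayed, CG: no feedback).

type Condition = "EG1_dynamic" | "EG2_delayed" | "CG_no_feedback";

function assignCondition(): Condition {
  const conditions: Condition[] = ["EG1_dynamic", "EG2_delayed", "CG_no_feedback"];
  return conditions[Math.floor(Math.random() * conditions.length)];
}

// Between-subjects: each respondent is assigned once and keeps the same
// condition for both experimental questions.
const condition = assignCondition();
```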

Results

Preliminary results indicate that both feedback types reduce the percentage of final DK responses. In the first experiment, positioned early in the questionnaire, delayed feedback appears to be more effective in reducing DK than dynamic feedback. In contrast, in the second experiment, placed later in the questionnaire, dynamic feedback exhibits stronger effects. These findings suggest that delayed feedback may be more effective when respondents are highly motivated, whereas dynamic feedback seems to reduce DK to a greater extent as respondent motivation decreases.

Added Value

This study provides insights for survey researchers seeking to minimize DK answers and improve data quality in Web surveys. By examining the distinct effects of dynamic versus delayed feedback on the revision of DK answers, it helps clarify how the timing of feedback influences respondent behavior.



Zooming in: Measuring Respondents' Reactance and Receptivity to Assess the Effects of Error-Reducing Strategies in Web Surveys

Stella Czak, Hannah Schwärzel, Anke Metzler, Marek Fuchs

Technical University of Darmstadt, Germany

Relevance & Research Question

In Web surveys, respondents often exhibit signs of satisficing behavior, such as speeding, non-differentiation, or item nonresponse. A common strategy to reduce such behavior uses prompts and interactive feedback to respondents (e.g., Al Baghal & Lynn, 2015; Kunz & Fuchs, 2019).

However, the effectiveness of prompts is sometimes limited. Some respondents seem to react with an optimizing tendency while others tend to ignore such prompts. This raises concerns that prompts do not reach all respondents to the same extent.

In this study, we assess the interaction of two personality traits with the effectiveness of prompts concerning non-differentiation, item nonresponse, and other types of satisficing behavior. It is assumed that the respondents' level of reactance, which describes a person's inner resistance to restrictions on their freedom of action, prevents respondents from improving their answering behavior when exposed to a prompt.

By contrast, a person's receptivity describes the likelihood of a positive change in behavior due to interventions. It is assumed that respondents with higher levels of receptivity perceive prompts as a helpful resource and change their response behavior for the better.

Methods & Data

In a web survey on "AI and digitalization", conducted with a general population sample drawn from a non-probability online access panel (n = 2,000), a series of experiments was implemented to evaluate prompts targeting various types of satisficing behavior, such as non-differentiation and item nonresponse.

In addition, two validated German psychometric scales were used to measure how responsive respondents are to interventions: one on reactance (Herzberg, 2002) and one on receptivity to instructional feedback (Bahr et al., 2024).
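
As a generic illustration of how Likert-type scales of this kind are commonly scored (a mean score with reverse-keyed items; this is not the published scoring procedure of either scale, and the item counts below are placeholders):

```typescript
// Generic mean-score sketch for a Likert-type scale with reverse-keyed
// items. The actual scoring rules of the Herzberg (2002) and Bahr et al.
// (2024) scales are not given in the abstract; everything here is a placeholder.

function scaleScore(responses: number[], reversedItems: number[] = [], max = 5): number {
  const scored = responses.map((v, i) =>
    reversedItems.includes(i) ? max + 1 - v : v
  );
  return scored.reduce((sum, v) => sum + v, 0) / scored.length;
}

// e.g. a 4-item scale on a 1-5 response format with item 2 (index 1) reverse-keyed:
// scaleScore([4, 2, 5, 3], [1]); // -> mean of [4, 4, 5, 3] = 4.0
```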

Results

Fieldwork is still underway. In the analysis, we aim to test whether these personality traits can explain the differential effectiveness of the various established satisficing prompts.

Added Value

The study contributes to a better understanding of the differential effectiveness of satisficing prompts. Based on the results, we aim to tailor the frequency, presentation, and wording of prompts to respondents' psychometric profiles. We assume that an improved respondent experience may foster the effectiveness of interventions and, ultimately, data quality.



Understanding item-nonresponse in open questions with requests for voice responses

Camilla Salvatore1, Jan Karem Höhne2,3

1Utrecht University, The Netherlands; 2German Center for Higher Education Research and Science Studies (DZHW); 3Leibniz University Hannover

Relevance & Research Question
Conducting smartphone surveys offers flexibility in collecting various types of responses. Among these response modalities, voice responses stand out for their potential to facilitate deeper respondent engagement and expression. However, high item-nonresponse rates pose significant challenges to their large-scale use. Our research question is: which socio-demographic characteristics, technological skills, and survey-related aspects are associated with item nonresponse, and to what extent?
Methods & Data
In this web survey study, we use data collected in the Forsa Omninet Panel in Germany in November 2021. Forsa drew a cross-quota sample from their online panel based on age (young, middle, and old) and gender (female and male). In addition, they drew quotas on education (low, medium, and high). The quotas were calculated based on the German Microcensus, which served as a population benchmark. For recording respondents' voice responses, we used the open-source "SurveyVoice" (SVoice) tool, which was embedded in the Forsa web survey system. The sample includes data from 501 respondents. Overall, we investigate data from eight open questions with requests for voice responses.
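
The SVoice internals are not described in the abstract. As a rough, hypothetical sketch, browser-side voice capture for an open question can be realized with the standard MediaStream/MediaRecorder Web APIs along these lines (the uploadToSurveyServer call is a placeholder):

```typescript
// Hypothetical sketch of browser-side voice capture for an open question,
// using the standard MediaStream/MediaRecorder Web APIs. This is not the
// actual SVoice implementation; uploadToSurveyServer is a placeholder.

async function recordVoiceAnswer(durationMs: number): Promise<Blob> {
  // Ask for microphone access; a refusal surfaces as a rejected promise
  // (one plausible technical source of item nonresponse).
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });

  const recorder = new MediaRecorder(stream);
  const chunks: Blob[] = [];
  recorder.ondataavailable = (e) => chunks.push(e.data);

  const done = new Promise<Blob>((resolve) => {
    recorder.onstop = () => {
      stream.getTracks().forEach((t) => t.stop()); // release the microphone
      resolve(new Blob(chunks, { type: recorder.mimeType }));
    };
  });

  recorder.start();
  setTimeout(() => recorder.stop(), durationMs); // cap the recording length
  return done;
}

// Usage: record up to 60 seconds, then hand the audio blob to the survey back end.
// recordVoiceAnswer(60_000).then((audio) => uploadToSurveyServer(audio));
```
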
Results
Two main respondent groups emerged from the data: voice skippers (about 30% of respondents, who skipped all voice questions but answered closed questions) and voice engagers (those who responded to at least one open voice question). Male respondents were more likely to be voice skippers. In contrast, respondents who expressed interest in the survey topic were less likely to be voice skippers. Among voice engagers, response rates were higher among those with good smartphone skills and positive survey perceptions (e.g., evaluating the survey as easy, not long, not intimate, and interesting). Response rates and response lengths also varied by question topic.
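
As a minimal sketch of how the two groups described above could be operationalized from the response data (all field names are hypothetical):

```typescript
// Sketch of the skipper/engager classification described above: a "voice
// skipper" skipped all eight voice questions, a "voice engager" answered
// at least one. All field names are hypothetical.

interface Respondent {
  id: string;
  voiceAnswers: (Blob | null)[]; // one slot per voice question; null = skipped
}

function classify(r: Respondent): "skipper" | "engager" {
  return r.voiceAnswers.every((a) => a === null) ? "skipper" : "engager";
}

function skipperRate(sample: Respondent[]): number {
  const skippers = sample.filter((r) => classify(r) === "skipper").length;
  return skippers / sample.length; // about 0.30 in the reported data
}
```
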
Added Value
The study presents key insights into the characteristics associated with item nonresponse in open voice questions. It provides an important starting point for future studies that aim to enrich web survey data collection with rich respondent narrations across various respondent groups.



 