Conference Agenda

Overview and details of the sessions of this conference. Please select a date or location to show only the sessions held on that day or at that location. Please select a single session for a detailed view (with abstracts and downloads, if available).

Session Overview
Session 3.2: Respondent Engagement and Attrition
Time: Tuesday, 01/Apr/2025, 12:00pm – 1:15pm

Session Chair: Ellen Laupper, Swiss Federal University for Vocational Education and Training SFUVET, Switzerland
Location: Hörsaal B


Presentations

Attrition patterns and warning signs in a long-term, high-frequency probability online panel

Tobias Rettig, Anne Balz

University of Mannheim, Germany

Relevance & Research Question

All longitudinal and panel studies are confronted with the gradual loss of respondents, i.e., panel attrition. Over time, this leads to lower respondent numbers, a loss of statistical power, and, if the attrition is nonrandom, the systematic loss of specific respondents and thus biases in the remaining sample. Using data from a long-term, high-frequency panel, we investigate (1) which respondents are disproportionately lost over time, so that they may be specifically targeted or oversampled, or the survey adjusted to facilitate their participation, (2) what the warning signs of attrition are, and (3) which survey features are associated with higher attrition and thus present opportunities for optimization.

Methods & Data

Using data from a probability online panel of the German population, we analyze respondents’ participation in over 70 panel waves spanning 12 years. This gives us the rare opportunity to analyze rich data from a high-frequency, long-running panel. Descriptively, we observe how the panel composition changes over time and which respondents are disproportionately lost. Using a survival model, we investigate risk factors for attrition at the respondent level, in participation patterns, and in survey features.
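
As a rough illustration of the survival-model approach described above, the following Python sketch fits a Cox proportional hazards model with the lifelines package. The file name, column names, and covariates are hypothetical placeholders, not the authors' actual data or model specification.

import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical one-row-per-respondent dataset: number of waves survived
# before dropout, an attrition indicator, and candidate risk factors.
df = pd.read_csv("panel_participation.csv")

cph = CoxPHFitter()
cph.fit(
    df[["waves_survived", "attrited", "age", "education", "full_time",
        "n_breakoffs", "n_missed_waves", "late_fieldwork_share"]],
    duration_col="waves_survived",
    event_col="attrited",
)
cph.print_summary()  # hazard ratios > 1 indicate higher attrition risk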

Results

We observe high attrition over the first panel waves and a slower but steady loss of respondents in the long term. Over time, the sample becomes more highly educated, more likely to be married, and slightly more male. Attrition risk is lower for younger, more highly educated, and full-time employed respondents. Higher attrition risk is associated with patterns of infrequent participation (breakoffs, missed panel waves, participating late in the field period, item nonresponse), but not with interrupting the survey to continue later. Higher attrition risk is also associated with longer and poorly rated survey waves.

Added Value

For panel practitioners, understanding attrition patterns is important for accurately predicting how many respondents to expect in future waves, deciding when and how many new respondents to recruit, and determining which groups should be specifically targeted or oversampled. We identify several groups at higher risk of attrition, early warning signs that may be used to counteract attrition with targeted interventions, and opportunities to optimize surveys for continued participation.



Do we need to include offliners in self-administered general population surveys? An analysis of 150 substantive variables in a probability-based mixed-mode panel survey in Germany

Lena Rembser, Tobias Gummer

GESIS – Leibniz Institute for the Social Sciences, Germany

Relevance & Research Question: Due to concerns about bias stemming from the undercoverage of non-internet users, most probability-based surveys try to include offliners (i.e., respondents who are not able or willing to participate online). Previous research shows that including offliners increases accuracy for some socio-demographic characteristics but not for others. However, these studies did not include substantive variables. We aim to address this research gap by answering the following research question: Does the inclusion of offliners in a probability-based panel impact measures of substantive variables?

Methods & Data: We use data from the GESIS Panel.pop, a probability-based self-administered mixed-mode panel of the German general population, surveyed via web and mail mode. We analyze around 150 substantive variables from six core studies collected since 2014, which we compare between the whole sample and a sample of only onliners (i.e., without offliners). To assess the impact of including offliners, we compute differences between both samples for each substantive variable and compute average absolute relative bias (AARB) for each variable and by (sub-)topic. In addition, we re-run these analyses for different definitions of onliners and offliners and for different recruitment cohorts.
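
To make the bias measure concrete, here is a minimal Python sketch of one common way to compute an average absolute relative bias for a categorical variable, treating the full mixed-mode sample as the benchmark. The function and variable names are illustrative; the authors' exact formula may differ.

import numpy as np
import pandas as pd

def aarb(full: pd.Series, onliners: pd.Series) -> float:
    # Category shares in the benchmark (full sample) and the subsample.
    p_full = full.value_counts(normalize=True)
    p_sub = onliners.value_counts(normalize=True).reindex(p_full.index, fill_value=0.0)
    # Mean absolute relative deviation across response categories.
    return float(np.mean(np.abs((p_sub - p_full) / p_full)))

# Hypothetical usage with a data frame df that contains a mode indicator:
# onliners = df[df["mode"] == "web"]
# print(aarb(df["life_satisfaction"], onliners["life_satisfaction"]))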

Results: Comparing the online-only subsample with the complete mixed-mode sample that includes offliners shows statistically significant average absolute relative biases for all six topics (subjective wellbeing; social and political participation; environmental attitudes and behavior; personality and personal values; media usage; work and leisure) and for socio-demographic variables. These findings show that univariate estimates for a wide variety of topics differ depending on whether or not offliners are included.

Added Value: Our study speaks to the practical challenge of deciding whether to include the offline population in surveys via a costly and labor-intensive mail mode. We go beyond previous research by examining a wide range of substantive variables, which enables us to draw conclusions about the topic areas in which including offliners is more warranted than in others. We expect our findings to be of relevance for survey practitioners and substantive researchers alike.



Reducing Political Science Surveys’ Attrition Bias Without Limiting Substantive Research: Potentials of Adaptive Survey Design and Missing Data Strategies

Saskia Bartholomäus

GESIS - Leibniz Institute for the Social Sciences, Germany

Relevance & Research Question

Adaptive survey designs that use respondents' topic interests can reduce the overrepresentation of politically engaged respondents in political science surveys. Politically disengaged respondents would receive a questionnaire combining political and non-political modules to boost participation, while politically engaged respondents would receive a purely political questionnaire. However, assigning respondents based on political interest may distort research if variables of interest correlate with political interest. Instead, researchers could assign only a certain percentage of respondents with low political interest to tailored questionnaires and use missing data strategies based on the main survey data to correct biased estimates. This paper assesses (1) whether adaptive designs that rely on content variation bias substantive research and (2) whether missing data procedures mitigate this bias.

Methods & Data

Using the probability-based GESIS Panel, I simulate several datasets in which 50% to 100% of politically disengaged respondents’ responses to various variables are set to missing. I then run several regression models on the original and simulated datasets to answer the first research question. To answer the second, I re-estimate the regression models using inverse probability weights and multiple imputation.
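
As a hedged illustration of the simulation and correction logic, the following Python sketch sets a share of politically disengaged respondents' answers to missing and re-estimates a regression with inverse probability weights derived from the known assignment probability. All file names, variables, and the 70% share are illustrative assumptions, not the study's actual setup.

import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(42)
df = pd.read_csv("gesis_panel_extract.csv")  # hypothetical extract

# Simulated adaptive design: 70% of politically disengaged respondents
# are routed away from the political module, so their answers go missing.
low_interest = df["political_interest"] <= 2
assigned_away = low_interest & (rng.random(len(df)) < 0.7)
df.loc[assigned_away, "political_trust"] = np.nan

# Inverse probability weights: observed low-interest respondents stand in
# for those routed away (observation probability 0.3).
df["ipw"] = np.where(low_interest, 1 / 0.3, 1.0)
observed = df.dropna(subset=["political_trust"])

model = sm.WLS(
    observed["political_trust"],
    sm.add_constant(observed[["age", "education"]]),
    weights=observed["ipw"],
).fit()
print(model.summary())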

Results

Preliminary results suggest that adaptive survey designs that vary a questionnaire’s content can bias substantive research if more than 50% of respondents are randomly assigned to question modules that are supposed to be more interesting to them. However, assigning only a certain percentage of respondents and applying missing data strategies instead reduces bias in the estimates and yields correct confidence intervals.

Added Value

While adaptive survey designs with content variation may reduce attrition bias in political science surveys, researchers must be cautious if variables of interest correlate with political interest, as estimates may be biased. Assigning only a certain percentage of respondents to a question module that is supposed to be more interesting to them, combined with inverse probability weights or multiple imputation, mitigates the risk of biased estimates.



Quantifying Questionnaire Design: A Holistic Approach to Data Quality and User Engagement

Eva Wittmann, Cecile Carre

Ipsos

Relevance & Research Question

The market research industry continues to face quality issues, often prioritizing the identification and removal of "problematic" respondents. However, this approach overlooks a crucial aspect of data quality: adapting questionnaires to align with respondents' lifestyles and enhancing their appeal to high-quality panel members. By focusing on survey design optimization, we can potentially improve data integrity at its source, rather than relying solely on cleansing methods.

This paper presents a quantitative perspective on enhancing data quality through optimized questionnaire design. Measuring questionnaire design effectiveness is complex due to the multitude of elements influencing the respondent experience. While previous research has attempted to identify factors that make questionnaires difficult to answer within a single research type and/or topic, we still lack clear thresholds for what even a good respondent can answer without diminished response quality.

Our research addresses this gap by analyzing a comprehensive dataset from Ipsos' global operations over a multi-week period. We propose a questionnaire segmentation system, correlating design elements with Ipsos' quality indicators as a proxy for engagement. This approach allows us to identify measurable factors that minimize drops in respondent engagement and provide clearer recommendations for effective questionnaire structures. Importantly, we acknowledge the challenge of conducting research on research, given the significant influence of topic and interest levels on respondent behavior. Our methodology accounts for this variability, offering insights that are applicable across diverse research contexts.
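
As a sketch of what correlating design elements with quality indicators could look like in practice (the feature names and the dropout metric below are invented placeholders, not Ipsos' actual indicators):

import pandas as pd

surveys = pd.read_csv("survey_metadata.csv")  # one row per questionnaire

# Correlate questionnaire design features with an engagement proxy.
features = ["length_minutes", "n_grid_questions", "n_open_ends"]
print(surveys[features + ["dropout_rate"]].corr()["dropout_rate"])

# Simple segmentation: bucket questionnaires by length and compare the
# average engagement proxy per segment.
surveys["segment"] = pd.cut(surveys["length_minutes"], bins=[0, 10, 20, 60])
print(surveys.groupby("segment", observed=True)["dropout_rate"].mean())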

Methods & Data

A meta-study across Ipsos' global business.

Results

The research will be finalised in early 2025.

Added Value

By quantifying the impact of design elements on respondent engagement and data quality, this study aims to provide researchers with actionable strategies for creating more respondent-centric surveys. Our findings contribute to a more holistic understanding of questionnaire design, moving beyond the mere identification of problematic responses to proactively enhancing the overall survey experience.


