Pay to Stay? Examining the Long-Term Impact of Initial Recruitment Incentives on Panel Attrition
Almuth Lietz, Jannes Jacobsen, Jonas Köhler, Madeleine Siegel, Jörg Dollmann, Sabrina J. Mayer
German Center for Integration and Migration Research, Germany
Relevance & Research Question
This study explores the long-term effects of incentives provided during the recruitment wave of a panel study, focusing on their impact on panel consent and subsequent dropout risks. Specifically, it examines how the amount and conditionality of incentives influence participation decisions and whether transitioning from a prepaid incentive in the recruitment wave to a postpaid incentive in subsequent waves affects dropout rates. These questions are critical for developing sustainable strategies for panel recruitment and retention, particularly in the context of probability-based online access panels.
Methods & Data
The analysis is based on data from the DeZIM.panel, a probability-based online access panel in Germany. Logistic regression models were used to examine panel consent and participation behaviours across different experimental conditions. The dataset includes responses from 9,168 participants in the recruitment wave and longitudinal data spanning 70,926 person-years across subsequent waves. The experimental design varied incentive types (prepaid vs. postpaid) and amounts (€5 vs. €10), enabling a detailed assessment of their effects on respondent behaviour.
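The following is a minimal sketch, not the DeZIM.panel analysis code, of the kind of logistic regression described above: panel consent regressed on the experimental incentive conditions. The data are simulated and the variable names (consent, prepaid, amount_10) are hypothetical.

```python
# Minimal sketch (simulated data, hypothetical variable names) of a logistic
# regression of panel consent on incentive type and amount.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 2000
prepaid = rng.integers(0, 2, n)      # 1 = unconditional (prepaid) incentive, 0 = postpaid
amount_10 = rng.integers(0, 2, n)    # 1 = EUR 10, 0 = EUR 5

# Simulated consent probabilities: prepaid slightly lowers consent, amount has no effect.
p = 1 / (1 + np.exp(-(0.3 - 0.4 * prepaid + 0.0 * amount_10)))
consent = rng.binomial(1, p)
df = pd.DataFrame({"consent": consent, "prepaid": prepaid, "amount_10": amount_10})

# Logit model of consent on the two experimental factors.
model = smf.logit("consent ~ prepaid + amount_10", data=df).fit(disp=False)
print(model.summary())
```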
Results
The findings reveal that prepaid incentives significantly reduce panel consent compared to postpaid incentives. However, the amount of the incentive (€5 vs. €10) does not significantly influence consent rates. Long-term analyses show no substantial effects of either the incentive type or amount on participation rates across subsequent panel waves. Furthermore, switching from a prepaid incentive in the recruitment wave to a postpaid incentive in the first panel wave does not increase dropout risks in future waves. These results suggest that higher or unconditional incentives do not necessarily yield higher response rates or long-term participation in panel studies.
Added Value
This study contributes to the literature on survey methodology by challenging the common assumption that higher or unconditional incentives automatically enhance response rates. The findings emphasize the importance of designing incentive strategies that balance short-term participation gains with the long-term sustainability of panel studies. By providing empirical evidence from a probability-based online panel in Germany, the study highlights the potential for more cost-effective and nuanced incentive policies in maintaining panel participation over time.
Socially Desirable Responding in Panel Studies – How Does Repeated Interviewing Affect Responses to Sensitive Questions?
Fabienne Kraemer
GESIS Leibniz Institute for the Social Sciences, Germany
Relevance & Research Question Social desirability (SD-) bias (the tendency to report socially desirable opinions and behaviors instead of revealing true ones) is a widely known threat to the validity of self-reports. Previous studies investigating socially desirable responding (SDR) in a longitudinal context provide mixed evidence on whether SD-bias increases or decreases with repeated interviewing and how these changes affect response quality in later waves. However, most studies were non-experimental and only suggestive of the mechanisms of changes in SD-bias over time. Methods & Data This study investigates SDR in panel studies using a longitudinal survey experiment comprising six waves. The experiment manipulates the frequency of answering identical sensitive questions (target questions) and assigned respondents to one of three groups: One group received the target questions in each wave, the second group received the target questions in the last three waves, and the control group received the target questions only in the last wave of the study. The experiment was conducted within a German non-probability (n = 1,946) and a probability-based panel (n = 4,660). The analysis focusses on between- and within-group comparisons to investigate changes in answer refusal and responses to different sensitive measures. To further examine the underlying mechanisms of change, I conduct moderator and mediator analyses on the effects of respondents’ privacy perceptions and trust towards the survey (sponsor). Results First results show a decrease of answer refusal and SDR with repeated interviewing for most of the analyzed sensitive measures. However, these decreases were non-significant for both between-group comparisons and comparisons over time. Added Value Altogether, this study provides experimental evidence on the impact of repeated interviewing on changes in SD-bias and contributes to a deeper understanding of the underlying mechanisms by examining topic-specific vs. general survey experience and incorporating measures on privacy perceptions and trust towards the survey (sponsor).
Using Apps to Motivate Respondents: Experiences from an Experiment in a Refugee Panel Study
Florian Heinritz1, Michael Ruland2, Thom Weiß2, Katharina Sandbrink2, Jutta von Maurice1
1Leibniz Institute for Educational Trajectories, Germany; 2infas Institute for Applied Social Science, Germany
Relevance & Research Question
Whether in online panels or traditional panel studies, participants' commitment to a longitudinal study is essential for panel stability. In many panels, respondents are therefore contacted by letter at regular intervals with information about the study. With the growing importance of smartphones, tailored apps that use smartphones as a channel to communicate with respondents can help to keep in touch by sending notifications and messages about the study. In this context, the question arises whether it makes sense to contact respondents more or less frequently via the app and how respondents react to notifications.
Methods & Data
In the study “Refugees in the German Educational System (ReGES)”, an experiment was conducted with the “my infas” app that varied the frequency of contact via the app. One experimental group of respondents was contacted 6 times, while the rest were contacted 12 times over the same period. The data from this experiment are used to cluster participation behavior more precisely using sequence analysis.
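As an illustration only, not the study's actual pipeline, the following sketch clusters hypothetical message-reading sequences with a simple sequence analysis: pairwise Hamming distances between per-contact reading states, followed by hierarchical clustering into four groups. The data are simulated, and the published analysis may use different sequence states or optimal-matching costs.

```python
# Minimal sketch (simulated data) of sequence-based clustering of
# message-reading behaviour in an app experiment.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(3)
# Hypothetical reading sequences: 271 app users observed over 12 contacts,
# coded 1 = message read, 0 = message unread.
sequences = rng.binomial(1, 0.3, size=(271, 12))

# Pairwise Hamming distances between sequences, then Ward clustering.
dist = pdist(sequences, metric="hamming")
tree = linkage(dist, method="ward")
clusters = fcluster(tree, t=4, criterion="maxclust")  # four clusters, as in the abstract

for c in sorted(set(clusters)):
    print(f"Cluster {c}: n={np.sum(clusters == c)}")
```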
Results
Of the 2,740 respondents to whom we sent messages via the app as part of the experiment, only 271 read at least one message. Due to this low number of participants, the potential for analysis is limited. Nevertheless, the experiment offers some insights into how respondents react to messages. Four clusters were identified: the One-Time Clickers, the Curious, the Awakened, and the Interested. Looking not only at whether the messages and results were read, but also at when they were read, we see that many respondents read the messages very late (on average 28 days after receiving them).
Added Value
As many respondents react to messages very late or not at all, the analyses suggest that using an app in panel studies is not a guaranteed success. Potential issues such as respondents turning off push notifications or uninstalling the app need to be considered, since time-critical messages sent via apps may not reach all respondents.