How Recruitment Channels Shape Data Quality: Evidence From A Multi-Source Panel
Fabienne Kraemer, Jessica Daikeler, Barbara Felderer, Barbara Binder
GESIS Leibniz Institute for the Social Sciences, Germany
Relevance & Research Question
Declining response rates in traditional probability-based surveys have prompted researchers and survey practitioners to increasingly explore alternative recruitment strategies, such as recruitment via Social Networking Sites (SNS) and piggybacking on established surveys (i.e., re-using their respondents). While these strategies offer faster and more cost-efficient access to respondents, they also raise questions about the quality of the resulting data. SNS recruitment may threaten response quality through different participation motives and an increased risk of satisficing. Piggybacked samples, on the other hand, may benefit from respondents' experience and commitment but could suffer from conditioning effects. This research provides a comparative assessment of response quality across different recruitment strategies.
Methods & Data
We examine response quality across different recruitment strategies using data from the newly established multi-source panel study GESIS Panel.dbd in Germany. The GESIS Panel.dbd combines several recruitment approaches, sampling respondents through a) SNS operated by Meta, b) piggybacking on the German General Social Survey (ALLBUS), the German Longitudinal Election Study (GLES), and the GESIS Panel.pop, and c) traditional probabilistic sampling from population registers. To assess response quality, we analyze a range of indicators, such as item nonresponse, break-off rates, straightlining in item batteries, and response times. Our analyses control for sociodemographic composition across the recruitment groups to disentangle effects of the recruitment mode from compositional differences.
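As an illustration only (not part of the study), response-quality indicators such as item nonresponse and straightlining can be computed per respondent and compared across recruitment groups while adjusting for sociodemographics. The column names, input file, and battery items below are hypothetical; the abstract does not describe the data layout of the GESIS Panel.dbd.

```python
# Minimal sketch: two response-quality indicators plus a composition-adjusted
# group comparison. All variable names (q1..q5, age, education, recruit_group)
# and the input file are hypothetical assumptions.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("panel_wave1.csv")          # hypothetical input file
battery = ["q1", "q2", "q3", "q4", "q5"]     # one grid/item battery

# Item nonresponse: share of missing answers per respondent
df["item_nr"] = df[battery].isna().mean(axis=1)

# Straightlining: identical answers to every item in the battery
df["straightline"] = (df[battery].nunique(axis=1) == 1).astype(int)

# Compare recruitment groups while controlling for sociodemographics
model = smf.logit(
    "straightline ~ C(recruit_group) + age + C(education)", data=df
).fit()
print(model.summary())
```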
Results
Results will be presented at the conference in February. Preliminary analyses reveal notable sociodemographic differences (e.g., in age and education) across recruitment groups, pointing to potential disparities in response behaviors and data quality.
Added Value
Altogether, this study provides one of the first comparative assessments of response quality across recruitment strategies increasingly used in survey practice. By controlling for differences in sample composition, we disentangle compositional from recruitment-driven effects, offering insights into how integrated recruitment designs affect data quality.
Looks great, responds poorly: lessons from ten years of invitation letter experiments
Jelmer de Groot, Ryanne Francot
Statistics Netherlands, The Netherlands
Relevance & Research Question
What if the success of your survey depended on a single piece of paper? As survey researchers, we are now competing harder than ever for people's attention and time. With the rise of self-administered online surveys, interviewers play a smaller role in motivating participation. For these online studies in the Netherlands, households can only be invited by traditional mail, making the invitation letter our sole opportunity to connect. Its wording, tone, access options, and layout can decide whether someone visits the link or ignores the request. Over the past decade, Statistics Netherlands has continuously refined its invitation strategy in search of the perfect letter for a diverse population.
Methods & Data
Our research combines qualitative pre-testing with large-scale field experiments. After many rounds of testing, a standard letter was designed that performs consistently well — until three new experiments challenged our assumptions. We examined (1) the effect of adding a QR code for easier access, (2) a shorter version of the letter, and (3) the response to a refreshed, more visually appealing layout. Each experiment used a fresh, representative sample and a corresponding control group.
Results
Adding a QR code had no significant effect (no code: 35.1% vs. QR code: 34.8%, n.s.). However, the redesigned "fancy" letter, praised in qualitative pre-testing, led to a significant drop in response (35.4% → 32.2%, p = 0.0003). The shortened letter, by contrast, increased participation (23.7% → 26.8%, p < 0.001), but it also changed the way respondents answered the questions, resulting in fewer short trips being reported in the travel survey.
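As a purely illustrative aside, the kind of response-rate comparison reported above can be checked with a two-proportion z-test. The group sizes used here are invented for the sketch; the abstract reports only the percentages.

```python
# Minimal sketch: testing the difference between two response rates,
# e.g. standard vs. shortened letter. Sample sizes are hypothetical.
from statsmodels.stats.proportion import proportions_ztest

n_control, n_short = 10_000, 10_000            # assumed group sizes
resp_control = round(0.237 * n_control)        # 23.7% response, standard letter
resp_short = round(0.268 * n_short)            # 26.8% response, shortened letter

stat, pval = proportions_ztest(
    count=[resp_short, resp_control],
    nobs=[n_short, n_control],
)
print(f"z = {stat:.2f}, p = {pval:.4f}")
```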
Added Value
Our findings reveal how intuitive design choices can have unintended consequences. While respondents claim to prefer modern, appealing letters with QR codes, their actual behavior tells a different story, and design changes may subtly affect measurement. This presentation offers ten years of lessons, and a few humbling surprises, from the ongoing search for the holy grail of survey invitations.
Using Text Messages (SMS) for Representative Sample Recruitment in Online Research
Ioannis Andreadis
Aristotle University of Thessaloniki, Greece
Relevance & Research Question
Amid declining survey response rates, innovative methods are needed to gather representative data. This presentation evaluates a novel "push-to-web" methodology. The core research question is: Can Short Message Service (SMS) text messages effectively replace traditional contact modes to recruit a representative, probability-based sample for mobile-friendly web surveys?
Methods & Data
This methodology has been tested in many Greek surveys since the 2019 Hellenic National Election Voter Study (ELNES 2019). The target population is Greek citizens, leveraging the country's near-universal (99%) mobile phone penetration. A probability-based sample is generated using a Random Digit Dialing (RDD) approach for mobile numbers. Selected individuals receive a sequence of text messages: a pre-notification, an invitation with a unique survey URL, and reminders. The web survey is optimized for a seamless mobile experience.
Results
The SMS "push-to-web" approach is a feasible and effective method for large-scale data collection. Although the final response rate lies between 6% and roughly 10%, the high population coverage of mobile phones ensures a comprehensive and representative sampling frame. The study demonstrates the viability of conducting large-scale, probability-based web surveys using SMS as the primary contact mode.
Added Value
This study provides a proven, practical framework for using SMS in survey recruitment. It highlights the method's potential for researchers to recruit representative samples in populations with high mobile phone access, even without a traditional sampling frame. This presentation will discuss the implementation, advantages, and key considerations for broader application, such as cross-national differences in mobile costs and legal restrictions.
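As an illustrative sketch (not the author's implementation), a mobile RDD draw combined with the SMS contact sequence described in the Methods section could look as follows. The number format, the simplified "69" mobile prefix, and the day offsets of the contact schedule are assumptions for the example, not details reported in the abstract.

```python
# Minimal sketch of a mobile RDD draw for an SMS push-to-web invitation.
# Frame construction (working number blocks, screening, legal checks) is
# simplified away; Greek mobile numbers are assumed to be "69" + 8 digits.
import random

def draw_mobile_rdd_sample(n: int, seed: int = 42) -> list[str]:
    """Draw n random candidate mobile numbers in the +30 69x xxx xxxx range."""
    rng = random.Random(seed)
    return [
        "+3069" + "".join(str(rng.randint(0, 9)) for _ in range(8))
        for _ in range(n)
    ]

# Contact schedule: the steps come from the abstract, the timing is illustrative.
contact_sequence = [
    ("day 0", "pre-notification SMS"),
    ("day 2", "invitation SMS with unique survey URL"),
    ("day 5", "first reminder"),
    ("day 9", "final reminder"),
]

for number in draw_mobile_rdd_sample(3):
    for day, message in contact_sequence:
        print(f"{number}: send {message} on {day}")
```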