GOR 26 - Annual Conference & Workshops
Annual Conference - Rheinische Hochschule Cologne, Campus Vogelsanger Straße
26 - 27 February 2026
GOR Workshops - GESIS - Leibniz-Institut für Sozialwissenschaften in Cologne
25 February 2026
Conference Agenda
Overview and details of the sessions of this conference.
Session Overview
12.1: Survey recruitment
Presentations
How Recruitment Channels Shape Data Quality: Evidence From A Multi-Source Panel
GESIS Leibniz Institute for the Social Sciences, Germany

Relevance & Research Question
Declining response rates in traditional probability-based surveys have prompted researchers and survey practitioners to increasingly explore alternative recruitment strategies, such as recruitment via Social Networking Sites (SNS) and piggybacking (i.e., re-using respondents from established surveys). While these strategies offer faster and more cost-efficient access to respondents, they also raise questions about the quality of the resulting data. SNS recruitment may threaten response quality through differing respondent motivations and an increased risk of satisficing. Piggybacked samples, on the other hand, may benefit from respondents’ experience and commitment but could suffer from conditioning effects. This research provides a comparative assessment of response quality across different recruitment strategies.

Methods & Data
Results will be presented at the conference in February. Preliminary analyses reveal notable sociodemographic differences (e.g., in age and education) across recruitment groups, pointing to potential disparities in response behaviors and data quality.

Added Value
Altogether, this study provides one of the first comparative assessments of response quality across recruitment strategies increasingly used in survey practice. By controlling for differences in sample composition, we disentangle compositional from recruitment-driven effects, offering insights into how integrated recruitment designs affect data quality.

Looks great, responds poorly: lessons from ten years of invitation letter experiments
Statistics Netherlands, The Netherlands

Relevance & Research Question
Our research combines qualitative pre-testing with large-scale field experiments. After many rounds of testing, a standard letter was designed that performed consistently well, until three new experiments challenged our assumptions. We examined (1) the effect of adding a QR code for easier access, (2) a shorter version of the letter, and (3) the response to a refreshed, more visually appealing layout. Each experiment used a fresh, representative sample and a corresponding control group.

Results
Adding a QR code had no significant effect (no code: 35.1% vs. QR code: 34.8%, n.s.).

Added Value