GOR 26 - Annual Conference & Workshops
Annual Conference - Rheinische Hochschule Cologne, Campus Vogelsanger Straße
26 - 27 February 2026
GOR Workshops - GESIS - Leibniz-Institut für Sozialwissenschaften in Cologne
25 February 2026
Conference Agenda
Overview and details of the sessions of this conference.
Session 5.2: Online panels I

Presentations
Comparing Probability, Opt-In, and Synthetic Panels: A Case Study from the Netherlands
¹Norstat, The Netherlands; ²Lifepanel

Relevance & Research Question: The growth of nonprobability online panels and the emergence of synthetic survey respondents have created new opportunities and uncertainties for social measurement. While probability samples remain the reference standard, opt-in and synthetic data sources offer faster fieldwork and lower cost but may introduce unknown biases. This study asks: how comparable are attitudes measured across a probability panel, an opt-in panel, and a synthetic dataset?

Methods & Data: Three parallel surveys (≈500 completes each) were administered using identical instruments. The questionnaire measured perceptions of the national situation, attitudes toward elections, and interest in sports. Analyses include demographic comparisons with CBS benchmarks, item nonresponse, variance structures, and inter-item correlations. Calibration experiments test post-stratification and raking on age, gender, education, and region to evaluate alignment potential.

Results: The probability panel demonstrates the expected demographic balance and serves as the comparative baseline. The opt-in panel aligns closely with the probability results after weighting, although unweighted data show an overrepresentation of younger, higher-education groups. Attitudinal means are largely consistent across the two empirical samples, with modest discrepancies in political trust and evaluations of the national direction. The synthetic dataset approximates mean values for several attitude items but exhibits compressed variance and weakened correlation patterns, indicating insufficient behavioral realism. Some synthetic respondents show inconsistent response structures not observed among human participants. Calibration improves demographic similarity but does not correct these structural limitations, suggesting that synthetic data are constrained more by model assumptions than by post-survey adjustment.

Added Value: This is one of the first empirical comparisons integrating probability, opt-in, and synthetic survey data within a single national framework. The study provides practical guidance on when synthetic respondents can complement empirical data (e.g., instrument testing) and where their limitations lie. It also clarifies the degree to which calibration can bridge differences between probability and nonprobability data but highlights fundamental constraints for synthetic datasets. The findings contribute to methodological best practices as synthetic data become increasingly visible in survey research.
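The calibration step described in this abstract, raking sample weights to external margins, can be illustrated with a short, generic sketch. This is a minimal implementation under assumed inputs; the variable names, categories, and target shares below are hypothetical, not the study's actual CBS benchmarks:

    # Minimal raking (iterative proportional fitting) sketch. The column
    # names, categories, and target shares are illustrative assumptions.
    import pandas as pd

    def rake(df, margins, weight_col="weight", max_iter=50, tol=1e-6):
        # Iteratively rescale weights until weighted category shares
        # match the target margins on every calibration variable.
        df = df.copy()
        df[weight_col] = 1.0
        for _ in range(max_iter):
            max_shift = 0.0
            for var, targets in margins.items():
                shares = df.groupby(var)[weight_col].sum() / df[weight_col].sum()
                factors = {cat: targets[cat] / shares[cat] for cat in targets}
                df[weight_col] *= df[var].map(factors)
                max_shift = max(max_shift, max(abs(f - 1.0) for f in factors.values()))
            if max_shift < tol:  # all margins matched within tolerance
                break
        return df

    # Hypothetical opt-in sample and population margins (shares sum to 1).
    sample = pd.DataFrame({
        "gender": ["f", "m", "f", "m", "f", "f"],
        "age_group": ["18-39", "18-39", "40-64", "65+", "40-64", "18-39"],
    })
    margins = {
        "gender": {"f": 0.50, "m": 0.50},
        "age_group": {"18-39": 0.30, "40-64": 0.40, "65+": 0.30},
    }
    print(rake(sample, margins))

Raking of this kind only aligns the weighted marginal distributions, which is consistent with the abstract's finding that calibration improves demographic similarity without repairing variance or correlation structure.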
Optimizing Panel Consent Using Repeated Requests While Experimentally Varying Request Placement and Panel Consent Incentives
Institute for Employment Research (IAB), Germany

Relevance & Research Question: High panel consent rates are essential for reducing panel attrition and limiting the risk of panel consent bias in panel surveys. This study investigates how panel consent rates can be optimized through three survey design features: a repeated request for panel consent within the questionnaire, experimental variation of the placement of the requests (beginning vs. end), and experimental variation of the incentive for panel consent.

Methods & Data: Analyses are based on the recruitment wave of the third cohort of the "Online Panel for Labour Market Research" (OPAL) and cover about 7,200 cases (about 12% are classified as partial interviews). In the repeated-request design, respondents who do not provide consent at the first request are followed up with a second request to reconsider their decision. The survey experiment comprises four experimental groups that differ in the placement of the first panel consent request (beginning vs. end) and in the incentive for panel consent at the first (0€ vs. 5€) and the second request (5€ vs. 10€).

Results: An early first request is more successful than a late one: both without an incentive and with a 5€ incentive, the panel consent rate is higher when the request is placed at the beginning of the questionnaire rather than at the end. The second request increases the cumulative panel consent rate by 5 to 10 percentage points across experimental groups. The highest cumulative panel consent rate after two requests is achieved by the design with the first request at the beginning offering 5€ and the second request at the end of the questionnaire offering 10€.

Added Value: This paper provides evidence that the highest panel consent rates are achieved when the request is placed at the beginning of the questionnaire, which calls the traditional placement at the end into question. The panel consent rate can be significantly improved by implementing a second request within the questionnaire. Results show that the placement of a panel consent request can be more relevant than the incentive.
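The cumulative consent rate in such a two-request design could be tabulated along the lines of the sketch below, assuming one row per respondent with consent flags for both requests; all column names and toy values are illustrative, not the OPAL data structure:

    # Sketch of tabulating first-request and cumulative consent rates per
    # experimental group. One row per respondent; all names are assumptions.
    import pandas as pd

    df = pd.DataFrame({
        "placement":  ["begin", "begin", "end", "end", "begin", "end"],
        "incentive1": [0, 5, 0, 5, 0, 5],   # € offered at the first request
        "consent1":   [1, 1, 0, 0, 0, 1],   # consent at the first request
        "consent2":   [0, 0, 1, 0, 1, 0],   # consent at the follow-up request
    })
    # A respondent counts as consenting if either request succeeded.
    df["consent_cum"] = (df["consent1"] | df["consent2"]).astype(int)
    rates = df.groupby(["placement", "incentive1"]).agg(
        first_request=("consent1", "mean"),
        cumulative=("consent_cum", "mean"),
    )
    # The gap between the two columns is the gain from the second request,
    # i.e., the 5-10 percentage points reported in the abstract.
    print(rates)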
You’ve Got Mail: Does Sending Thank-You Postcards Increase Response in a Probability-Based Online Panel?
¹GESIS, Germany; ²University of Mannheim, Germany; ³Heinrich Heine Universität Düsseldorf, Germany; ⁴University of Hamburg, Germany

Relevance & Research Question: Survey nonresponse is one of the major challenges to survey data quality. While various treatments, e.g., monetary incentives, have been tested to increase response rates, the evidence regarding differential effects of these treatments for different population groups is rather thin. For example, it is known that incentives work well overall, but it is unclear whether a less costly form of appreciation (such as a postcard) would achieve the same or a greater effect for certain population groups, or whether some groups do not need any treatment at all.

Methods & Data: An experiment to increase survey response will be fielded in mid-December 2025 among the newly recruited participants of the German Internet Panel (GIP). After the first panel wave and before the second, panelists will be randomly assigned to four experimental groups:
1.) receiving a “Thank-you” postcard from the GIP team;
2.) receiving a handwritten “Thank-you” postcard with the same text as in 1.);
3.) receiving a postcard stating that they have been credited an extra 5 Euro as a “Thank-you” for being in the panel;
4.) a control group.
The postcards are not connected to the invitation to the second panel wave but are a general expression of appreciation for the panelist’s participation in the study.

Results: Results will be presented on 1.) the overall effect of the treatment on nonresponse in wave 2 and 2.) possible interaction effects of personal characteristics and the treatment on nonresponse in wave 2. The personal characteristics include basic socio-demographics, the Big Five personality traits, and the motivation to participate in the survey.

Added Value: By examining heterogeneous effects of the treatment on nonresponse, the findings will inform survey practitioners on developing a targeted design to increase response rates and, more specifically, to increase response rates for specific population subgroups.
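The announced heterogeneity analysis, interaction effects of personal characteristics with the treatment on wave-2 nonresponse, would typically be estimated with a model along the following lines. This is a generic logistic-regression sketch on simulated data; the group labels, the single trait shown, and all effect sizes are assumptions for illustration, not the GIP design or its results:

    # Generic sketch of a heterogeneity analysis: logistic regression of
    # wave-2 response on treatment group, one personal characteristic,
    # and their interaction. Data are simulated; names are assumptions.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 400
    df = pd.DataFrame({
        "group": rng.choice(["control", "postcard", "handwritten", "bonus5"], n),
        "conscientiousness": rng.normal(0.0, 1.0, n),  # e.g., one Big Five trait
    })
    # Simulate response with a small treatment effect plus a trait effect.
    logit = -0.2 + 0.3 * (df["group"] != "control") + 0.2 * df["conscientiousness"]
    df["responded"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))
    # The interaction terms test whether the treatment effect varies with the trait.
    model = smf.logit(
        "responded ~ C(group, Treatment('control')) * conscientiousness", data=df
    ).fit()
    print(model.summary())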