Conference Agenda

Overview and details of the sessions of this conference.
Session Overview
Session
A3.1: Solutions for Survey Nonresponse
Time:
Thursday, 22/Feb/2024:
3:45pm - 4:45pm

Session Chair: Oriol J. Bosch, University of Oxford, United Kingdom
Location: Seminar 1 (Room 1.01)

Rheinische Fachhochschule Köln, Campus Vogelsanger Straße, Vogelsanger Str. 295, 50825 Cologne, Germany

Presentations

Does detailed information on IT-literacy help to explain nonresponse and design nonresponse adjustment weights in a probability-based online panel?

Barbara Felderer1, Jessica Herzing2

1GESIS, Germany; 2University of Bern

Relevance & Research Question

The generalizability of inference from online panels is still challenged by the digital divide. Recent research concludes that online panels under-represent not only individuals without Internet access but also those who do not feel sufficiently IT-literate to participate, which potentially leads to nonresponse bias.

Weighting methods can reduce nonresponse bias if they include characteristics that are correlated with both nonresponse and the variable(s) of interest. In our study, we assess whether asking nonrespondents about their IT-literacy in a nonresponse follow-up questionnaire can improve nonresponse weighting and reduce bias. Our research questions are:

1.) Does including information on IT-literacy collected in the recruitment survey improve nonresponse models for online panel participation compared to standard nonresponse models that include socio-demographics only?

2.) Does including IT-literacy improve nonresponse adjustment?

Methods & Data

Data are collected in the 2018 recruitment of a refreshment sample of the probability-based German Internet Panel (GIP). Recruitment was conducted by sending invitation letters for the online panel by postal mail. Sampled individuals who were not willing or able to participate in the recruitment online were asked to complete a paper-and-pencil questionnaire about their IT-literacy. The questionnaire was experimentally fielded in the first invitation or reminder mailings. The control group did not receive a paper questionnaire.

Results

We find that IT-literacy explains nonresponse to the GIP over and above the standard socio-demographic variables frequently used in nonresponse modeling. Nonresponse weights that include measures of IT-literacy reduce bias for variables of interest that are related to IT-literacy.
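One common form of nonresponse adjustment that this result alludes to can be sketched as a weighting-class adjustment: respondents in each cell (here, an IT-literacy category) are up-weighted to stand in for the cell's nonrespondents. This is an illustrative sketch with toy data, not the authors' actual model:

```python
from collections import defaultdict

def weighting_class_adjustment(sample, respondents, key):
    """Weighting-class nonresponse adjustment: within each cell defined by
    `key` (e.g. an IT-literacy category), respondents receive the weight
    N_cell / r_cell, so that they represent the cell's nonrespondents."""
    n_cell = defaultdict(int)  # sampled units per cell
    r_cell = defaultdict(int)  # responding units per cell
    for unit in sample:
        n_cell[key(unit)] += 1
    for unit in respondents:
        r_cell[key(unit)] += 1
    return {cell: n_cell[cell] / r_cell[cell]
            for cell in n_cell if r_cell[cell] > 0}

def weighted_mean(respondents, weights, key, value):
    """Weighted estimate of a variable of interest among respondents."""
    num = sum(weights[key(u)] * value(u) for u in respondents)
    den = sum(weights[key(u)] for u in respondents)
    return num / den
```

If the cell variable (IT-literacy) is related to both response and the outcome, the weighted mean moves toward the full-sample mean; if it is unrelated to the outcome, the weights cannot remove the bias.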

Added Value

Online surveys bear the risk of severe bias for any variables of interest that are connected to IT-literacy. Fielding a paper-and-pencil nonresponse follow-up survey asking about IT-literacy can help to improve nonresponse weights and reduce nonresponse bias.



Youth Nonresponse in the Understanding Society Survey: Investigating the Impact of Life Events

Camilla Salvatore, Peter Lugtig, Bella Struminskaya

Utrecht University, The Netherlands

Relevance & Research Question

Survey response rates are declining worldwide, particularly among young individuals. This trend is evident in both cross-sectional and longitudinal surveys, such as Understanding Society, where young people exhibit a higher likelihood of either missing waves or dropping out entirely.

This paper aims to explore why young individuals exhibit lower participation rates in Understanding Society. Specifically, we investigate the hypothesis that young people experience more life events, such as changes in job, relationship status, or housing, and that it is the occurrence of such life events that is associated with a higher likelihood of not participating in the survey.

Methods & Data

The data source is Understanding Society, a mixed-mode probability-based general population panel study in the UK. We analyze individuals aged 18-44 at Understanding Society's Wave 1 and follow them until Wave 12. We consider four age groups: 18-24 (youth), 25-31 (early adulthood), 32-38 (late adulthood), and 39-45 (middle age; the reference group for comparison). To study the effect of life events on attrition, we apply a discrete-time multinomial hazard model, in which time enters as a covariate and the outcome variable is the survey participation indicator (interview, noncontact, refusal, or other). The outcome is modeled as a function of lagged covariates, including demographics, labor market participation, qualifications, household structure and characteristics, marital status and mobility, as well as binary indicators for life-event-related status changes.
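The modeling step just described can be illustrated by the data restructuring it requires: a discrete-time hazard model is fit on person-period records, one row per person per wave at risk, with the previous wave's covariates (lagged) as predictors and that wave's participation outcome as the response. A minimal sketch, with hypothetical field names:

```python
def to_person_period(panel):
    """Expand per-person wave histories into person-period rows for a
    discrete-time (multinomial) hazard model. `panel` maps a person id to
    an ordered list of (covariates, outcome) pairs, one per wave; a person
    leaves the risk set after their first non-interview outcome."""
    rows = []
    for pid, waves in panel.items():
        for t in range(1, len(waves)):
            lagged_cov, _ = waves[t - 1]   # predictors come from wave t-1
            _, outcome = waves[t]          # response observed at wave t
            rows.append({"id": pid, "time": t,
                         **{f"lag_{k}": v for k, v in lagged_cov.items()},
                         "outcome": outcome})
            if outcome != "interview":     # censor after first attrition event
                break
    return rows
```

A multinomial logit fit on these rows, with `time` as a covariate, yields the discrete-time hazard of each outcome (noncontact, refusal, other) given the lagged life-event indicators.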
Results

Consistent with existing literature, our findings reveal that younger respondents, as well as those with an immigration background, lower education, and unemployment status, are less likely to participate. We also demonstrate that changes in job status and relocation contribute particularly to attrition, with age remaining a significant factor.
Added Value

As many household surveys are moving online to save costs, the findings of this study will offer valuable insights for survey organizations. This paper enriches our understanding of youth nonresponse and presents practical strategies for retaining young respondents. This project is funded by the Understanding Society Research Data Fellowship.



Exploring incentive preferences in survey participation: How do socio-demographic factors and personal variables influence the choice of incentive?

Almuth Lietz, Jonas Köhler

Deutsches Zentrum für Integrations- und Migrationsforschung (DeZIM), Germany

Relevance & Research Question
Incentives for survey participants are commonly used to counter declining response rates. Cash incentives have been shown to be particularly effective in increasing response rates. However, cash incentives are not always feasible for publicly funded research institutions, so other forms such as vouchers or bank transfers are often used. In our study, we aim to identify the extent to which socio-demographic and personal variables influence individuals' preference for either vouchers or bank transfers. In addition, we examine differences in preferences for specific vouchers from different providers.

Methods & Data
We draw on data from the DeZIM.panel, a randomly drawn, offline-recruited online access panel in Germany with an oversampling of specific immigrant cohorts. Since 2022, regular panel operation has taken place with four waves per year, supplemented by quick surveys on current topics. Nine regular waves have been carried out so far. Within the surveys, we offer compensation in the form of a €10 postpaid incentive. Respondents can choose between a voucher from Amazon, Zalando, Bücher.de, or a more sustainable provider called GoodBuy; alternatively, they can provide us with their bank account details and we transfer the money.

Results
Analysis reveals that over half of the respondents who redeemed their incentive chose an Amazon voucher, and around 40 percent preferred to receive the money by bank transfer. Only a small proportion (7 percent) chose one of the other vouchers. This pattern holds across all waves. Initial results of logistic regressions show a significant preference for vouchers among those with higher net incomes. Additionally, we will examine participants who continue to participate regularly in the survey despite not redeeming their incentive.
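The logistic-regression step can be sketched in miniature: a dependency-free fit of the probability of choosing a voucher (versus a bank transfer) on a single standardized covariate such as net income. Purely illustrative; the variable and the toy data are assumptions, not the study's model:

```python
import math

def fit_logit(xs, ys, lr=0.1, epochs=2000):
    """Minimal binary logistic regression via batch gradient descent:
    models P(voucher = 1) as logistic(b0 + b1 * x), where x is a single
    standardized covariate (e.g. net income). Returns (b0, b1)."""
    b0, b1 = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += (p - y) / n        # gradient w.r.t. intercept
            g1 += (p - y) * x / n    # gradient w.r.t. slope
        b0 -= lr * g0
        b1 -= lr * g1
    return b0, b1
```

A positive fitted slope corresponds to the reported finding: higher net income goes with a higher probability of preferring a voucher.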

Added Value
Understanding which incentives work best for which target group is of great relevance when planning surveys and finding an appropriate incentive strategy.



 
Conference: GOR 24 · Conference Software: ConfTool Pro 2.8.101
© 2001–2024 by Dr. H. Weinreich, Hamburg, Germany