Session A3.1: Solutions for Survey Nonresponse

Presentations
Does detailed information on IT-literacy help to explain nonresponse and design nonresponse adjustment weights in a probability-based online panel?
1GESIS, Germany; 2University of Bern

Relevance & Research Question
The generalizability of inferences from online panels is still challenged by the digital divide. Recent research concludes that online panels under-represent not only individuals without Internet access but also those who do not feel IT-literate enough to participate, potentially leading to nonresponse bias. Weighting methods can reduce nonresponse bias if they include characteristics that are correlated with both nonresponse and the variable(s) of interest. In our study we assess whether asking nonrespondents about their IT-literacy in a nonresponse follow-up questionnaire improves nonresponse weighting and reduces bias. Our research questions are: 1.) Does including information on IT-literacy collected in the recruitment survey improve nonresponse models for online panel participation compared to standard nonresponse models including socio-demographics only? 2.) Does including IT-literacy improve nonresponse adjustment?

Methods & Data
Data were collected in the 2018 recruitment of a refreshment sample of the probability-based German Internet Panel (GIP). Recruitment was conducted by sending invitation letters for the online panel by postal mail. Sampled individuals who were unwilling or unable to participate in the recruitment online were asked to fill in a paper-and-pencil questionnaire about their IT-literacy. The questionnaire was experimentally fielded in the first invitation or in reminder mailings; the control group did not receive a paper questionnaire.

Results
We find that IT-literacy explains nonresponse to the GIP over and above the standard socio-demographic variables frequently used in nonresponse modeling.
Nonresponse weights that include measures of IT-literacy are able to reduce bias for variables of interest related to IT-literacy.

Added Value
Online surveys bear the risk of severe bias for any variables of interest that are connected to IT-literacy. Fielding a paper-and-pencil nonresponse follow-up survey asking about IT-literacy can help to improve nonresponse weights and reduce nonresponse bias.

Youth Nonresponse in the Understanding Society Survey: Investigating the Impact of Life Events
Utrecht University, The Netherlands

Relevance & Research Question
Survey response rates are declining worldwide, particularly among young individuals. This trend is evident in both cross-sectional and longitudinal surveys, such as Understanding Society, where young people exhibit a higher likelihood of either missing waves or dropping out entirely. This paper aims to explore why young individuals exhibit lower participation rates in Understanding Society. Specifically, we investigate the hypothesis that young people experience more life events, such as a change in job, a change in relationship status, or a move of house, and that it is the occurrence of such life events that is associated with a higher likelihood of not participating in the survey.

Methods & Data
The data source is Understanding Society, a mixed-mode probability-based general population panel study in the UK. We analyze individuals aged 18-44 at Understanding Society's Wave 1 and follow them until Wave 12. We consider four age groups: 18-24 (youth), 25-31 (early adulthood), 32-38 (late adulthood) and 39-45 (middle age, the reference group for comparison). To study the effect of life events on attrition, we applied a discrete-time multinomial hazard model, in which time enters as a covariate and the outcome variable is the survey participation indicator (interview, noncontact, refusal or other).
The outcome is modeled as a function of lagged covariates, including demographics, labor market participation, qualifications, household structure and characteristics, marital status and mobility, as well as binary indicators for life-event-related status changes.

Results
Consistent with the existing literature, our findings reveal that younger respondents, as well as those with an immigration background, lower education, and unemployment status, are less likely to participate. We also demonstrate that changes in job status and relocation contribute particularly to attrition, with age remaining a significant factor.

Added Value
As many household surveys are moving online to save costs, the findings of this study offer valuable insights for survey organizations. This paper enriches our understanding of youth nonresponse and presents practical strategies for retaining young respondents. This project is funded by the Understanding Society Research Data Fellowship.

Exploring incentive preferences in survey participation: How do socio-demographic factors and personal variables influence the choice of incentive?
Deutsches Zentrum für Integrations- und Migrationsforschung (DeZIM), Germany

Relevance & Research Question
Methods & Data
Results
Added Value