GOR 26 - Annual Conference & Workshops
Annual Conference - Rheinische Hochschule Cologne, Campus Vogelsanger Straße
26 - 27 February 2026
GOR Workshops - GESIS - Leibniz-Institut für Sozialwissenschaften in Cologne
25 February 2026
Conference Agenda
Overview and details of the sessions of this conference.
Session Overview
11.2: Ensuring participation
Presentations
The effects of push-to-complete reminders
The SOM Institute, University of Gothenburg, Sweden

Relevance & Research Question
Survey researchers often allow respondents to fill out questionnaires either online or on paper in a mixed-mode fashion. However, respondents who choose to complete questionnaires on paper tend to submit them with fewer unanswered questions than respondents who complete them online. Furthermore, many respondents who start a questionnaire online never submit it. The aim of the present study is to evaluate whether respondents who have started but not yet submitted an online questionnaire are more likely to submit it, and to submit it with fewer unanswered questions, when they receive a digital push-to-complete reminder than when they do not.

Methods & Data
The assessment will be made on a self-administered push-to-web mixed-mode survey (web with a sequential paper-and-pencil questionnaire) distributed to a random sample of 9,000 individuals residing in Gothenburg, Sweden. In the experiment, potential respondents were randomly assigned to one of two groups: one group was sent digital push-to-complete reminders a few days after they had started but not submitted the questionnaire online, whereas the other group received no such reminder.

Results
Data collection began in August 2025 and will be completed in early January 2026, so empirical results from the experiment will not be available until then; this abstract will be amended with the results once data collection is complete. The data will be analyzed by comparing response rates (RR1) and data quality between the treatment group and the control group.
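As a point of reference for the planned analysis, RR1 is the AAPOR "Response Rate 1": completed interviews divided by all eligible cases plus cases of unknown eligibility. A minimal sketch of a treatment-versus-control RR1 comparison is below; the disposition counts are purely illustrative and are not data from this study.

```python
# Hypothetical sketch of an AAPOR RR1 comparison between a reminder
# (treatment) group and a no-reminder (control) group.

def rr1(complete, partial, refusal, noncontact, other, unknown):
    """AAPOR Response Rate 1: completes divided by all eligible cases
    (complete + partial + refusal + non-contact + other) plus cases
    of unknown eligibility."""
    return complete / (complete + partial + refusal + noncontact + other + unknown)

# Illustrative dispositions (not from the Gothenburg survey)
treatment = rr1(complete=1900, partial=120, refusal=400,
                noncontact=1800, other=80, unknown=200)
control = rr1(complete=1750, partial=180, refusal=420,
              noncontact=1850, other=90, unknown=210)
print(f"RR1 treatment: {treatment:.3f}, control: {control:.3f}")
```

The partial-completion count is the quantity of interest here: a push-to-complete reminder aims to convert started-but-unsubmitted questionnaires (partials) into completes, which raises RR1 directly.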
Added Value
The present study contributes to existing research on survey reminders by examining whether digital push-to-complete reminders can increase response rates and data quality in push-to-web mixed-mode questionnaires specifically, and in online questionnaires generally.

The Effect of Survey Burden and Interval Between Survey Waves on Panel Participation: Experimental Evidence from the GLEN Panel
1RPTU Kaiserslautern-Landau, Germany; 2LMU Munich

Relevance & Research Question

Methods & Data
We use data from the German Longitudinal Environmental Study (GLEN), a large-scale, nationwide randomly sampled panel on environmental topics launched in 2024. The experiments were implemented in an inter-wave survey in September 2025, followed by a panel wave in November 2025 that is used to measure participation effects.

Experiment 1
The first experiment investigates the effect of the time interval between survey waves by not inviting a random 10% (N = 1,727) of the eligible sample (N = 16,772) to the inter-wave survey. We expect longer intervals between panel waves to reduce participation rates, as a higher survey frequency creates habituation and increases familiarity and engagement with the panel.

Experiment 2
In the second experiment, we examine the effect of the complexity and thematic content of the questionnaire. In a first random split, 25% of participants received a long item battery on climate change skepticism, expected to increase burden due to its repetitive nature, while the rest answered a more diverse module on internet use. In a second random split, 90% were assigned a complex factorial survey experiment on CO2 pricing policies, expected to increase burden through topic complexity, while the rest answered questions on cultural participation. The assumed completion time was kept constant across groups, allowing us to isolate the effect of the questions themselves on subsequent participation. For both experiments, we analyze the effect on participation in the next panel wave.
Results
Data collection will be completed by the end of 2025. We will present results at the conference.

Added Value
We contribute to the literature on panel nonresponse and survey experience by providing experimental evidence from a nationwide randomly sampled panel.

Are interviewer-administered follow-ups of web non-respondents still needed to maximise data quality? Evidence from Understanding Society: the UK Household Longitudinal Study
1University of Southampton, United Kingdom; 2University of Essex, United Kingdom

Relevance & Research Question
Many surveys have transitioned to online data collection. To minimize the risk of nonresponse bias, surveys often adopt a web-first design with follow-up of nonrespondents via face-to-face or telephone interviewing. Evidence suggests such designs may reduce costs and may produce datasets of higher quality than web-only designs. However, with the proportion of the population using the internet increasing markedly and people becoming less willing to welcome interviewers, the contributions of face-to-face or telephone follow-ups to minimizing nonresponse bias, and thus the justification for such a design, may have changed in recent years. This paper addresses this issue. The main research questions are: do we still need to follow up web non-respondents in a second mode to
RQ1: maximise response rates?
RQ2: maximise dataset representativeness?
RQ3: maximise response by under-represented, hard-to-reach population subgroups?
RQ4: minimise non-response biases remaining after non-response weighting?
And how has this changed over time?

Methods & Data
This study uses data from Understanding Society (the UK Household Longitudinal Study, UKHLS). We focus on the Innovation Panel component of the study, in which a subset of sample members has been offered web interviews with face-to-face or telephone follow-ups of non-respondents. For each survey wave, we use Coefficients of Variation of response propensities to quantify the representativeness of the web-only and the web plus face-to-face or telephone respondent sets. In addition, we use the UKHLS main survey, which enables investigation of hard-to-reach population groups.

Results
Key findings are: 1) follow-ups are still required to maximise response rates and dataset sizes, though their impact has declined; 2) the impact of follow-ups on representativeness has declined, with the web-only and web plus face-to-face datasets no longer differing; 3) the impact of follow-ups on the under-representation of hard-to-reach population subgroups has become negligible; and 4) the impact of follow-ups on non-response biases remaining after non-response weighting has similarly declined and is now negligible.

Added Value
We discuss the implications for survey practice. This paper is the first to investigate whether follow-ups are still needed in web surveys in the UK context. If follow-ups are no longer needed, this could have large cost-saving implications for survey agencies.
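The representativeness measure named in the UKHLS abstract, the Coefficient of Variation of response propensities, can be sketched in a few lines: it is the standard deviation of the estimated propensities divided by their mean, with lower values indicating a respondent set whose propensities are closer to uniform and hence more representative. The propensity values below are purely illustrative, not UKHLS estimates.

```python
import statistics

def cv_of_propensities(propensities):
    """Coefficient of Variation of estimated response propensities:
    population standard deviation divided by the mean. A CV of 0 means
    every sample member had the same propensity to respond."""
    mean = statistics.fmean(propensities)
    return statistics.pstdev(propensities) / mean

# Illustrative propensities, e.g. fitted by a logistic model of
# response on sampling-frame variables (not UKHLS data).
web_only = [0.35, 0.60, 0.20, 0.55, 0.30]
web_plus_f2f = [0.55, 0.70, 0.45, 0.68, 0.52]

# Here the web-only CV (~0.38) exceeds the web plus face-to-face
# CV (~0.17): the follow-up mode evens out response propensities.
print(cv_of_propensities(web_only), cv_of_propensities(web_plus_f2f))
```

In this hypothetical example the follow-up mode lowers the CV; the abstract's finding is that this gap between the two designs has narrowed over recent waves.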