Conference Agenda

Overview and details of the sessions of this conference.

 
 
Session Overview
Session
A7.1: Survey Methods Interventions 2
Time:
Friday, 23/Feb/2024:
3:15pm - 4:15pm

Session Chair: Joss Roßmann, GESIS - Leibniz Institute for the Social Sciences, Germany
Location: Seminar 1 (Room 1.01)

Rheinische Fachhochschule Köln, Campus Vogelsanger Straße, Vogelsanger Str. 295, 50825 Cologne, Germany

Presentations

Pushing older target persons to the web: Do we still need a paper questionnaire?

Jan-Lucas Schanze, Caroline Hahn, Oshrat Hochman

GESIS - Leibniz-Institut für Sozialwissenschaften, Germany

Relevance & Research Question
While a sequential push-to-web mode sequence is well established in survey research and commonly used in survey practice, many large-scale social surveys still prefer to contact older target persons with a concurrent design, offering a paper questionnaire alongside a web-based questionnaire from the first letter onwards. In this presentation, we compare the performance of a sequential design with that of a concurrent design for target persons older than 60 years. We analyse response rates and compare the sample compositions and the distributions of key items within the resulting net samples. Ultimately, we aim to investigate whether we can push older respondents to the web and whether a paper questionnaire is still required for this age group.

Methods & Data
The data stem from the 10th round of the European Social Survey (ESS), carried out in self-completion modes (CAWI/PAPI) in 2021. In Germany, a mode-sequence experiment was implemented for all target persons older than 60 years. Half of this group was invited with a push-to-web approach, in which a paper questionnaire was offered only in the third mailing. The control group was invited with a concurrent mode sequence, offering both modes from the beginning.

Results
Results show similar response rates for the concurrent design and the sequential design (AAPOR RR2: 38.4% vs. 37.3%); this difference is not statistically significant. In the concurrent group, 21% of the respondents answered the questionnaire online, while in the sequential group this was the case for 50% of all respondents. The resulting net samples are very comparable: looking at various demographic, socio-economic, attitudinal, and behavioural items, we found no significant differences. In contrast, elderly respondents answering online are younger, more often male, much better educated, economically better off, more politically interested, and more liberal towards immigrants than their peers answering the paper questionnaire.
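As a rough illustration of the comparison reported above, the sketch below runs a two-proportion z-test on the two response rates. The per-group gross sample sizes are hypothetical placeholders (they are not stated in the abstract); only the rates are taken from the text.

    # Illustrative two-proportion z-test for the reported response rates.
    # n1 and n2 are assumed gross sample sizes, NOT figures from the study.
    from math import sqrt
    from scipy.stats import norm

    n1, n2 = 2000, 2000          # hypothetical group sizes
    p1, p2 = 0.384, 0.373        # AAPOR RR2: concurrent vs. sequential

    # Pooled proportion and standard error under H0: equal response rates
    p_pool = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))

    z = (p1 - p2) / se
    p_value = 2 * norm.sf(abs(z))
    print(f"z = {z:.2f}, two-sided p = {p_value:.3f}")

With group sizes in this range, the 1.1 percentage-point gap is well within sampling noise, which is consistent with the non-significant difference reported above.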

Added Value
Online questionnaires are often considered not fully appropriate for surveying the older population. This research shows that a higher share of this group can be pushed to the web without negative effects on response rates or sample composition. However, a paper questionnaire is still required to improve the sample composition.



Clarification features in web surveys: Usage and impact of “on-demand” instructions

Patricia Hadler, Timo Lenzner, Ranjit K. Singh, Lukas Schick

GESIS - Leibniz Institute for the Social Sciences, Germany

Relevance & Research Question
Web surveys offer the possibility of adding clarifications to a survey question via info buttons that can be placed directly beside a word in the question text or next to the question. Previous research on the use of these clarifications and their impact on survey responses is scarce.
Methods & Data
Using the non-probability Bilendi panel, we randomly assigned 2,000 respondents to one of four conditions: they A) were presented clarifications as directly visible instructions under the question texts, B) could click/tap on clarifications via an info button next to the word the respective clarification pertained to, C) could click/tap on clarifications via an info button to the right of the respective question text, or D) received no clarifications at all. All questions used an open-ended numeric answer format, and respondents were likely to give a smaller number as a response if they read the clarification.
Results
Following the last survey question that contained a clarification, we asked respondents in conditions A) through C) whether they had clicked/tapped on or read the clarification. In addition, we measured the use of the on-demand clarifications using a client-side paradata script. Results showed that while 24% (B) and 15% (C) of respondents claimed to have clicked on the last-shown on-demand clarification, only 14% (B) and 6% (C) actually did so for at least one question with a clarification. Moreover, the responses to the survey questions did not differ significantly between the conditions with on-demand instructions (B and C) and the condition with no clarifications (D). Thus, the only way to ensure that respondents adhere to a clarification is to present it as an always visible instruction, as in condition A.
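To make the claimed-versus-observed comparison concrete, here is a minimal sketch of how self-reports could be reconciled with client-side paradata. The record layout and field names (condition, claimed_click, paradata_clicks) are hypothetical assumptions for illustration, not the authors' actual data structure.

    # Hypothetical reconciliation of self-reported clicks with paradata logs.
    # Field names and example values are illustrative only.
    import pandas as pd

    records = pd.DataFrame({
        "respondent_id":   [1, 2, 3, 4, 5, 6],
        "condition":       ["B", "B", "B", "C", "C", "C"],
        "claimed_click":   [True, False, True, True, False, False],  # post-survey self-report
        "paradata_clicks": [1, 0, 0, 0, 1, 0],  # info-button clicks logged client-side
    })

    summary = records.groupby("condition").agg(
        claimed_rate=("claimed_click", "mean"),
        observed_rate=("paradata_clicks", lambda s: (s > 0).mean()),
    )
    print(summary)

A gap between claimed_rate and observed_rate within a condition mirrors the over-reporting pattern described above (24% vs. 14% in B, 15% vs. 6% in C).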
Added Value
The results demonstrate that presenting complex survey questions remains challenging. Even if additional clarification is needed only for some respondents, it should be presented to all respondents, albeit with the potential disadvantage of increasing response burden. To learn more about how respondents process clarification features, we are currently carrying out a qualitative follow-up study applying cognitive interviewing.



 