Conference Agenda

Overview and details of the sessions of this conference. Please select a date or location to show only sessions on that day or at that location. Please select a single session for a detailed view (with abstracts and downloads, if available).

Session Overview
Session
7.3: Questionnaire Design
Time:
Wednesday, 02/Apr/2025:
9:00am - 10:00am

Session Chair: Yfke Ongena, University of Groningen, The Netherlands
Location: Hörsaal C


Presentations

B:RADICAL – Co-designing online survey questions with children and young people on their understanding and their experiences of respect and disrespect

Dirk Schubotz

Queen's University Belfast, United Kingdom

Relevance & Research Question

In this presentation, I will showcase how children and young people can be involved as co-researchers and research advisors in designing online survey questions and analysing results from large-scale online social attitude surveys conducted among children and young people. There is growing interest in participatory and collaborative research approaches and increasing recognition that involving co-researchers can enhance the relevance, validity and meaningfulness of research data. Traditionally, collaborative research approaches predominantly utilised qualitative methods, but more recently survey researchers have also become more open to adopting co-production approaches. In this study, I will showcase such a collaborative approach to online surveys involving children and young people.
Methods & Data
This study is grounded in a rights-based approach to undertaking research with children and young people, using the Lundy Model with its four pillars (Voice, Space, Audience, Influence) to ensure that children and young people have a say, and that their voices are listened to, in matters affecting their lives – here, their experiences of respect and disrespect. Two Research Advisory Groups were formed, and children and young people were trained as peer-researchers. Starting with explorative philosophical workshops, research questions were developed bottom-up and were then included in two large-scale annual social attitude surveys undertaken in Northern Ireland: the YLT survey of 16-year-olds and the KLT survey of 10- to 11-year-olds.
Results
In this presentation, I will largely focus on the processes of designing online survey questions with children and young people as co-researchers, and I will reflect on the challenges of doing so. I may be able to present some very preliminary survey results, as survey data collection closes around the time of the GOR conference.
Added Value
This presentation will be of value to those who are designing online surveys – in particular, but not exclusively, surveys for children and young people, and who are considering taking a collaborative co-production approach to data collection and analysis.



Does Succeeding on Attention Checks Moderate Treatment Effects?

Sebastian Lundmark (1), Jon Krosnick (2), Lisanne Wichgers (3), Matthew Berent (4)

(1) University of Gothenburg, Sweden; (2) Stanford University, USA; (3) Lisanne Wichgers Consulting; (4) Matt Berent Consulting

Relevance & Research Question

Attention checks have become a common tool in questionnaires administered online. Confirming their popularity, 58 of the 139 articles published in the Journal of Personality and Social Psychology in 2019 featured an online experiment in which at least one attention check was used to exclude participants. If participants do not pay attention to survey questions and treatments, Type II error rates may be inflated, increasing the likelihood of false-negative results. A few studies have found that excluding participants who failed the attention check strengthened treatment effects, presumably because excluding them reduced noise and increased data quality. However, participants failing attention checks typically differ from passers in characteristics such as age, gender, and education, so excluding failers compromises sample representativeness and reduces sample size.

Methods & Data

To assess the impact of excluding participants who fail attention checks on treatment effects and sample composition accuracy, data from sixty-six experiments were analyzed. In each experiment, online respondents were randomly assigned to one of two experimental conditions, with 750 people completing the questionnaire in each condition. This allowed for assessing whether the treatment effect became stronger when excluding attention check failers, the degree to which failure rates differed between the treatment and control conditions (which would compromise internal validity if failers are dropped), and the degree to which dropping failers distorted the sample's distribution of demographic characteristics.
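As a rough illustration of the kind of per-experiment comparison described above, the following Python sketch simulates one two-condition experiment, contrasts the estimated treatment effect in the full sample against the passers-only subsample, and checks whether failure rates differ across conditions. All data, rates, and effect sizes are simulated and purely illustrative; this is not the authors' analysis code.

```python
# Illustrative sketch only: simulated data, not the study's actual analysis.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_per_condition = 750

# Simulate one two-condition experiment: an outcome with a small treatment
# effect, plus a flag for failing the attention check (~15% failers, assumed).
treatment = np.repeat([0, 1], n_per_condition)
outcome = 0.2 * treatment + rng.normal(size=2 * n_per_condition)
failed_check = rng.random(2 * n_per_condition) < 0.15

def treatment_effect(mask):
    """Mean difference (treatment - control) among the masked respondents."""
    treated = outcome[(treatment == 1) & mask]
    control = outcome[(treatment == 0) & mask]
    t, p = stats.ttest_ind(treated, control)
    return treated.mean() - control.mean(), p

# 1) Effect for the full sample vs. after excluding failers.
full_effect, full_p = treatment_effect(np.ones_like(failed_check, dtype=bool))
passers_effect, passers_p = treatment_effect(~failed_check)
print(f"full sample:  d={full_effect:.3f}, p={full_p:.3f}")
print(f"passers only: d={passers_effect:.3f}, p={passers_p:.3f}")

# 2) Do failure rates differ between conditions? (internal-validity check)
table = [
    [np.sum(failed_check & (treatment == 1)), np.sum(~failed_check & (treatment == 1))],
    [np.sum(failed_check & (treatment == 0)), np.sum(~failed_check & (treatment == 0))],
]
chi2, p_fail, *_ = stats.chi2_contingency(table)
print(f"failure-rate difference across conditions: p={p_fail:.3f}")
```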

Results

The results indicated that attention checks only weakly moderated treatment effects. Participants who failed the attention check showed statistically significant treatment effects despite, ostensibly, not paying attention. Including or excluding the failing participants did not alter any of the conclusions made about each of the sixty-six experimental treatment effects.
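For readers unfamiliar with moderation tests, one conventional way to test whether passing the attention check moderates the treatment effect is an interaction term in a regression; the minimal sketch below uses simulated data, and all variable names and values are hypothetical rather than taken from the study.

```python
# Hedged sketch of a moderation test via an interaction term; simulated data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 1500
df = pd.DataFrame({
    "treatment": rng.integers(0, 2, n),         # 0 = control, 1 = treatment
    "passed": rng.random(n) > 0.15,             # assumed ~85% pass rate
})
df["outcome"] = 0.2 * df["treatment"] + rng.normal(size=n)

# The coefficient on the treatment:passed interaction estimates how much the
# treatment effect differs between passers and failers (the moderation effect).
model = smf.ols("outcome ~ treatment * passed", data=df).fit()
print(model.summary().tables[1])
```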

Added Value

This meta-analytical study adds to the growing body of research investigating the appropriateness of attention checks in online questionnaire administration. The study allowed for differentiating whether certain types of attention checks were more effective at detecting inattentive participants. Lastly, the results add insights into how excluding data from participants who fail attention checks affects a sample's resemblance to the general population in terms of several demographic characteristics.



Balancing Questionnaire Length and Response Burden: Short- and Long-Term Effects in the IAB Job Vacancy Survey

Nicole Gürtzen, Alex Kübis, André Pirralha

IAB, Germany

Relevance & Research Question

Declining survey response rates are a global concern, also affecting official statistics and establishment surveys. This trend is evident in Germany, raising questions about which factors influence participation. Prior research indicates that longer questionnaires can deter respondents because of the increased time and effort required, leading to higher response burden and lower response rates. While this effect has been demonstrated in household surveys, its impact on establishment surveys remains under-researched. This study examines how questionnaire length affects response rates and response burden in the IAB Job Vacancy Survey, a key data source for understanding labor demand and recruitment in Germany.

Methods & Data

In the 2023 wave of the IAB Job Vacancy Survey, around 2,000 establishments were randomly assigned to receive either a concise two-page questionnaire or a detailed four-page version, the two differing in the number of questions. The survey employed a mixed-mode design in which respondents self-selected either web or paper data collection. This experimental design was implemented over three consecutive quarters, allowing the estimation of both immediate and longer-term effects throughout the year. By comparing response behavior between the two groups over time, we assessed the influence of questionnaire length on participation and perceived burden.
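A minimal sketch of how such a group comparison might be run, a two-proportion z-test on response rates plus a t-test on perceived-burden ratings, is shown below. All counts and ratings are invented for illustration and do not come from the IAB survey.

```python
# Illustrative only: made-up counts and ratings, not the IAB survey data.
import numpy as np
from scipy import stats
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical counts: completed questionnaires out of establishments
# assigned to the short (2-page) vs. long (4-page) version.
responded = np.array([412, 398])
assigned = np.array([1000, 1000])

# Two-proportion z-test for a difference in response rates.
z, p_rate = proportions_ztest(responded, assigned)
print(f"response rates: z={z:.2f}, p={p_rate:.3f}")

# Simulated perceived-burden ratings (e.g., a 1-5 scale) for each group.
rng = np.random.default_rng(0)
burden_short = rng.normal(2.4, 0.9, size=412)
burden_long = rng.normal(2.8, 0.9, size=398)
t, p_burden = stats.ttest_ind(burden_short, burden_long)
print(f"perceived burden: t={t:.2f}, p={p_burden:.3f}")
```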

Results

The findings show no significant differences in response rates between the two groups across the three quarters. However, establishments that received the longer questionnaire reported significantly higher levels of perceived response burden. This effect was particularly pronounced in the paper mode, where respondents expressed greater burden than those using the web mode. The increased burden did not translate into decreased short-term participation but may have implications for respondent satisfaction and data quality.

Added Value

These results are relevant for the design of establishment surveys and the production of official statistics. They suggest that reducing questionnaire length can lower response burden without significantly affecting response rates. This insight supports efforts to optimize the IAB Job Vacancy Survey's push-to-web design, aiming to enhance respondent experience while maintaining high-quality data collection.


