GOR 26 - Annual Conference & Workshops
Annual Conference - Rheinische Hochschule Cologne, Campus Vogelsanger Straße
26 - 27 February 2026
GOR Workshops - GESIS - Leibniz-Institut für Sozialwissenschaften in Cologne
25 February 2026
Conference Agenda
Overview and details of the sessions of this conference.
Session Overview

Session 3.2: Video and images in survey research

Presentations
A picture is worth a thousand words: Factors influencing the quality of photos received through an online survey
1 Centre d'Estudis Demogràfics, Spain; 2 GESIS - Leibniz Institute for the Social Sciences; 3 University of Mannheim

Relevance & Research Question: Photos, which can be easily captured with smartphones, offer new opportunities to improve survey data quality by replacing or complementing conventional questions. However, potential advantages may vary by respondent characteristics, including age, gender, education, photo-taking and -sharing frequency, self-assessed verbal, mathematical, and spatial skills, and comfort with new technologies. This study addresses the following research question: To what extent do such individual characteristics affect the quality of photos of books at home submitted through an online mobile survey? The topic was selected because the number of books is a well-established proxy for cultural and socioeconomic capital in the social sciences, yet its measurement through conventional questions is challenging.

Methods & Data: Participants were asked for information on the books in their home through conventional survey questions and/or by submitting photos. This information covered the number of books, their intended audience (illiterate children, literate children and teenagers, and general audiences), language, and storage. Quality was evaluated with indicators tailored to book information, drawing on the literature on data quality in survey methodology and in computer vision. The survey, conducted in 2023, used Netquest’s opt-in panel in Spain. The target population was parents of children in primary school. Of 1,270 individuals reaching the questions on books, 703 were asked for photos, and 238 provided at least one.

Results: Photo quality was not systematically affected by most of the studied variables. Although older participants submitted more photos, extracting book information from their photos was less feasible, and they experienced more capture and submission issues. The findings suggest that photo-based data can be collected across diverse populations; however, age may be an exception in contexts like books, where finer details are required to extract information consistently.

Added Value: Evidence on the quality of photos submitted through online surveys remains limited, especially when photos address a relevant social science topic, such as counting books and using that information to characterize respondents. This study contributes to the literature by showing that photos can be requested from broad audiences with comparable quality across respondents, and it offers practical guidance for researchers collecting photos.

Enhancing participation in visual data collection in online surveys: Evidence from an experimental study about remote work environments
RECSM - Universitat Pompeu Fabra, Spain

Relevance & Research Question: Collecting visual data through web surveys offers a promising way to obtain richer and more accurate information. Yet participation in image-based tasks remains low, and evidence on how to motivate respondents while maintaining data quality is limited.
This study examines whether different strategies can help researchers achieve higher participation when requesting photos of remote work environments in web surveys: (1) offering an extra incentive specifically for sharing photos, beyond the standard survey participation incentive, (2) adding a follow-up prompt immediately after the initial photo request that emphasizes the importance of sharing the photos, and (3) sending a reminder email to respondents who do not share the photos initially. Furthermore, we investigate whether the timing of the incentive announcement (either at the initial request or only in the reminder) also affects participation.

Methods & Data: An experiment is being conducted in the opt-in online panel Netquest in Spain (N = 1,200) among adults who have worked remotely for at least seven hours per week in the past two months. Respondents are randomly assigned to one of three groups: (1) Control – asked to upload three photos of their home workspace without an extra incentive; (2) Incentive – offered 10 extra panel points for uploading all photos; and (3) IncentiveReminder – offered the same extra incentive, but only in the reminder. All groups receive the follow-up prompt. Descriptive analyses will be used to assess differences in participation.

Results: The survey is currently being programmed. Data collection is planned for early December, and results are expected by early February 2026. We expect that extra incentives, follow-up prompts, and reminders will all increase photo submission rates, and that announcing the incentive at the initial request will be more effective. We will test this by comparing different participation indicators, such as break-off or item nonresponse defined in different ways, across experimental groups and across both conventional and visual formats.

Added Value: This study provides evidence on how different strategies might help improve participation in photo requests. The findings will help improve the design of online surveys that combine textual and visual data while balancing respondent burden and data richness.

Video-Interviews in Mixed-Mode Panel Surveys: Selective Feasibility and Data Quality Trade-offs
1 DIW Berlin, Germany; 2 GESIS Leibniz Institute for the Social Sciences, Germany; 3 Humboldt University of Berlin, Germany

Relevance & Research Question: As survey methodologists seek innovative approaches to address declining response rates and evolving technological landscapes, computer-assisted live video interviewing (CALVI) emerges as a promising hybrid mode, combining the personal interaction of computer-assisted personal interviews (CAPI) with the convenience of remote participation offered by computer-assisted web interviews (CAWI). This study addresses the critical question: Is CALVI a feasible and useful add-on in a German general population mixed-mode household panel survey? Understanding CALVI's viability is essential for survey researchers considering technological adaptations to maintain data quality while accommodating respondent preferences.

Methods & Data: We implemented CALVI using a randomized controlled experimental design within the 2024 Innovation Sample of the Socio-Economic Panel Study (SOEP-IS): (E1) 1,261 households from the established panel (previously interviewed in CAPI) were randomly assigned to CALVI versus (C1) 1,106 households continuing with traditional face-to-face interviews; and (E2) 409 households from the 2023 refreshment sample (recruited in CAWI) were assigned to CALVI versus (C2) 1,513 households continuing with web self-completion.
Both experimental groups retained fallback options to their previous data collection modes. We analyzed participation rates, technical implementation success, and data quality indicators.

Results: Among 1,670 households invited to CALVI, 376 video interviews were conducted. Technical implementation proved largely successful, though 23% of interviews experienced connectivity issues, particularly when interviewers worked remotely. Unit nonresponse among CALVI-invited households was significantly higher than in the established modes, with pronounced effects among former CAWI participants (response rate in E1: 56% vs. C1: 61%; E2: 45% vs. C2: 74%). CALVI attracted specific demographic segments: young, highly educated, high-income, full-time employed urban participants from West Germany with reliable internet access. However, CALVI participants demonstrated superior data quality metrics, including the lowest speeding rates, minimal item nonresponse, and the highest data linkage consent rates.