Conference Agenda

Overview and details of the sessions of this conference. Please select a date or location to show only the sessions held on that day or at that location. Please select a single session for a detailed view (with abstracts and downloads, if available).

Session Overview
Session
A3.2: Survey Instruments
Time:
Thursday, 22/Feb/2024:
3:45pm - 4:45pm

Session Chair: Cornelia Neuert, GESIS Leibniz Institute for the Social Sciences, Germany
Location: Seminar 3 (Room 1.03/1.04)

Rheinische Fachhochschule Köln, Campus Vogelsanger Straße, Vogelsanger Str. 295, 50825 Cologne, Germany

Presentations

Unmapped potentials: Measuring and considering the self-defined residential area of individuals

Maximilian Sprengholz1, Zerrin Salikutluk2, Christian Hunkler3

1Humboldt-Universität zu Berlin; 2DeZIM-Institut, Humboldt-Universität zu Berlin; 3Humboldt-Universität zu Berlin

Relevance & Research Question

Many research questions would benefit from information about individuals’ self-defined residential area, i.e., where they spend their daily lives. However, the location information collected (or available to researchers given pseudonymization requirements) typically refers to larger aggregates, such as the postcode area. Even if respondents’ addresses were available, we still would not know where they actually run their errands, go to the doctor, or spend family time on the playground. Yet this is the area that substantially affects their lives, and often the area they care about most. We collected data on self-defined residential areas for research on anti-Muslim racism and examine the determinants of opposition to the opening of new Muslim-read establishments in those areas.

Methods & Data

As part of a representative online survey in Germany implemented by Kantar (n = 17,500), respondents drew their residential area as a polygon on a map using the open map tools offered by OpenStreetMap, Leaflet, and Leaflet Draw. Besides a wide range of socio-demographic and attitudinal measures, we also asked about several aspects of the self-defined residential area, e.g., the number of mosques and of Turkish/Arabic-read restaurants and supermarkets. Moreover, we asked whether respondents or their neighbors would oppose new establishments being built or opened. We then merged in the actual number of establishments in each drawn area, fetched via the Google Places API.
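The abstract does not include implementation code; the Python sketch below only illustrates the kind of merging step described, counting establishments of a given type that fall inside a respondent-drawn polygon via the Google Places Nearby Search endpoint. The API key, search radius, example coordinates, and the helper name count_places_in_polygon are assumptions, and pagination and quota handling are omitted.

```python
# Hedged sketch (not the authors' code): count Google Places results of a given
# type that lie inside a respondent-drawn residential polygon.
import requests
from shapely.geometry import Point, Polygon

GOOGLE_API_KEY = "YOUR_KEY"  # placeholder, not from the study
NEARBY_URL = "https://maps.googleapis.com/maps/api/place/nearbysearch/json"

def count_places_in_polygon(polygon: Polygon, place_type: str) -> int:
    """Query places around the polygon centroid and keep those inside the polygon."""
    centroid = polygon.centroid                      # shapely points are (x=lon, y=lat)
    params = {
        "location": f"{centroid.y},{centroid.x}",    # API expects "lat,lng"
        "radius": 5000,                              # metres; assumed to cover the drawn area
        "type": place_type,                          # e.g. "mosque"
        "key": GOOGLE_API_KEY,
    }
    results = requests.get(NEARBY_URL, params=params, timeout=10).json().get("results", [])
    # Keep only results whose coordinates fall inside the drawn polygon
    return sum(
        polygon.contains(Point(r["geometry"]["location"]["lng"],
                               r["geometry"]["location"]["lat"]))
        for r in results
    )

# Hypothetical respondent-drawn area as (lon, lat) pairs (roughly central Cologne)
area = Polygon([(6.90, 50.93), (6.98, 50.93), (6.98, 50.97), (6.90, 50.97)])
print(count_places_in_polygon(area, "mosque"))
```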

Results [preliminary; new results expected by Nov 30]

Our results show that about 40 percent of respondents drew a plausible residential area (validated by postcode, shape, and size). First analyses indicate that respondents particularly oppose the opening of new Muslim-read establishments if there are none or very few in their residential area (e.g., not a single mosque).

Added Value

Although still preliminary, it appears that collecting information about respondents’ residential areas works reasonably well in online surveys with tools already available. Once the area information is collected, it is easy to add point-referenced geographical data, e.g., from Google Places or OpenStreetMap, and crosswalks can be used to add data corresponding to other geographical units – all of which may offer valuable additional perspectives.
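As a rough sketch of this linkage step (not taken from the study): assuming the drawn areas and some point-referenced data are available as GeoDataFrames, a spatial join adds point counts per area, and an area-weighted overlay yields crosswalk weights to another geography. All file and column names below are hypothetical.

```python
import geopandas as gpd

# Respondent-drawn polygons and point-referenced data (hypothetical files);
# reproject to a metric CRS so that areas are meaningful.
areas = gpd.read_file("drawn_areas.geojson").to_crs(epsg=3035)
points = gpd.read_file("osm_supermarkets.geojson").to_crs(epsg=3035)

# Point-referenced data: count points falling inside each drawn area
joined = gpd.sjoin(points, areas, predicate="within")
areas["n_supermarkets"] = joined.groupby("index_right").size().reindex(areas.index, fill_value=0)

# Crosswalk to another geography (e.g. postcode areas) via area-weighted overlay
postcodes = gpd.read_file("postcodes.geojson").to_crs(epsg=3035)
areas["area_id"] = areas.index
pieces = gpd.overlay(areas[["area_id", "geometry"]], postcodes, how="intersection")
pieces["piece_area"] = pieces.geometry.area
pieces["weight"] = pieces["piece_area"] / pieces.groupby("area_id")["piece_area"].transform("sum")
# 'weight' can then be used to aggregate postcode-level statistics to each drawn area.
```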



Partnership biographies in self-administered surveys: The effect of screening-in information on survey outcomes

Lisa Schmid, Theresa Nutz, Irina Bauer

GESIS – Leibniz Institute for the Social Sciences, Germany

Relevance & Research Question

Many cross-sectional and panel studies collect retrospective biographical information, covering areas such as educational and occupational careers, fertility biographies, or former intimate relationships. In interviewer-administered surveys, the use of event history calendars (EHCs) for collecting biographical data retrospectively is well established and has been shown to reduce recall effects and thus improve data quality. However, to cope with rising survey costs, recent survey programs are increasingly conducted in self-administered modes, e.g., as web or mail surveys, or combine both in a mixed-mode approach. Additionally, the share of participants using their smartphone to respond to surveys is high and rising in self-administered modes. The lack of an interviewer, as well as the display of questionnaires on small screens, impedes the use of established EHCs in self-administered surveys and at the same time increases the need for user-friendly survey tools and less complex questions. Based on data from a survey experiment, we test whether visual feedback in complex survey modules improves the data quality of biographical information, i.e., relationship biographies. We set out to investigate:

(1) Is the display of information from previous questions on partnership(s) related to survey outcomes such as non-response and interview duration?

(2) Does the number of partnerships and the partnership duration vary with the display of information from previous questions?

(3) Is the display of information from previous questions related to respondent burden?

Methods & Data

We run ANOVA models on a sample of 3,446 respondents from a web survey conducted in December 2022. Within this survey, we vary the display of the question list on relationships, using information from previous questions on partnerships as visual feedback. As visual feedback, we use the names of respondents’ (ex-)partners, their relationship status, and the respective dates. The control group runs through the question lists without visual feedback.
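For illustration only (not the authors' code), a one-way ANOVA of this kind could be run in Python with statsmodels as sketched below; the data file and the column names duration and condition are assumptions.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

df = pd.read_csv("survey_experiment.csv")   # hypothetical export: one row per respondent

# One-way ANOVA: does interview duration differ between the feedback and control group?
model = ols("duration ~ C(condition)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))

# The same pattern applies to other outcomes, e.g. the number of reported
# relationships or a perceived-burden score.
```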

Results

Preliminary results do not show differences between the experimental and control group regarding non-response and interview duration. However, our results hint at differences in the number of relationships reported and respondents' perceived burden.

Added Value

Our study adds knowledge on how complex survey modules can be conducted without the presence of an interviewer.



Considering Respondents’ Preferences: The Effects of Self-Selecting the Content in Web Survey Questionnaires

Katharina Pfaff, Sylvia Kritzinger

Universität Wien, Austria

Relevance & Research Question

Previous research has shown that the willingness to participate in surveys increases with the individual salience of the survey. In practice, this is usually taken into account by presenting the survey topic and its relevance in invitation letters, in the hope that this motivates a large and representative group to participate. In this study, we investigate how the willingness to participate in a survey and respondents’ survey satisfaction change when respondents can choose from different topics.

Methods & Data

Our exploratory data analysis compares the response rate, panel recruitment, and survey experience of 2,735 respondents. Descriptive statistics and Pearson's chi-square test were used for the analysis. The sample is stratified by region and was recruited offline from the Austrian Central Population Register. Respondents are randomly assigned to one of two questionnaire designs. In one design, the number and order of topic blocks (modules) are predetermined. Respondents assigned to the other design can decide how many modules they would like to answer and in what order. Among other things, the analysis examines the number, thematic preferences, and order of the selected modules. We also evaluate respondents' verbal feedback regarding the option to select modules.
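As a minimal illustration of the reported chi-square comparison (not the authors' code), the sketch below tests whether participation differs between the two questionnaire designs; the data file and the column names design and participated are assumptions.

```python
import pandas as pd
from scipy.stats import chi2_contingency

df = pd.read_csv("gross_sample.csv")        # hypothetical: one row per invited person

# 2 x 2 table: questionnaire design (fixed vs. self-selected) by participation
table = pd.crosstab(df["design"], df["participated"])
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
```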

Results

Announcing that respondents can choose among different survey topics does not attract more or different respondents compared to an invitation letter in which this option is not mentioned. There is also no difference regarding panel recruitment. Yet the share of respondents who are very satisfied with the survey is higher among those who answer more modules than required for the incentive. Answers to an open-ended feedback question mirror this satisfaction. While modules on politics are selected less often than others, most respondents choose all six modules.

Added Value

This study examines the effect of a dynamic online survey design in which respondents can flexibly select, at least partially, the content of their survey. Although in practice it may not always be possible to adjust the entire survey’s content to every respondent’s preferences, the study highlights the advantages and disadvantages of letting respondents choose parts of the survey.



 