Conference Agenda

Overview and details of the sessions of this conference.

Session Overview
Session: A2: Mixing Survey Modes
Time: Thursday, 22/Feb/2024, 12:00pm - 1:15pm

Session Chair: Jessica Daikeler, GESIS, Germany
Location: Seminar 1 (Room 1.01)

Rheinische Fachhochschule Köln, Campus Vogelsanger Straße, Vogelsanger Str. 295, 50825 Cologne, Germany

Presentations

Navigating the Digital Shift: Integrating Web in IAB (Panel) Surveys

Jan Mackeben

Institut für Arbeitsmarkt- und Berufsforschung, Germany

Relevance & Research Question

In the realm of social and labor market research, a noteworthy transformation has unfolded over the past few years, marking a departure from conventional survey methods. Traditionally, surveys were predominantly conducted through telephone interviews or face-to-face interactions. These methods, while effective, were time-consuming and resource-intensive. However, with the rapid advancement of technology, there has been a significant paradigm shift towards utilizing online modes for data collection.

The emergence of the web mode has revolutionized the landscape of surveys, offering a more efficient and cost-effective means of gathering information. Online surveys provide researchers with a broader reach, enabling them to engage with diverse populations across geographical boundaries. Moreover, the convenience and accessibility of web-based surveys have contributed to increased respondent participation.

As we navigate the digital age, the web mode has become increasingly integral in shaping the methodologies of social and labor market research. Its versatility, speed, and ability to cater to a global audience underscore its growing importance in ensuring the accuracy and comprehensiveness of data collection in these vital fields.

Methods & Data

In this paper, we focus on the largest panel surveys conducted by the Institute for Employment Research (IAB). These include the Panel Study Labour Market and Social Security (PASS), the IAB Establishment Panel (IAB-EP), the Linked Personnel Panel (LPP), consisting of both employer and employee surveys, and the IAB Job Vacancy Survey. Historically, all of these surveys employed traditional data collection methods. In recent years, however, they have all undergone a transition by incorporating or testing the inclusion of the web mode.
Results

In the presentation, I will provide an update on each survey's current status, illustrating how the web mode has been integrated and examining its impact on response rates and sample composition.

Added Value

The incorporation of the web mode in key Institute for Employment Research panel surveys is crucial in the digital age. This transition enhances efficiency, reduces costs, and broadens participant diversity, ensuring studies remain methodologically robust and adaptable to the evolving digital landscape.



Effect of Incentives in a Mixed-Mode Survey of Movers

Manuela Schmidt

University of Bonn, Germany

Relevance & Research Question

The use of incentives to reduce unit nonresponse in surveys is an established and effective practice. Prepaid incentives have been shown to increase participation rates, especially in postal surveys. As surveys keep moving online and response rates keep dropping, the use of incentives and their differential effects across survey modes need further investigation.

In our experiment, we investigate the effects of both survey mode and incentives on participation rates in a postal/web mixed-mode survey. In particular, we aim to answer the following questions:

i) In which sociodemographic groups do incentives work (particularly well)?

ii) Is the effect of incentives affected by survey mode?

iii) How does data quality differ between incentivized and non-incentivized participants?

Methods & Data

Our data is based on a random sample of all residents who moved from two neighborhoods of Cologne, Germany, between 2018 and 2022. Addresses were provided by the city's Office for Urban Development and Statistics. We were also provided with the age and gender of all selected residents as reported on their official registration.

For the experiment, we randomly selected 3000 persons. Of these, 2000 received a postal invitation to a web survey, while 1000 received a paper questionnaire with the option to participate online. In both groups, 500 persons were randomly selected to receive a prepaid incentive of 5 euros in cash with the postal invitation.
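
As an illustration of this assignment scheme, the following Python sketch reproduces the sampling design described above; the random seed, identifiers, and column names are assumptions made for the example and are not taken from the study.

    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(2024)  # seed chosen for the example only

    # 3000 sampled movers: 2000 invited to the web survey, 1000 sent a paper
    # questionnaire with a web option (IDs and labels are hypothetical).
    sample = pd.DataFrame({"person_id": np.arange(3000)})
    sample["mode_group"] = np.repeat(
        ["web_invite", "paper_with_web_option"], [2000, 1000]
    )

    # Within each mode group, 500 persons receive the prepaid 5-euro cash incentive.
    sample["incentive"] = False
    for _, idx in sample.groupby("mode_group").groups.items():
        chosen = rng.choice(idx, size=500, replace=False)
        sample.loc[chosen, "incentive"] = True

    print(sample.groupby(["mode_group", "incentive"]).size())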

Results

Our design yielded a good response rate of around 35% overall (47% with incentives and 26% without), and over 80% of respondents participated in the online mode. Because we have information on the age and gender of the whole sample, including non-respondents, we will present detailed analyses of the effectiveness of incentives and their possible effects on data quality, measured by the share of “non-substantive” answers, response styles, and the amount of information provided in open-ended questions.
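
To make one of these data quality measures concrete, the sketch below computes the share of non-substantive answers per case and compares it between incentive groups; the response codes (-97/-98) and variable names are hypothetical and only stand in for the actual codebook.

    import pandas as pd

    # Hypothetical coding: "don't know" and refusal stored as -97 and -98.
    NON_SUBSTANTIVE = {-97, -98}

    responses = pd.DataFrame({
        "incentive": [True, True, False, False],
        "q1": [1, -97, 3, -98],
        "q2": [2, 4, -97, -98],
        "q3": [5, 1, 2, 3],
    })

    items = ["q1", "q2", "q3"]
    responses["share_nonsubstantive"] = (
        responses[items].isin(NON_SUBSTANTIVE).mean(axis=1)
    )

    # Average share of non-substantive answers by incentive condition
    print(responses.groupby("incentive")["share_nonsubstantive"].mean())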

Added Value

With this paper, we contribute to the literature on the effects of incentives, particularly with respect to the comparison of survey modes. Because our sample is drawn from official registration data and we have reliable information on non-respondents, our results on the effects of incentives are particularly reliable.



Mode Matters Most, Or Does It? Investigating Mode Effects in Factorial Survey Experiments

Sophie Katharina Hensgen1, Alexander Patzina2, Joe Sakshaug1,3

1Institute for Employment Research, Germany; 2University of Bamberg, Germany; 3Ludwig-Maximilians University Munich, Germany

Relevance & Research Question

Factorial survey experiments (FSEs), such as vignette studies, have grown in popularity because they have proven advantageous for collecting opinions on sensitive topics. FSEs are generally conducted via self-administered interviews so that participants can fully understand and assess the given scenario. However, many establishment panels, such as the BeCovid establishment panel in Germany, rely on interviewer-administered data collection (e.g., telephone interviews) but could also benefit from using FSEs when collecting opinions on more sensitive topics. This raises the question of whether FSEs conducted via telephone yield results similar to those of web-based interviews. Furthermore, it is of great interest whether these modes differ in response behavior for FSEs, such as straightlining, extreme responding, or item nonresponse.

Methods & Data

To shed light on this issue, a mode experiment was conducted in the BeCovid panel in which a random subset of telephone respondents was assigned to complete a vignette module online (versus continuing in the telephone mode). Respondents were given a set of four vignettes varying in six dimensions, followed by two follow-up questions regarding the person’s success in the application process. In addition to various descriptive analyses, we run multilevel regressions (random intercept models) to account for the nesting of vignettes within respondents.
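
For readers less familiar with this modeling approach, a minimal sketch of a random-intercept specification (vignette ratings nested within respondents) using statsmodels is shown below; the synthetic data and variable names are assumptions for illustration and do not reflect the BeCovid questionnaire.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Synthetic stand-in data: 500 respondents, 4 vignettes each, with one of
    # the six experimentally varied dimensions shown as a binary factor.
    rng = np.random.default_rng(0)
    n_resp, n_vignettes = 500, 4
    df = pd.DataFrame({
        "respondent_id": np.repeat(np.arange(n_resp), n_vignettes),
        "mode": np.repeat(rng.choice(["CATI", "web"], size=n_resp), n_vignettes),
        "dim1": rng.integers(0, 2, size=n_resp * n_vignettes),
    })
    df["rating"] = (
        3.0
        + 0.5 * df["dim1"]
        + 0.2 * (df["mode"] == "CATI")
        + rng.normal(0, 1, size=len(df))
    )

    # Random-intercept model: ratings (level 1) nested within respondents (level 2)
    model = smf.mixedlm("rating ~ mode + dim1", data=df, groups=df["respondent_id"])
    print(model.fit().summary())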

Results

The analysis shows no overall difference in the results of the random intercept model when controlling for mode. However, there are significant differences between the modes regarding specific dimensions of the vignette that could be described as sensitive. Furthermore, CATI shows an increase in straightlining as well as extreme responding, but no influence on the probability of acquiescence bias or central tendency bias. Lastly, respondents interviewed via telephone produce more item nonresponse.

Added Value

This study shows that conducting FSEs through telephone interviews is feasible, but is associated with certain limitations. Depending on the subject matter, these interviews might fail to accurately capture genuine opinions, instead reflecting socially accepted responses. Additionally, they may result in diminished data quality due to satisficing and inattention.



 