Designing passwords for web survey access: The effect of password length and complexity on survey and panel recruitment
Georg-Christoph Haas, Marieke Volkert, Stefan Zins
Institute for Employment Research, Germany
Relevance & Research Question
For online probability surveys that recruit participants via postal invitation letters, passwords are used to manage access to the survey. These passwords serve several purposes, such as blocking uninvited individuals and preventing multiple submissions from the same individual. Research on web survey passwords has primarily focused on whether providing a password for survey access affects response rates. However, the chosen password strength, that is, its length and complexity, may also affect response propensities. Password length refers to the number of characters in a password. Password complexity refers to the set of characters from which the password is drawn (e.g., lowercase letters and numbers). Our research evaluates the effect of password length and complexity on survey access, response, panel registration, and linkage consent rates.
Methods & Data
We implemented a survey experiment by varying password length and complexity during the first wave of a general population online survey. For recruitment, each invited individual received a postal invitation letter with a web link and a QR code directing them to the survey, along with an individualized password. We conducted a 2×2 experiment that manipulated password length (five vs. eleven characters) and complexity (uppercase letters only vs. uppercase + lowercase letters + numbers). Additionally, we included a control group that used the default length and complexity settings of the service hosting the survey (eight uppercase letters). Invited individuals were randomly assigned to one of these five groups across two different probability samples: employees (N=77,173) and welfare recipients (N=99,176). The sketch below illustrates the five password conditions.
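To make the experimental conditions concrete, the following minimal Python sketch generates an example password for each of the five groups. It assumes each character is drawn uniformly at random from the condition's character set; the abstract does not specify the actual generation procedure, and the group labels are ours.

```python
import secrets
import string

# Character pools for the two complexity conditions (assumption: the abstract
# names the character sets but not the exact generation procedure).
SIMPLE = string.ascii_uppercase                                    # uppercase only
COMPLEX = string.ascii_uppercase + string.ascii_lowercase + string.digits

# Five experimental groups: the 2x2 design (length x complexity) plus the
# control group using the hosting service's default (eight uppercase letters).
# Group names are hypothetical labels for illustration.
CONDITIONS = {
    "short_simple":  (5, SIMPLE),
    "short_complex": (5, COMPLEX),
    "long_simple":   (11, SIMPLE),
    "long_complex":  (11, COMPLEX),
    "control":       (8, SIMPLE),
}

def generate_password(length: int, charset: str) -> str:
    """Draw each character uniformly at random from the given charset."""
    return "".join(secrets.choice(charset) for _ in range(length))

if __name__ == "__main__":
    for name, (length, charset) in CONDITIONS.items():
        print(f"{name:14s} {generate_password(length, charset)}")
```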
Results
Results show that both short and long passwords increase the access rate compared to the control group (16.7% and 19.2% vs. 14.9%). The positive effects of the password designs persist for response and panel registration rates. We also find that long passwords have a positive effect on the propensity to consent to linking survey data with administrative data.
Added Value
Our research sheds light on an often-overlooked aspect of postal survey invitations for web surveys: password design. Our talk shows how researchers can strategically design survey passwords, potentially influencing not only survey response rates but also other data quality indicators.
Using panelists' self-stated motivations to craft efficient targeted email invitations to an online probability panel
Georg-Christoph Haas1, Benjamin Baisch1, Mark Trappmann1,2, Jonas Weik1
1Institute for Employment Research, Germany; 2University of Bamberg
Relevance & Research Question
In online panels, emails are a crucial element for recruiting respondents. Email invitations may substantially affect panelists' perception of the study's relevance, potentially influencing both response rates and sample composition. Previous research has examined the use of targeted appeals, where the wording of the invitation letter varies across pre-identified subgroups. In our study, we divide panelists into subgroups based on their self-stated motivations to participate. We then use these self-stated motivations to craft an appealing email invitation for a subsequent wave. Based on leverage-saliency theory, emphasizing the self-stated motivation in the email invitation (saliency) should have a positive effect on panelists' response propensity, enhancing cooperation as well as reducing attrition within the panel. Our design enables us to answer the question: Do targeted invitations based on panelists' self-stated motivations from a previous wave increase response rates in a subsequent wave?
Methods & Data
We implemented a survey experiment in a German online probability panel, the IAB-OPAL. In wave 3, we asked 10,246 panelists to state their main motivation for participation, choosing among seven different motivations: topic, incentive, giving opinion, informing politics, curiosity, helping science, feeling obligated. In wave 4, we randomly assigned panelists either to the standard invitation or to an invitation aligned with their self-stated motivation. The treatment varied both the subject line and the motivational text of the email.
Results
Results show that our treatment neither improved cooperation nor reduced attrition within the panel. On the contrary, for the motivations “giving opinion” and “informing politics”, aligning the wording of the invitation email with panelists' self-stated motivations from the previous wave reduced response rates compared to the standard invitation email.
Added Value
Aligning panel communication with panelists' underlying motivations promises to enhance cooperation and reduce attrition within the panel. At least for email invitations, our results break with this promise and are unexpected under the underlying leverage-saliency theory. We therefore find it worthwhile to share and discuss these results with experienced colleagues in the online survey community.
Backing up a Panel with Piggybacking – The Effect of Piggybacking Recruitment on Nonresponse Bias and Panel Attrition in a Mixed-Mode Panel Survey
Björn Rohr
GESIS - Leibniz Institute for the Social Sciences, Germany
Relevance & Research Question
Sampling and recruiting respondents for (online) probability-based panels can be very expensive. One cost-intensive aspect of the process is drawing a separate sample and recruiting the respondents offline. To reduce the cost of panel recruitment, some mixed-mode or online panels (e.g., the GESIS Panel, the German Internet Panel, and the NatCen Panel) have relied on piggybacking for some of their recruitment or refreshment samples. Piggybacking means that participants for the panel are recruited at the end of another probability survey, so that no additional sample has to be drawn. Although this reduces the cost of panel recruitment, it might also introduce additional nonresponse. My research analyzes whether this higher nonresponse also translates into greater bias in practical applications of a piggybacked survey. In addition to the bias for the initially recruited panelists, I also investigate the effect piggybacking has on panel attrition.
Methods & Data
To answer the research question, I use the GESIS Panel, a panel survey that was initially recruited in 2013 (n = 4,961) from a separate sample but later refreshed three times via piggybacking (n = 1,710, 1,607, 764). This setting allows me to compare the bias of both recruitment types against each other and to disentangle the nonresponse bias introduced by piggybacking from regular nonresponse bias. To estimate the bias of the separate recruitment waves, I use the German Microcensus as a benchmark. Bias is measured as the relative bias for demographic and job-related variables, as well as the difference in Pearson's r between benchmark and survey estimates.
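For concreteness, the two bias measures can be written out as follows; this is a sketch based on the standard definitions, as the abstract does not spell out the exact estimators. With $\hat{\theta}$ a survey estimate and $\theta_{MC}$ the corresponding Microcensus benchmark:

\[
\mathrm{RelBias}(\hat{\theta}) = \frac{\hat{\theta} - \theta_{MC}}{\theta_{MC}},
\qquad
\Delta r = r_{\mathrm{survey}}(x, y) - r_{MC}(x, y),
\]

where $r(x, y)$ denotes Pearson's correlation between two variables $x$ and $y$, computed in the survey and in the benchmark, respectively.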
Results
Initial results show that piggybacking significantly increased the number of nonrespondents compared to separate recruitment. As I am currently preparing the data for the bias analyses, results regarding the bias will be added later.
Added Value
My work will give researchers a better understanding of the bias introduced through piggybacking and of whether this method is still a useful tool to reduce the cost of a probability (panel) survey without introducing high amounts of nonresponse bias.