Conference Agenda

Overview and details of the sessions of this conference. Please select a date or location to show only the sessions on that day or at that location. Select a single session for a detailed view (with abstracts and downloads, if available).

Please note that all times are shown in the time zone of the conference.

Session Overview
Session
PSG 22 - Behavioural Public Administration
Time:
Friday, 29/Aug/2025:
11:00am - 12:00pm

Session Chair: Dr. Joris VAN DER VOET, Leiden University

Presentations

Eye-tracking in Public Administration: an introduction and research agenda

Wouter LAMMERS

KU Leuven, Belgium

Measuring attention with an eye-tracker holds significant potential for Public Administration research, and the method is gaining traction in the field as it becomes more accessible. However, there is no systematic overview of this potential, leaving limited guidance for researchers interested in applying it. This article proposes an eye-tracking research agenda and reviews theoretical foundations, research questions, and methodological considerations specific to the discipline. Recording how bureaucrats attend to information aligns well with long-standing theories of Public Administration, especially those rooted in behavioral approaches. Limited attention span is one of the core aspects of bounded rationality: bureaucrats cannot attend to all information and must be selective (March & Simon, 1993; Van Knippenberg et al., 2015). Individual attention allocation affects which issues determine the agenda, which information (and from whom) influences decision-making, and what remains ignored (Jones & Baumgartner, 2004). Bureaucrats do not consume information in isolation: they operate in organizations and environments that shape what gets their attention (Ocasio, 2011). Eye-tracking makes it possible to open this ‘black box’ and to measure not only the determinants and effects of who pays attention to what but also variables such as emotional arousal and depth of understanding. It can provide precise insights into the workings of the object under study: modern government is drenched in information, textual or otherwise, that is consumed through the eyes of bureaucrats. This article a) reviews the current state of eye-tracking research in Public Administration, b) maps theoretical approaches to bureaucratic attention and corresponding research questions, and c) highlights methodological as well as practical considerations.

References

Jones, B. D., & Baumgartner, F. R. (2004). A Model of Choice for Public Policy. Journal of Public Administration Research and Theory, 15(3), 325–351. https://doi.org/10.1093/jopart/mui018

March, J. G., & Simon, H. A. (1993). Organizations. John Wiley & Sons.

Ocasio, W. (2011). Attention to Attention. Organization Science, 22(5), 1286–1296. https://doi.org/10.1287/orsc.1100.0602

Van Knippenberg, D., Dahlander, L., Haas, M. R., & George, G. (2015). Information, Attention, and Decision Making. Academy of Management Journal, 58(3), 649–657. https://doi.org/10.5465/amj.2015.4003
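The abstract above centers on recording where readers' eyes dwell. As a purely illustrative sketch (not part of the presentation; the fixation data and area-of-interest names are made up), this is roughly how raw eye-tracker fixations can be aggregated into dwell time per area of interest (AOI):

```python
from collections import defaultdict

# Hypothetical fixation log from an eye-tracker: (x, y, duration_ms) tuples.
fixations = [
    (120, 80, 250), (130, 90, 180),    # near the memo header
    (400, 300, 600), (410, 310, 420),  # in the key-figures table
    (120, 500, 150),                   # footer, outside all AOIs
]

# Hypothetical areas of interest: name -> (x_min, y_min, x_max, y_max).
aois = {
    "header": (100, 50, 300, 150),
    "table": (350, 250, 600, 450),
}

# Sum fixation durations falling inside each AOI (dwell time per region).
dwell = defaultdict(int)
for x, y, dur in fixations:
    for name, (x0, y0, x1, y1) in aois.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            dwell[name] += dur

print(dict(dwell))  # -> {'header': 430, 'table': 1020}
```

A real analysis would rely on a dedicated eye-tracking toolkit and handle calibration, saccade filtering, and overlapping AOIs; the point here is only that attention to a document region becomes a measurable quantity.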



Give Me Attitude: Making Smart Use of Structural Equation Modeling and Other Tools When Analyzing Survey Data

Nathan FAVERO

American University, United States of America

Many quantitative social scientists were trained primarily in the tools of either econometric-style regression or psychology-inspired structural equation modeling (SEM). Research in public administration and related fields regularly draws on both traditions of modeling, posing practical difficulties for many researchers navigating these literatures. For example, behavioral public administration researchers trained in econometric tools may find themselves regularly reading and citing organizational behavior studies that employ SEM, despite little or no formal training in such models.

The present study aims to inform readers about both approaches (econometric and psychometric) while advancing our understanding of their similarities and differences through careful comparison. I argue that in many cases, econometric and SEM tools can yield similar substantive conclusions, though the workflows associated with each tend to emphasize different aspects of analysis. SEM encourages researchers to think carefully about mediation and measurement, while offering greater flexibility in correcting for measurement error. Still, real-world measurement error is typically even more complicated than what SEM can account for. Econometric tools offer more options for dealing with measurement error than many researchers realize, and econometric models offer more flexibility in the number and type of variables that can realistically be analyzed.

This study begins by identifying key challenges associated with studying attitudes and other hard-to-measure constructs. It then provides a brief overview of the SEM framework, which is compared to a typical workflow under an econometric-based analysis of attitudinal data. Next, I offer a practical set of guidelines for analyzing hard-to-measure constructs—guidelines which can be used by researchers regardless of which disciplinary tradition of statistical modeling they utilize. Finally, the prior topics are further elaborated on through re-analysis of two datasets used in previously published public administration studies examining variables of interest to behavioral researchers (e.g., job satisfaction, employee engagement).
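The measurement-error problem the abstract alludes to can be shown in a few lines. The following sketch (not from the paper; all numbers are simulated) demonstrates how classical measurement error in an attitudinal indicator biases a naive regression slope toward zero, and how the textbook errors-in-variables correction, dividing by a known reliability, recovers the true coefficient:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Latent attitude (the true construct) and an outcome it drives (slope 0.5).
attitude = rng.normal(size=n)
y = 0.5 * attitude + rng.normal(size=n)

# Observed indicator = latent + noise; reliability = var(latent) / var(observed).
noise_sd = 1.0
x_obs = attitude + rng.normal(scale=noise_sd, size=n)
reliability = 1.0 / (1.0 + noise_sd**2)  # 0.5 in this simulation

# Naive OLS slope of y on the noisy indicator is attenuated toward zero.
b_naive = np.cov(x_obs, y)[0, 1] / np.var(x_obs)

# Classic errors-in-variables fix: divide by the (known) reliability.
b_corrected = b_naive / reliability

print(round(b_naive, 2), round(b_corrected, 2))
```

With a reliability of 0.5, the naive slope lands near 0.25 while the corrected slope recovers roughly 0.5. In practice the reliability must itself be estimated (e.g., from multiple indicators), which is exactly where the SEM and econometric workflows diverge.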



Anchoring-Based Causal Design (ABCD): Estimating the Effects of Beliefs

Raanan SULITZEANU-KENAN, Micha MANDEL, Yosef RINOTT

Hebrew University

A central challenge in any study of the effects of beliefs on outcomes, such as choices and behaviors, is the risk of omitted variable bias. To address this, information provision experiments are often used to randomly treat beliefs and (perceived) constraints. However, providing differing information to participants in order to alter their beliefs often raises methodological and ethical concerns. Methodological concerns stem from potential violations of the information equivalence assumption and from source influences; the need to vary information across experimental conditions presents an ethical risk of deception. This paper proposes and empirically demonstrates a new method for treating beliefs and estimating their effects – the anchoring-based causal design (ABCD) – which avoids deception and source influences. ABCD combines the cognitive mechanism known as anchoring (Tversky & Kahneman, 1974) with instrumental variable (IV) estimation. We present the method and the results of eight experiments demonstrating its application, strengths, and limitations. We conclude by discussing the potential of this design for advancing experimental social science.
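To make the IV logic concrete: the sketch below (hypothetical; not the authors' code, data, or estimator) simulates a confounded belief-behavior relationship and shows how a randomly assigned anchor, used as an instrument for the belief, recovers the true effect where naive regression does not:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000

# Unobserved confounder drives both belief and behavior (omitted variable bias).
confound = rng.normal(size=n)

# Randomly assigned anchor (high vs. low) nudges the reported belief.
anchor = rng.integers(0, 2, size=n).astype(float)
belief = 0.4 * anchor + 0.8 * confound + rng.normal(size=n)

# Behavior depends on belief (true effect = 0.6) and on the confounder.
behavior = 0.6 * belief + 0.8 * confound + rng.normal(size=n)

# Naive OLS of behavior on belief is biased upward by the confounder.
b_ols = np.cov(belief, behavior)[0, 1] / np.var(belief)

# Wald/IV estimator: the random anchor shifts behavior only through belief,
# so the ratio of reduced-form to first-stage covariances isolates the effect.
b_iv = np.cov(anchor, behavior)[0, 1] / np.cov(anchor, belief)[0, 1]

print(round(b_ols, 2), round(b_iv, 2))
```

Here the naive slope overshoots (around 0.98 in this simulation) while the IV estimate sits near the true 0.6. The design's validity rests on the exclusion restriction: the anchor must affect behavior only through the belief it shifts.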