Conference Agenda

Overview and details of the sessions of this conference.

 
 
Session Overview
Session
B4S2_PP: Teaching Assessment and Academic Writing Support
Time:
Tuesday, 23/Sept/2025:
1:45pm - 3:25pm

Session Chair: Olivier Le Deuff
Location: MG2/00.10

Parallel session; 80 persons

Presentations
1:45pm - 2:00pm

Crossing Information Literacy Thresholds: A New Model for Bridging the Novice-Expert Research Gap

Amanda L. Folk

The Ohio State University, United States of America

Information literacy is more than being able to find an answer to a question. This is especially true in postsecondary education, in which many students are asked to consider ill-structured problems, or problems that do not have a simple right or wrong answer. Students are asked to form arguments and then find evidence to support their claims. Students are not simply consuming information; they are interacting with knowledge with the intent to create knowledge. As such, students’ relationships with and beliefs about knowledge are important for how they seek out, evaluate, select, and use knowledge, as well as whether and how they position themselves as knowledge creators. Conceptualizations of information literacy have evolved to incorporate more complex understandings of the concept, and the ACRL Framework for Information Literacy for Higher Education is an example of this. However, as a profession, we have not deeply examined the relationship between information literacy and epistemological development. The connection between epistemological development and information literacy is not new (Anderson & Johnston, 2016; Jackson, 2007; Matteson et al., 2021; Swanson, 2006; Whitmire, 2004). However, there has been little exploration of the relationship between epistemological development and the threshold concepts articulated in the Framework. The threshold concepts and their attendant knowledge practices and dispositions represent expert ways of engaging with information and knowledge, aside from a few mentions of what novice learners might be able to do. The Framework does not provide scaffolding that supports movement from novice to expert – it only presents the end goal.

In this paper, I discuss the first two phases of a multi-phase study exploring the relationship between the information literacy threshold concepts and two epistemological development models – the Reflective Judgement Model (King & Kitchener, 1994) and the Epistemological Reflection Model (Baxter Magolda, 1992). I will provide a brief overview of each model and discuss the findings of an analysis of the Framework using both models, which identifies the stages of epistemological development at which learners might need to be to begin the journey of crossing information literacy thresholds. I will share a draft model – the Crossing Information Literacy Thresholds Model – of how library professionals might scaffold each threshold concept based on the analysis of the Framework. This scaffolding is intended to help library professionals who teach to think more deeply about where learners might be in terms of epistemological beliefs and what that means for information literacy-related learning outcomes, learning activities, and learning assessment. This model introduces a more nuanced approach to helping students cross these conceptual thresholds and is intended to provide a foundation for understanding how learners might move from a passive, consumer-oriented relationship to knowledge within the context of higher education to one that is active, engaged, and empowered.

References

Anderson, A., & Johnston, B. (2016). From information literacy to social epistemology. Chandos Publishing.

Baxter Magolda, M.B. (1992). Knowing and reasoning in college: Gender-related patterns in students’ intellectual development. Jossey-Bass.

Jackson, R. (2007). Cognitive development: The missing link in teaching information literacy skills. Reference & User Services Quarterly, 46(4), 28-32.

King, P.M., & Kitchener, K.S. (1994). Developing reflective judgement: Understanding and promoting intellectual growth and critical thinking in adolescents and young adults. Jossey-Bass.

Matteson, M., Ming, Y., Wheeler, H., & McShane, M. (2021). An exploratory study of the relationship between individual cognitive factors and information literacy ability in college students. In S. Kurbanoglu, S. Spiranec, Y. Unal, J. Boustany, & D. Kos (Eds.), Information Literacy in a Post-Truth Era: 7th European Conference on Information Literacy, ECIL 2021, Virtual Event, September 20-23, 2021, Revised Selected Papers (pp. 399-410). Springer.

Swanson, T. (2006). Information literacy, personal epistemology, and knowledge construction. College & Undergraduate Libraries, 13(3), 93-112.

Whitmire, E. (2004). The relationship between undergraduates’ epistemological beliefs, reflective judgement, and their information-seeking behavior. Information Processing and Management, 40, 97-111.



2:00pm - 2:15pm

Chatbots and information literacy: A comparison between two groups of teacher education students

Ellen Nierenberg

University of Inland Norway, Norway

Objectives

This study investigates the usage and perceptions of artificial intelligence (AI) chatbots among students in the Faculty of Teacher Education at X. By focusing on differences between schoolteacher education students (STEs) and preschool teacher education students (PREs), the research aims to identify distinct patterns and attitudes towards chatbots in educational contexts.

Existing literature highlights the potential benefits and challenges of chatbots in higher education. However, specific studies on how teacher education students engage with these technologies are limited. This study seeks to fill this gap.

Methodology

A survey was conducted online, targeting all bachelor’s and master’s students in the Faculty of Teacher Education at X. The survey ensured anonymity and required informed consent. It was open for 21 days in October 2024. The survey included questions regarding the extent to which students use chatbots in their studies, how they use them, ethical issues related to their use, and how they may affect students’ learning, critical thinking, and future careers. Several survey questions were related to information literacy.

Outcomes

The survey received 492 responses, with nearly equal numbers of PREs (n=184) and STEs (n=182). Most respondents were first-year students, and a large majority were female. Analyses revealed several significant differences between PREs and STEs in terms of chatbot use and perceptions.

STEs reported higher levels of knowledge about chatbots than PREs. In terms of usage, while 75% of STEs had employed chatbots in their studies, only 34% of PREs had done so. Compared to PREs, more STEs believe that chatbot use is widespread among their fellow students. A larger proportion of STEs than PREs report feeling increasingly addicted to chatbots. Additionally, more STEs believe that the use of chatbots should not be forbidden in higher education and that their use should be permitted in graded assignments and home exams. PREs reported increased interest in their studies due to the use of chatbots to a greater extent than STEs, while STEs anticipated a greater impact of chatbots on their learning and future careers than PREs.

For questions related to information literacy, there were two notable differences between the student groups. Firstly, PREs believe to a greater extent than STEs that their critical thinking abilities will be reduced by chatbot use. Secondly, more STEs use chatbots to find relevant information sources than PREs. Similarities between the two groups include their beliefs about the reliability of information generated by chatbots; the extent to which copy/paste from chatbots represents plagiarism; and how worried they would be about being accused of plagiarizing if they used AI-generated texts.
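
The abstract does not specify which statistical tests were used. Purely as an illustrative sketch, a group difference such as the reported chatbot usage rates (75% of STEs vs. 34% of PREs) could be checked with a chi-square test on a 2x2 contingency table, for example in Python; the counts below are reconstructed from the reported percentages and group sizes and are approximate, not the authors' data.

from scipy.stats import chi2_contingency

# Approximate counts reconstructed from the reported figures (assumption):
# ~75% of 182 STEs and ~34% of 184 PREs had used chatbots in their studies.
ste_used, ste_not = 136, 46      # 182 STEs in total
pre_used, pre_not = 63, 121      # 184 PREs in total

# 2x2 contingency table: rows = group (STE, PRE), columns = (used, not used)
table = [[ste_used, ste_not], [pre_used, pre_not]]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.4g}")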

Keywords: chatbots, teacher education, preschool teacher education, artificial intelligence, survey



Measuring Teacher Educators’ Information Problem Solving: A Situational Judgement Test

Klaas-Jan Lammers1, Jos van Helvoort2, Iwan Wopereis3, Nynke Bos1

1Inholland; 2Independent Researcher; 3Open Universiteit

Introduction

In today's digital age, the ability to effectively search for and process online information is crucial, especially in higher education. Teacher educators play a key role in developing these skills in students, who will, in turn, teach them to their future pupils. However, there is a lack of validated instruments to measure the online information skills of teacher educators themselves.

Objectives

This study aims to develop and validate a Situational Judgement Test (SJT) to measure information problem-solving (IPS) skills among teacher educators. Unlike existing SJTs targeting IPS in domains such as psychology (e.g., Rosman et al., 2014), this study emphasizes measuring these skills within the specific professional educational context. SJTs are particularly suitable because they present realistic, job-related scenarios that require the practical application of knowledge and decision-making, both of which are critical for effective IPS.

Methodology

Garcia et al.’s (2020) Spanish-language SJT, the Procedural Information-seeking Knowledge Evaluation-Education (PIKE-E) test, was translated and adapted to the Dutch context. The translation process included an initial translation and cultural adaptation. Feedback from teacher educators and experts refined the test, resulting in a 17-item version aligned with the Information Problem Solving using the Internet (IPS-I) model (Brand-Gruwel et al., 2009). The test items were linked to a scenario familiar to teacher educators, focusing on 'the impact of robotics.' The final version was administered to 39 teacher educators from various disciplines. Quantitative data were analyzed using item-total correlations and factor analysis to assess construct validity.
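
As an illustrative sketch only (not the authors' analysis code), corrected item-total correlations of the kind mentioned above could be computed along the following lines in Python, assuming the scored item responses are held in a table with one row per respondent and one column per item; the item names and values below are hypothetical.

import pandas as pd

# Hypothetical scored responses: one column per SJT item, one row per respondent.
responses = pd.DataFrame({
    "item_01": [1, 0, 1, 1, 0],
    "item_02": [1, 1, 0, 1, 1],
    "item_03": [0, 1, 1, 1, 0],
})

total = responses.sum(axis=1)

# Corrected item-total correlation: each item is correlated with the sum of
# the remaining items, so the item does not inflate its own correlation.
item_total = {
    col: responses[col].corr(total - responses[col])
    for col in responses.columns
}
print(item_total)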

Results

Teacher educators performed well in tasks related to information selection and processing but struggled with search-related tasks and critical evaluation. Item-total correlations highlighted variability, indicating a need for further refinement of certain items. Construct validity was partially supported, while content validity was strengthened by expert feedback and contextual alignment of test items.

Conclusions and discussion

This contribution is significant as it provides a reliable instrument to assess and improve the information problem-solving skills of teacher educators, ultimately enhancing the quality of education. Future research should focus on test-retest reliability and expanding the sample size to improve generalizability. Additionally, qualitative methods could provide deeper insights into the test items' interpretation and inform further refinements.

Keywords: Information Problem Solving, Situational Judgement Test, Teacher Educators, PIKE-E Test

References

Brand-Gruwel, S., Wopereis, I., & Walraven, A. (2009). A descriptive model of information problem solving while using internet. Computers & Education, 53(4), 1207–1217. https://doi.org/10.1016/j.compedu.2009.06.004

Garcia, C., Argelagós, E., & Privado, J. (2020). Assessment of higher education students’ information problem-solving skills in educational sciences. Information Development, 37(3), 359–375. https://doi.org/10.1177/0266666920976189

Rosman, T., Mayer, A., & Krampen, G. (2014). Combining self-assessments and achievement tests in information literacy assessment: empirical results and recommendations for practice. Assessment & Evaluation in Higher Education, 40(5), 740–754. https://doi.org/10.1080/02602938.2014.950554



Promoting academic writing: An investigation into pedagogical practices of university teachers

Tatiana Sanches1, Carlos Lopes2, Maria Luz Antunes3

1UIDEF, Instituto de Educação, Universidade de Lisboa, Portugal; 2APPsyCI, Ispa-Instituto Universitário, Lisboa, Portugal; 3Instituto Politécnico de Lisboa (ESTeSL), Lisboa, Portugal

Students are encouraged to read, think, discuss, and write as part of their academic requirements in higher education, particularly in advanced second- and third-cycle studies. These steps need to be supported by professors, especially when they act as supervisors, who should encourage the development of these early researchers (Carter & Laurs, 2018). Academic writing is therefore an integral part of university education. It is a skill to be developed and a learning process that requires the attention of university professors (Côrte Vitória, 2018). Previous research has addressed the importance of practical experience in academic writing for the development of information literacy (IL) skills (Sanches et al., 2019), with further evidence advocating for the benefits of integrating IL into the teaching of academic writing (Veach, 2019). Limited research in Portugal has focused on the pedagogical role of faculty in this area (Caels et al., 2019), a process that has been studied primarily from the perspective of students and their experiences (Geng & Yu, 2022; Graham et al., 2020).

Therefore, the present study aims to analyze the actions and strategies implemented by university teachers to promote academic writing. It examines how professors have addressed the issue of writing skills and how they have promoted their development within the academic community, particularly among their students. The study is grounded in frameworks that outline the requirements for IL promotion (Association of College and Research Libraries, 2016) and for academic writing in higher education (Council of Writing Program Administrators, National Council of Teachers of English, & National Writing Project, 2011). The intersection of teaching IL and promoting academic writing was examined particularly in terms of empowering students, developing academic writing, and fostering academic skills in this context.

The responses from approximately 70 professors reveal encouraging perspectives. Practices that integrate IL skills into academic writing classes are already a reality. Furthermore, the adoption of student-centered pedagogical practices emerges as a key aspect in this context. This highlights the critical role of university teachers in tailoring their approaches to meet students’ learning needs and actively supporting their academic development.

References

Association of College and Research Libraries. (2016). Framework for Information Literacy for Higher Education. http://www.ala.org/acrl/files/issues/infolit/framework.pdf.

Caels, F., Barbeiro, L. F., & Santos, J. V. (Eds.). (2019). Discurso académico: Uma área disciplinar em construção. CELGA-ILTEC – Universidade de Coimbra; ESECS – Politécnico de Leiria.

Carter, S., & Laurs, D. (Eds.). (2018). Developing research writing: A handbook for supervisors (1st ed.).

Côrte Vitória, M. I. (2018). La escritura académica en la formación universitaria. Narcea.

Council of Writing Program Administrators, National Council of Teachers of English, & National Writing Project. (2011). Framework for success in postsecondary writing. www.nwp.org

Geng, F., & Yu, S. (2022). Exploring doctoral students’ emotions in feedback on academic writing: A critical incident perspective. https://doi.org/10.1080/0158037X.2022.2109616

Graham, S., Kiuhara, S. A., & MacKay, M. (2020). The effects of writing on learning in science, social studies, and mathematics: A meta-analysis. Review of Educational Research, 90(2), 179–226. https://doi.org/10.3102/0034654320914744

Sanches, T., Antunes, M. L., & Lopes, C. (2019). Improving the academic writing experience in higher education. NOVA Publishers.

Veach, G. (2019). Teaching information literacy and writing studies (Vol. 2). Purdue University Press. https://muse.jhu.edu/book/65069