GOR 26 - Annual Conference & Workshops
Annual Conference - Rheinische Hochschule Cologne, Campus Vogelsanger Straße
26 - 27 February 2026
GOR Workshops - GESIS - Leibniz-Institut für Sozialwissenschaften in Cologne
25 February 2026
Conference Agenda
Overview and details of the sessions of this conference.
Session Overview
12.3: Methods, tools, and frameworks - a bird's-eye view on data collection
Presentations
The Methods Hub: Integrating Tools, Tutorials, and Environments for Transparent Online Research
GESIS - Leibniz Institute for the Social Sciences, Germany

Relevance & Research Question: As digital communication increasingly unfolds on online platforms, behavioral data have become central to understanding media exposure, polarization, and social interaction. Yet the computational approaches needed to analyze such data often remain inaccessible to communication and social science researchers who lack extensive programming expertise or institutional resources. As a result, many research-driven tools remain scattered across personal repositories, supplementary materials, or project websites, reducing their visibility, reusability, and long-term sustainability. This presentation addresses the question of how a community-driven infrastructure can lower entry barriers to computational methods and support transparent, reproducible online research.

Methods & Data: The Methods Hub is designed as an open platform that curates computational resources relevant to social science research. It integrates three core components: (1) open-source tools ranging from lightweight scripts to fully developed software packages, (2) tutorials explaining both general principles of reproducible computational workflows and concrete methodological applications, and (3) containerized interactive coding environments that can be executed directly in the browser without local installation. All contributions follow open licensing and reproducibility standards and are reviewed accordingly. The platform architecture supports interoperability with complementary infrastructures (e.g., KODAQS) to facilitate cross-linking between datasets, tools, and training materials. The development process combines community submissions, expert curation, and iterative user testing to ensure methodological relevance and usability.

Results: Preliminary implementation demonstrates that the platform bridges gaps between computational tooling and social science workflows. Initial contributions include tools for digital trace data collection, automated preprocessing pipelines, validation and reliability routines, and visualization templates. Tutorials and browser-based execution environments have proven effective in enabling researchers to test methods without configuring complex software environments. User feedback from pilot workshops indicates substantial reductions in setup time, increased willingness to experiment with computational approaches, and improved understanding of reproducible research practices.

Added Value: The platform lowers entry barriers to behavioral data analysis, strengthens methodological knowledge transfer, and promotes long-term visibility and reuse of tools otherwise confined to fragmented project repositories. Through openness, interoperability, and executable documentation, the Methods Hub contributes to building a robust ecosystem for computational communication science.

Let's Talk About Limitations: Data Quality Reporting Practices in Quantitative Social Science Research
University of Mannheim, Germany; GESIS – Leibniz Institute for the Social Sciences

Relevance & Research Question: Clearly communicating data quality limitations is essential for transparent research. Data quality frameworks and reporting guidelines support researchers in identifying and documenting potential data quality concerns, but it is unclear how well this translates into reporting practice. In this project, we analyze reports of data quality limitations in substantive social science publications. We thus provide insights into typical limitations that recur, but also highlight underrepresented areas where researchers might require additional guidance.

Methods & Data: We analyze the "Limitations" sections and limitation-related paragraphs in the "Discussion" sections of substantive survey-based research published in journals including American Sociological Review and Public Opinion Quarterly. We use a large language model to extract data-quality-related aspects from these sections and paragraphs and assign them to the measurement and representation sides as defined in Total Data Quality error frameworks. We then cluster the excerpts into themes and compare the themes to components of the error frameworks. On this basis, we discuss which data quality dimensions are commonly and rarely mentioned, and what the possible reasons for these differences may be. Through comparisons with reporting guidelines (e.g., the AAPOR Transparency Initiative, datasheets for datasets), we highlight areas where researchers might require additional support, as well as areas where current guidelines could be adapted to better reflect researchers' needs in reporting.

Results: Initial findings show a prevalence of discussions of measurement validity and of coverage of the target population, in contrast to only a few mentions of limitations related to data processing. We also find that limitations are often communicated implicitly, which adds a challenge for readers from other disciplines. For example, briefly spelling out the concrete implications of using an "online non-probability sample" would make validity assessments easier for readers outside the field.

Added Value: We contribute to the transparent and well-structured communication of data quality, a crucial step in validating research, by providing an overview of current reporting practices and directions for improved reporting.
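As a rough illustration of the extraction-and-clustering approach described in the abstract above, the following minimal Python sketch pulls labeled limitation statements out of a "Limitations" section with an LLM and groups them into candidate themes. The abstract does not specify the model, prompt, or clustering method; the OpenAI client, the gpt-4o-mini model name, the sentence-transformers embedding, and k-means below are illustrative assumptions, not the authors' actual pipeline.

```python
# Minimal sketch of an LLM-assisted extraction-and-clustering pipeline for
# "Limitations" sections. Model, prompt, and clustering method are
# illustrative assumptions, not the authors' actual setup.
from openai import OpenAI
from sentence_transformers import SentenceTransformer
from sklearn.cluster import KMeans

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

EXTRACTION_PROMPT = (
    "From the following 'Limitations' text, list each distinct data-quality "
    "limitation as one short sentence and label it 'measurement' or "
    "'representation' in the sense of Total Data Quality error frameworks. "
    "Return one limitation per line as: <label>: <sentence>"
)

def extract_limitations(section_text: str) -> list[tuple[str, str]]:
    """Ask the LLM to extract and label limitation statements."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice
        messages=[
            {"role": "system", "content": EXTRACTION_PROMPT},
            {"role": "user", "content": section_text},
        ],
    )
    pairs = []
    for line in response.choices[0].message.content.splitlines():
        if ":" in line:
            label, sentence = line.split(":", 1)
            pairs.append((label.strip().lower(), sentence.strip()))
    return pairs

def cluster_into_themes(sentences: list[str], n_themes: int = 8) -> list[int]:
    """Embed limitation statements and group them into candidate themes."""
    embeddings = SentenceTransformer("all-MiniLM-L6-v2").encode(sentences)
    return KMeans(n_clusters=n_themes, n_init=10).fit_predict(embeddings).tolist()
```

The two-step split mirrors the workflow the abstract outlines: extraction assigns each excerpt to the measurement or representation side, while clustering surfaces recurring themes that can then be compared against the components of the error frameworks.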
Qualitative Research in Digital Contexts: A Systematic Review of Online Data Collection Practices
FH Wiener Neustadt GmbH City Campus, Austria

Relevance & Research Question: We conducted a systematic literature review (Tranfield et al. 2003) of academic journal articles reporting experiences of and reflections on qualitative online data collection from 2000 to 2024.

Methods & Data: A literature search was carried out in the databases Springer Link, Science Direct, and Emerald Insight, using, among others, the following terms: "digital OR virtual OR online data collection" AND "qualitative research OR method". Following a three-stage selection process, 44 articles were selected and systematically coded in MaxQDA, combining a deductive framework (structured according to "before, during, and after the survey") with an inductive coding process.

Results: The online context will continue to play an increasingly important role in qualitative social research. We address the need for practical guidance on conducting qualitative research projects online while maintaining quality standards. Furthermore, we relate research practices to existing debates on research quality. In doing so, the review not only offers practical guidance but also theoretical connections for developing a reflexive methodology of digital social research.
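For readers who want to approximate the first stage of such a database search computationally, the following sketch screens exported records against the boolean query quoted in the abstract above. It is a hypothetical illustration: the records.csv file and its column names are assumptions, the regular expressions only approximate the reported search terms, and the actual three-stage selection and MaxQDA coding were expert-driven rather than automated.

```python
# Hypothetical first-stage screen: filter exported database records
# (title/abstract) against an approximation of the review's search terms.
import csv
import re

# Rough regex rendering of: "digital OR virtual OR online data collection"
# AND "qualitative research OR method"
DATA_COLLECTION = re.compile(r"\b(digital|virtual|online)\s+data\s+collection\b", re.I)
QUALITATIVE = re.compile(r"\bqualitative\s+(research|methods?)\b", re.I)

def matches_query(text: str) -> bool:
    """Both query components must appear for a record to pass."""
    return bool(DATA_COLLECTION.search(text) and QUALITATIVE.search(text))

# "records.csv" with "title" and "abstract" columns is an assumed export format.
with open("records.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

screened = [r for r in rows if matches_query(f"{r['title']} {r['abstract']}")]
print(f"{len(screened)} of {len(rows)} records pass the first screen")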