4:00pm - 4:15pm | ID: 162 / PS-11: 1
Topics: Privacy and Ethics
Keywords: peer review, editorial review, misconduct, plagiarism, censorship
The Hitchhiker’s Guide to Scholarly Research Integrity
1Brain Health Alliance, USA; 2Weill Cornell Medicine, Al Rayyan, Qatar; 3Huntington Library, USA
The pursuit of truth in research should be both an aspirational ideal and a practical reality. The PORTAL-DOORS Project (PDP) strives to promote creative authenticity, fair citation, and adherence to integrity and ethics in scholarly research publishing, using the FAIR family of quantitative metrics (FAIR: Fair Attribution to Indexed Reports and Fair Acknowledgment of Information Records) and the DREAM principles (DREAM: Discoverable Data with Reproducible Results for Equivalent Entities with Accessible Attributes and Manageable Metadata). This report presents formalized definitions of idea-laundering plagiarism by authors and idea-bleaching censorship by editors, along with proposed assertion claims for authors, reviewers, editors, and publishers in ethical peer-reviewed publishing to support integrity in research. All of these principles have been implemented in version 2 of the PDP-DREAM ontology, written in OWL 2. The PDP-DREAM ontology will serve as the model foundation for development of a software-guided workflow process intended to manage the ethical peer-reviewed publishing of web-enabled open access journals operated online with PDP software.
4:15pm - 4:30pm | ID: 214 / PS-11: 2
Topics: Library and Information Science
Keywords: Literature Review, Active Reading, Scholarly Primitive, Sensemaking, Interdisciplinary Research
Opening Up the Black Box of Scholarly Synthesis: Intermediate Products, Processes, and Tools
College of Information Studies, University of Maryland, USA
Synthesis is a foundational scholarly product that generates new conceptual wholes from independent intellectual sources. But effective synthesis products, such as literature reviews, are rare, in part due to inadequate support from existing tools and information systems. A detailed, situated understanding of the work practices behind synthesis is necessary to inform the development of synthesis tools. Previous work in scholarly primitives, active reading, and sensemaking provides partial explanations of aspects of synthesis, but a detailed explanation of scholarly synthesis, specifically, is lacking. This paper presents a foundational empirical examination of the work practices behind synthesis to address this gap, unpacking the intermediate products, processes, and tools through in-depth contextual interviews with scholars. Results shed light on the distinctive intermediate products generated during synthesis, including in-source annotations, per-source summaries, and cross-source syntheses, as well as the effortful processes by which these intermediate products progress nonlinearly towards a final synthesis product. These products and practices were also embedded in a complex ecology of creatively re-appropriated tools. This work enriches understanding of the complex scholarly practices that produce synthesis and opens up a research agenda for understanding and supporting scholarly synthesis more generally.
4:30pm - 4:45pm | ID: 309 / PS-11: 3
Topics: Library and Information Science
Keywords: authorship, author order, scientific writing, authorship rubrics, scholarly communication
Quantifying Authorship: A Comparison of Authorship Rubrics from Five Disciplines
University of Missouri, USA
Transparency in authorship is a continuing topic in information science and scholarly communication. The process of determining authorship order in multi-author publications, however, can be complicated. Authorship rubrics exist to help teams arrive at an authorship order, but the extent to which certain roles are quantified (or not) and rewarded with authorship is unclear. This study examines eight authorship rubrics from five disciplines in the sciences and social sciences and evaluates their alignment with the Contributor Roles Taxonomy (CRediT) framework; the rubrics are also compared on how they assign authorship credit and determine authorship order. Findings indicate that work on the methodology and the initial writing of the manuscript are the contributions most consistently quantified in authorship rubrics across the disciplines. Procedures for awarding authorship credit vary widely, and methods for tie-breaking rubric scores range from systematic to arbitrary. These findings suggest that, regardless of discipline, contributions to carrying out and writing up research are seen as criteria for authorship. Differences in ordering procedures, however, may yield different author orders depending on the chosen rubric. Ultimately, the fitness for use of authorship rubrics should be carefully considered by members of research teams, especially when teams are interdisciplinary.
4:45pm - 5:00pm | ID: 315 / PS-11: 4
Topics: Library and Information Science
Keywords: Open peer review, Review comments, Citations, Rounds of review, Matching
Does Opening up Peer Review Benefit Science in Terms of Citations?
Nanjing University, People's Republic of China
This paper studies whether opening up peer review reports benefits science in terms of citations, taking Nature Communications as an example. To address this question, we extracted 3,500 papers published in Nature Communications in 2016 and 4,326 papers published in 2017, and retrieved their three-year citation counts since publication from the Web of Science database. By applying a matching method, we constructed an observation group of 1,726 open peer review (OPR) papers and a control group of 1,726 non-OPR counterparts. A paired-sample t-test showed no significant difference in citations between the OPR and non-OPR papers. We conclude that opening up peer review reports did not benefit papers in Nature Communications in terms of citations. We further examined, through regression analysis, whether the length, rounds of review, or lexical diversity of the review reports contributed to a paper's citations, and found that longer comments, more rounds of review, and more diversified wording did not contribute to the citations of OPR papers in Nature Communications. Further research covering more OPR journals is required to assess the benefits of OPR.
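The matched-pairs design described in this abstract can be sketched as follows; note that the citation counts below are hypothetical illustrations, not the study's actual Nature Communications data.

```python
# Sketch of a paired-sample t-test on matched citation counts,
# as in the OPR vs. non-OPR comparison described above.
# All numbers here are hypothetical, for illustration only.
from statistics import mean, stdev
from math import sqrt

# Three-year citation counts for matched pairs: each open-peer-review
# (OPR) paper is paired with a comparable non-OPR paper.
opr     = [12, 45, 30, 8, 22, 17, 40, 25]
non_opr = [14, 41, 33, 9, 20, 19, 38, 27]

# Paired-sample t statistic: mean of within-pair differences divided
# by the standard error of those differences.
diffs = [a - b for a, b in zip(opr, non_opr)]
t_stat = mean(diffs) / (stdev(diffs) / sqrt(len(diffs)))

# |t| well below the critical value (~2.36 for df = 7, alpha = .05)
# would indicate no significant citation difference between groups,
# in line with the finding reported for Nature Communications.
print(f"t = {t_stat:.3f}")
```

The pairing matters: testing within-pair differences controls for paper-level confounders captured by the matching, which an unpaired comparison of group means would not.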