In electrophysiological research, going from raw data to interpretation requires a myriad of decisions (“analytic flexibility”), including many preprocessing steps and possible analysis methods. These decisions are not always transparently reported, and researchers’ choices may be unconsciously biased by having seen the data. In this symposium, we aim to (1) demonstrate how variable preprocessing and analysis pipelines in EEG research are and how this variability influences study outcomes, and (2) discuss ways to increase transparent reporting and reduce bias in analytic decisions, and thereby increase the reproducibility of EEG studies. Regarding the first aim, two of the talks in this symposium will demonstrate the high variability of methodological choices in EEG research and its influence on results, using both local and large-scale (“many analysts”) approaches. Turning to the second aim, possible solutions, the third talk will focus on how preregistration can increase transparency in EEG research and reduce researcher bias in analytic choices. The fourth talk will discuss another way to increase transparency: developing concrete reporting guidelines in the form of checklists. Finally, we will discuss the current state of transparency and reproducibility in EEG research, highlighting both opportunities and challenges in the adoption of reproducible and transparent practices in the EEG community, as well as the growing role of collaborative efforts in our research field.
Comparing the effects of different methodological decisions on the reliability of the error-related negativity and its association with individual differences
1McGill University, Canada; 2University of Hamburg, Germany; 3Humboldt University of Berlin, Germany; 4Florida State University, United States
Psychophysiological researchers make a number of methodological decisions when quantifying event-related potentials (ERPs), and these decisions vary across studies. It is not well understood how each of these methodological choices (let alone their combinations) affects the psychometric properties of ERPs or their associations with individual differences. To illustrate the impact of these choices, this talk focuses on the error-related negativity (ERN), an ERP component that is widely used to study human performance monitoring. Specifically, we compared 72 distinct processing pipelines for quantifying the ERN and examined their effects on the component’s measured amplitude, psychometric properties, and associations with individual differences, specifically behavior and gender. We collected ERN data from 263 young adults during a flanker task and again in a subsample of 33 participants five months later. The means, internal consistencies, and test-retest reliabilities of the ERN were compared across reference schemes (mastoid and average), baseline correction periods (-100 to 0, -200 to 0, and -500 to -300 ms), amplitude scoring approaches (area, peak, and peak-to-peak), and electrode site selections (Cz and region of interest). We found that these data processing choices affected not only the measured amplitude of the ERN, but also its internal consistency and test-retest reliability, as well as its associations with individual differences. Together, these results highlight the importance of considering methodological influences on ERP measurement, and we discuss steps toward more reliable and robust measurement of the ERN.
EEGManyPipelines: Mapping the diversity of EEG analysis pipelines and their impact on results
1Donders Institute for Brain, Cognition, and Behavior, Radboud University Nijmegen, Netherlands; 2Department of Psychology, Julius Maximilians University of Würzburg, Germany; 3Institute of Psychology, University of Münster, Germany
Electroencephalography (EEG) is widely used in psychophysiological research. However, the analytic flexibility of EEG challenges the robustness of EEG findings: because there are so many different ways to process and analyze EEG data, analysis pipelines vary greatly between studies. It is currently unclear to what extent alternative, equally plausible pipelines produce diverging findings and conclusions. The EEGManyPipelines project is inspired by other recent projects that recruited many independent analysis teams to investigate a) how different analysts approach a given data set and b) how analysis approaches affect the obtained results. EEGManyPipelines extends this novel initiative to EEG research. Participants in this project will get access to an EEG dataset and are invited to analyze the data with an analysis pipeline they deem sensible and representative of their research. Participants will then report their results and a detailed description of the analysis pipeline back to us. We will use these reports to map the diversity of analysis pipelines and the effect of pipeline parameters on the obtained results. Thereby, EEGManyPipelines will help assess the robustness of EEG findings across alternative analyses, identify (sub)optimal analysis pipelines, and inform guidelines for reporting EEG analyses in publications. We thus expect that EEGManyPipelines will help improve the credibility of EEG research and the quality of analyses, and will inspire new standards for conducting and reporting EEG studies. Given the widespread use of EEG in human cognitive neuroscience and psychology, this project represents a timely and crucial endeavour that will benefit the cognitive neuroscience community at large.
Preregistration: Increasing transparency in electrophysiological research
1Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany; 2Einstein Center for Neurosciences Berlin, Germany; 3Berlin School of Mind and Brain, Germany; 4Georg-August-Universität Göttingen, Germany; 5contributed equally
In this talk, we will discuss how preregistration can be used to increase transparency in electrophysiological research. We will start by discussing how confirmation bias (seeking information that supports prior beliefs), hindsight bias (overestimating the extent to which past events predicted a current outcome), and the pressure to publish can lead to (unconscious) data exploration after which only (statistically) significant results are reported. We will highlight some of the problems associated with this undisclosed analytic flexibility, focusing on EEG research, in which complex multidimensional data can be preprocessed and analyzed in many possible ways. We argue that transparently disclosing analytic choices can mitigate confirmation and hindsight bias and make EEG research more verifiable. One possible tool for transparent reporting is preregistration: a time-stamped, publicly accessible research plan containing the hypotheses, a data collection plan, and the intended preprocessing and statistical analyses, written before the data are accessed. We will provide examples of how to create preregistrations for EEG studies that are specific, precise, and exhaustive, focusing on data preprocessing and analysis steps. Finally, we will highlight the benefits and critically discuss the limitations of adopting preregistration for EEG researchers.
Transparency in reporting on ERP research and how we can improve it
1University of Belgrade Teacher Education Faculty; 2University of Belgrade Laboratory for Neurocognition and Applied Cognition
Given the complexity of ERP preprocessing and analysis pipelines, it is challenging, when writing a paper, to include all the information needed for a replication attempt or an adequate critical assessment of the study. In a recent systematic review of methodology and reporting in ERP studies (Šoškić et al., in press), we demonstrated that verbal descriptions of methods in journal articles are not optimal for this task, owing to their susceptibility to information omission and ambiguous wording. This talk will discuss (1) the principal areas in which reporting on ERP methods needs improvement, (2) a proposal to improve transparency and reduce reporting errors by designing a supplementary ERP metadata template to be filled in with the methodological information necessary for study evaluation, replication, meta-analysis, and data reuse, and (3) ARTEM-IS (Agreed Reporting Template for EEG Methodology - International Standard, https://osf.io/pvrn6/), a project to create such a template through a collaborative process gathering stakeholders from across the entire ERP community, ensuring the ease of use, clarity, and relevance of the template contents.