Large-scale analysis of electrophysiology data in cognitive neurology
Magnetoencephalography (MEG) and electroencephalography (EEG) can access population-level neuronal dynamics across diverse behaviorally relevant temporal scales, from seconds to milliseconds. With the advent of machine learning, M/EEG has become a promising resource for predictive modeling of cognitive states and neuropsychiatric conditions due to the intimate link between brain rhythms and various facets of cognitive processing. While MEG can recover spatial patterns at a higher signal-to-noise ratio (SNR), the wide availability and portability of EEG make it a promising candidate for "big data" approaches in clinical neuroscience. Processing M/EEG data, however, is inherently challenging due to its high dimensionality and low SNR. Unfortunately, to date, several steps of M/EEG processing pipelines have not been fully automated. This not only incurs high costs in human processing time but also makes research findings less reproducible. Throughout this lecture I will share insights on how machine learning can help improve M/EEG analysis, from data cleaning through source localization to predictive modeling in neurological conditions. I will start with an introduction to methodological issues related to statistical inference and machine learning in clinical neuroscience and M/EEG [background a-c]. I will then present two clinical applications in which we tackle the issue of cross-site generalization of EEG-based diagnosis in severely brain-injured patients and between-subject variability in MEG signals from early blind individuals using different machine learning approaches. Finally, I will discuss issues in automated large-scale analysis of M/EEG signals and present machine learning solutions for cleaning M/EEG data, hyper-parameter selection in source modeling, and minimizing distortions when regressing on MEG power spectra in the absence of source modeling.
I will conclude with a few remarks on how these issues drive MNE software development and highlight a few tools and resources [background d-f].