Session 1: Monday morning - 9:15 AM to 12:15 PM
Brainstorm: New tools for SEEG, anatomical parcellations and FEM modeling
Brainstorm is a collaborative, open-source platform dedicated to the analysis of brain recordings: MEG, EEG, fNIRS, ECoG, depth electrodes, and animal invasive neurophysiology. This session will introduce new features for realistic EEG/MEG forward modeling: generation of 3D meshes from T1/T2/DTI MRI images with external software (SimNIBS, Iso2Mesh, FieldTrip) and computation of FEM forward solutions with the state-of-the-art library DUNEuro. While still under active development, these new methods are very promising for more accurate EEG/SEEG source estimation and simulations. Participants will learn how to access these new tools from the Brainstorm environment on their personal laptops.
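As a point of reference for what an FEM solver computes, the forward problem has a simple closed form in the unrealistic case of an infinite homogeneous conductor. The sketch below (plain Python; the conductivity value, dipole moment, and sensor positions are made-up illustrations) evaluates that textbook dipole formula; the point of a solver like DUNEuro is to solve the same problem numerically for realistic, inhomogeneous head meshes.

```python
# Illustrative sketch only: potential of a current dipole in an INFINITE
# HOMOGENEOUS conductor, the closed-form baseline that FEM solvers
# generalize to realistic head geometries. All numbers are made up.
import math

def dipole_potential(r, r0, p, sigma=0.33):
    """Potential at sensor position r from a dipole with moment p at r0.

    V = p . (r - r0) / (4 * pi * sigma * |r - r0|^3)
    sigma: conductivity in S/m (0.33 is a commonly quoted brain value).
    """
    d = [ri - r0i for ri, r0i in zip(r, r0)]
    dist = math.sqrt(sum(di * di for di in d))
    return sum(pi * di for pi, di in zip(p, d)) / (4 * math.pi * sigma * dist ** 3)

# A z-oriented dipole produces a large potential at a sensor directly
# above it, and zero at a sensor at the same height off to the side
# (where p is orthogonal to r - r0).
v_above = dipole_potential((0, 0, 0.09), (0, 0, 0.07), (0, 0, 1e-8))
v_side = dipole_potential((0.09, 0, 0.07), (0, 0, 0.07), (0, 0, 1e-8))
```

In a realistic head, tissue boundaries (skull, CSF, scalp) break this simple geometry, which is exactly what the mesh generation and FEM steps in the session address.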
BIDS datasets with MNE-Python and MNE-BIDS: conversion and analysis
In this course you will learn about MNE-Python (https://mne.tools/stable/index.html), which has become a reference tool for processing MEG/EEG/sEEG/ECoG data in Python. We’ll get you up to speed on recent efforts from the MNE community to make the BIDS format a first-class citizen of the EEG/MEG Python ecosystem. You’ll learn about MNE-BIDS (https://mne.tools/mne-bids/), a package you can use to read, modify, and save BIDS-formatted data. After seeing how to convert your data to BIDS, including anonymization of the anatomical MRI information, you’ll practice with a standard analysis pipeline. The teaching will be hands-on, using Jupyter notebooks and public datasets.
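For orientation before the course: the BIDS filenames that MNE-BIDS builds for you (via its `BIDSPath` class and `write_raw_bids` function) follow a fixed key-value entity ordering. The toy stdlib sketch below is not the MNE-BIDS API; it only illustrates the naming scheme that the package automates and validates.

```python
# Toy sketch of the BIDS naming convention for EEG data. This is NOT the
# mne_bids.BIDSPath API, just an illustration of the layout it produces.
from pathlib import PurePosixPath

def bids_eeg_path(root, subject, session, task, extension=".vhdr"):
    """Build a BIDS-style EEG file path: entities are key-value pairs
    joined by underscores, ordered sub -> ses -> task, with the datatype
    ('eeg') appearing both as a folder name and as the filename suffix."""
    stem = f"sub-{subject}_ses-{session}_task-{task}_eeg{extension}"
    return PurePosixPath(root) / f"sub-{subject}" / f"ses-{session}" / "eeg" / stem

p = bids_eeg_path("/data/bids", "01", "01", "rest")
# → /data/bids/sub-01/ses-01/eeg/sub-01_ses-01_task-rest_eeg.vhdr
```

With MNE-BIDS itself, you would instead construct a `BIDSPath` and pass it together with a loaded `Raw` object to `write_raw_bids`, which also writes the accompanying sidecar JSON and TSV files.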
EEGLAB and LIMO: Hierarchical General Linear Modelling and Robust Statistics for EEG
During the workshop, we will analyze the full data space of publicly available data using the open-source LIMO EEG toolbox (in the time domain, but it works the same way in the frequency domain). The LInear MOdeling of EEG (LIMO) toolbox is an EEGLAB toolbox that integrates seamlessly with 'Studies' and provides all the tools needed to analyze any experimental design, including all sorts of covariates at the subject or group level. It allows analyzing all electrodes and all time and/or frequency frames, and implements robust statistical methods along with several multiple-comparison correction procedures. Depending on the time available (i.e., the speed of the group), the various options of the toolbox will be explored. By the end of the session, attendees are expected to have learned enough to use the toolbox on their own data.
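The core idea behind LIMO's subject-level analysis can be pictured as fitting the same linear model independently at every electrode and time frame (a mass-univariate approach). The sketch below is a deliberately minimal Python illustration with made-up numbers, not LIMO code (LIMO is a MATLAB toolbox and adds robust estimators, group-level hierarchy, and multiple-comparison corrections on top of this idea).

```python
# Toy sketch of the mass-univariate idea: fit one linear model per
# electrode/time frame. Here: a single predictor plus intercept, with
# the slope estimated in closed form (beta = cov(x, y) / var(x)).
# All data values are made up for illustration.
def fit_slope(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    var = sum((xi - mx) ** 2 for xi in x)
    return cov / var

# 10 trials of a covariate (e.g. stimulus intensity), and EEG amplitudes
# at 3 hypothetical time frames; the model is fit at each frame separately.
x = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
amplitudes = [
    [2 * xi + 1 for xi in x],   # frame strongly driven by the covariate
    [0.5 * xi for xi in x],     # weaker effect
    [3.0] * len(x),             # no effect at this frame
]
betas = [fit_slope(x, y) for y in amplitudes]
# → [2.0, 0.5, 0.0]
```

In practice one such beta (or a full design-matrix fit) is obtained for every electrode x time point, and it is those maps of parameter estimates that the group-level statistics and multiple-comparison procedures then operate on.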
Time-Frequency analyses with Fieldtrip
Robert Oostenveld & Jan-Mathijs Schoffelen
FieldTrip is the MATLAB software toolbox for MEG, EEG and iEEG analysis. It offers preprocessing and advanced analysis methods, such as time-frequency analysis, source reconstruction using dipoles, distributed sources and beamformers, and non-parametric statistical testing. It supports the data formats of all major MEG systems and of the most popular EEG and iEEG systems. FieldTrip contains high-level functions that you can use to construct your own analysis protocols as a MATLAB script.
In this hands-on session we will go over preprocessing and time-frequency analysis, and specifically look at how your selection of data segments, filtering and handling of artifacts can be optimized to get the best time-frequency estimates of the EEG and MEG activity. If time allows, we will also look at how non-parametric cluster-based statistics can be used to test for differences between conditions.
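To give a flavour of what a single-taper time-frequency estimate involves (FieldTrip's implementation handles multitapers, wavelets, and much more), here is a deliberately minimal Python sketch: a Hann-tapered window is stepped along the signal and the DFT power at one frequency of interest is computed per window. The sampling rate, frequencies, and signal are made up for illustration; this is not FieldTrip code.

```python
# Minimal illustration of windowed time-frequency power (made-up data,
# not FieldTrip code): Hann taper + single-frequency DFT per window.
import cmath, math

def tf_power(signal, fs, freq, win_len):
    """Power of `signal` at `freq` (Hz) in consecutive windows of
    `win_len` samples, each multiplied by a Hann taper first."""
    hann = [0.5 - 0.5 * math.cos(2 * math.pi * n / (win_len - 1))
            for n in range(win_len)]
    powers = []
    for start in range(0, len(signal) - win_len + 1, win_len):
        seg = signal[start:start + win_len]
        coef = sum(h * s * cmath.exp(-2j * math.pi * freq * n / fs)
                   for n, (h, s) in enumerate(zip(hann, seg)))
        powers.append(abs(coef) ** 2)
    return powers

fs = 200
# 1 s of signal: silence in the first half, a 10 Hz oscillation in the second.
sig = [0.0] * 100 + [math.sin(2 * math.pi * 10 * n / fs) for n in range(100)]
p10 = tf_power(sig, fs, 10, 100)  # 10 Hz power in two 0.5 s windows
# 10 Hz power appears only in the second window.
```

The taper choice, window length, and overlap are exactly the kinds of parameters whose effect on the resulting time-frequency estimates this hands-on session explores.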
Session 3: Wednesday morning - 8:30 to 11:30 AM
Contributing to reproducible science with Git and GitHub
Efforts towards reproducible science are increasingly promoted by institutions and funding agencies, as well as by the scientific community. The use of shared repositories containing source code, with automated tracking of changes, is one aspect of implementing more reproducible research. The tutorial will focus on two aspects:
- In the first part, I will present the basic commands of Git, the most widely used tool for code sharing and versioning. Git can be deployed as a standalone solution on your own server with GitLab, but it is also supported by hosting websites such as Framagit, GitHub, or Bitbucket.
- In the second part, I will show how to contribute to existing large collaborative open-source projects, with reference to examples of successful ones (e.g., scikit-learn, MNE-Python, python-neo).
I will also introduce how to set up your own project and accept contributions from others (including from outside your lab) while respecting the minimal rules that ensure efficient collaboration.
Analyzing combined eye-tracking/EEG data
Humans actively explore their visual environment with 2-4 saccadic eye movements per second. The combination of eye tracking with simultaneous EEG recordings is a promising and data-rich approach to study attention and cognition under more natural viewing conditions (e.g., during face perception, scene viewing, or reading). This workshop aims to introduce researchers to this relatively new method combination, with a focus on data analysis. It will be split into three parts. The first part will summarize basic properties of saccade- and fixation-related brain potentials in the EEG. It will also briefly cover some practical issues of co-registration (laboratory setups, data synchronization) as well as state-of-the-art techniques for removing ocular artifacts from the EEG. The second part will then focus on regression-based strategies for analyzing combined eye-tracking/EEG data. In particular, I will show how linear deconvolution modeling with spline predictors (www.unfoldtoolbox.org) can isolate reliable brain activity during free viewing, while controlling for the confounding effects of overlapping potentials and nuisance variables. The final part of the workshop consists of hands-on exercises. We will analyze a simple EEG/eye-tracking dataset using two freely available MATLAB toolboxes: EYE-EEG (www.eyetracking-eeg.org) and UNFOLD (www.unfoldtoolbox.org). To participate in the hands-on exercises, you will need a version of MATLAB (>2016a) that includes the Statistics Toolbox. Knowledge of MATLAB is helpful but not required.
Dimigen, O. (2020). Optimizing the ICA-based removal of ocular EEG artifacts from free viewing experiments. NeuroImage. https://doi.org/10.1016/j.neuroimage.2019.116117
Dimigen, O., & Ehinger, B. V. (2021). Regression-based analysis of combined EEG and eye-tracking data: Theory and applications. Journal of Vision. https://doi.org/10.1167/jov.21.1.3
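The linear deconvolution idea described above can be made concrete with a toy example. The Python sketch below (not UNFOLD code, which is a MATLAB toolbox and adds spline predictors and efficient sparse solvers) only builds the time-expanded design matrix: each event contributes one regressor per time lag, so that overlapping responses from temporally adjacent events are modeled jointly and can be disentangled by ordinary least squares.

```python
# Toy sketch of time expansion, the core of linear deconvolution.
# Event onsets, lengths, and sample counts are made up for illustration.
def time_expanded_design(event_onsets, n_samples, n_lags):
    """Design matrix X (n_samples x n_lags): X[t][lag] counts events
    that occurred `lag` samples before time t."""
    X = [[0] * n_lags for _ in range(n_samples)]
    for onset in event_onsets:
        for lag in range(n_lags):
            t = onset + lag
            if t < n_samples:
                X[t][lag] += 1
    return X

# Two events only 3 samples apart with a 5-sample response: their
# responses overlap, and the design matrix encodes that overlap.
X = time_expanded_design([2, 5], n_samples=12, n_lags=5)
# At t=5 both the lag-3 regressor of the first event and the lag-0
# regressor of the second event are active:
# X[5] == [1, 0, 0, 1, 0]
```

Regressing the continuous EEG onto such a matrix yields one estimated response waveform per event type, with the overlap accounted for rather than averaged in, which is what makes the approach suitable for free-viewing data with rapid sequences of fixations.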
Using computational models to understand the EEG signal underlying a cognitive task (Drift diffusion models)
Marieke Van Vugt
As the field becomes increasingly aware of the need for strong theories of cognition, computational models are becoming more popular as well. In this tutorial, we will talk about how such computational models can be used to better understand what is happening from moment to moment in EEG data. There are multiple approaches to doing so. Most studies focus on the signal at a single moment in time, e.g., the peak of a particular ERP component. Here, in contrast, we will discuss methods for understanding changes over the full EEG signal, moment by moment. We will then launch into the specifics of one such model, the Drift Diffusion Model (DDM). We will briefly go over how to fit a DDM to behavioural data, and then discuss how to turn the results of these fits into a continuous regressor to relate to EEG data. Following this, we will go into how to compute the canonical correlation between the DDM regressor we just created and the EEG data. We will end by discussing how to extend this method to other tasks and models.
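For readers unfamiliar with the DDM, a forward simulation makes the model concrete: noisy evidence accumulates at a constant drift rate until it crosses an upper or lower decision boundary, jointly producing a choice and a reaction time. The Python sketch below uses made-up parameter values; fitting these parameters to behavioural data (the direction discussed in the tutorial) is what dedicated estimation toolboxes do.

```python
# Minimal forward simulation of the Drift Diffusion Model (illustrative
# parameter values, not fitted to any data).
import random

def simulate_ddm(drift, bound, dt=0.001, noise=1.0, rng=random):
    """Return (choice, reaction_time): choice is +1 if the accumulator
    hits the upper bound first, -1 for the lower bound."""
    x, t = 0.0, 0.0
    sd = noise * dt ** 0.5  # diffusion noise scales with sqrt(dt)
    while abs(x) < bound:
        x += drift * dt + rng.gauss(0.0, sd)
        t += dt
    return (1 if x > 0 else -1), t

rng = random.Random(0)  # seeded for reproducibility
trials = [simulate_ddm(drift=1.5, bound=1.0, rng=rng) for _ in range(200)]
accuracy = sum(1 for choice, _ in trials if choice == 1) / len(trials)
# With a positive drift rate, most trials end at the upper (correct) bound.
```

The within-trial accumulator trajectory of such a model is also what can be turned into a continuous, moment-by-moment regressor for the EEG, which is the link this tutorial develops.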
Brain-Computer Interface (BCI) using OpenViBE, an open-source software platform for Brain-Computer Interfaces
Arthur Desbois & Marie-Constance Corsi
OpenViBE is a software platform dedicated to designing, testing, and using brain-computer interfaces (BCIs). It can be used to acquire, filter, process, classify, and visualize brain signals in real time. During the first part of the session, a general overview of the BCI context and of the software will be given, along with several examples of how OpenViBE is used in the BCI domain. During the second part, we will propose a step-by-step tutorial in which participants will design a simple motor-imagery-based BCI scenario from pre-recorded EEG signals.
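To preview the kind of processing chain assembled graphically in OpenViBE, here is a deliberately simplified Python sketch of one possible motor-imagery decision rule: compare mu-band (~10 Hz) power over left- and right-hemisphere sensorimotor channels, exploiting the event-related desynchronization contralateral to the imagined hand. The channel names, frequency, signals, and decision rule are illustrative assumptions, not OpenViBE code.

```python
# Hypothetical, simplified motor-imagery classification rule
# (illustration only; real pipelines add filtering, feature selection,
# and trained classifiers). All signals below are synthetic.
import cmath, math

def band_power(x, fs, freq):
    """Squared DFT magnitude of x at a single frequency (Hz)."""
    coef = sum(s * cmath.exp(-2j * math.pi * freq * n / fs)
               for n, s in enumerate(x))
    return abs(coef) ** 2 / len(x)

def classify_trial(c3, c4, fs, mu=10.0):
    """Toy rule: left-hand imagery suppresses mu power over the
    contralateral (right, 'C4') channel, so lower C4 power -> 'left'."""
    return "left" if band_power(c4, fs, mu) < band_power(c3, fs, mu) else "right"

fs = 250
t = [n / fs for n in range(fs)]  # one 1 s trial
# Simulated left-hand imagery: mu rhythm suppressed over C4, intact over C3.
c3 = [math.sin(2 * math.pi * 10 * ti) for ti in t]
c4 = [0.2 * math.sin(2 * math.pi * 10 * ti) for ti in t]
label = classify_trial(c3, c4, fs)  # → 'left'
```

In OpenViBE, the equivalent steps (band-pass filtering, feature extraction, classification) are configured as boxes in a scenario rather than written as code, which is what the hands-on tutorial walks through.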