This lesson is in the early stages of development (Alpha version)

The Lossless Pipeline

The purpose of this lesson is to teach the benefits of EEG preprocessing pipelines; the inputs, procedures, and outputs of the BIDS Lossless pipeline; and how to submit pipeline jobs to a remote cluster. In this lesson, we will use the Batch Context plugin for EEGLAB to run a batch of EEG data files in parallel through the Lossless pipeline by submitting jobs to Compute Canada’s Graham cluster. The same procedure should work on any other HPC system that uses a Slurm scheduler.
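As a rough preview of what submitting jobs to a Slurm scheduler looks like from the command line, the sketch below copies a dataset to the cluster, writes a generic job script, and submits it. The host name, account, resource requests, script name, and MATLAB function are placeholders, not part of the lesson materials; in the lesson itself, job submission is handled through the Batch Context plugin.

    # Copy a BIDS dataset to the cluster (placeholder host, username, and paths)
    rsync -av ./my_bids_study/ myuser@graham.computecanada.ca:scratch/my_bids_study/

    # On the cluster: write a generic Slurm job script (placeholder resources,
    # account, and MATLAB function name)
    cat > run_lossless.sh << 'EOF'
    #!/bin/bash
    #SBATCH --time=03:00:00          # wall-clock limit for the job
    #SBATCH --mem=8G                 # memory requested for the job
    #SBATCH --account=def-myaccount  # compute allocation to charge
    module load matlab
    matlab -nodisplay -r "run_lossless_session; exit"
    EOF

    # Submit the job and check its place in the queue
    sbatch run_lossless.sh
    squeue -u "$USER"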

Prerequisites

  • MATLAB
  • All of the previous BIDS and BIDS EEG lessons
  • Familiarity with EEGLAB and the Batch Context plugin
  • A basic understanding of statistical approaches to EEG analysis
  • Basic Bash usage, including scp, rsync, and ssh

Schedule

Setup   Download files required for the lesson
10:00   1. Introduction
        What is an EEG preprocessing pipeline?
        What are the benefits of a Lossless pipeline in EEG research?
10:30   2. The Lossless Pipeline
        What are the inputs and outputs of the Lossless pipeline?
        What kinds of decisions does the Lossless pipeline make?
11:00   3. Downloading the Lossless Pipeline
        How do we download and set up the Lossless pipeline?
11:45   4. Co-registering EEG Data
        Why is co-registering EEG data important?
        How is co-registration of EEG data done?
12:15   5. Running the Lossless Pipeline
        How do we run a batch of EEG files through the Lossless pipeline?
        How do we submit Lossless pipeline jobs to a remote parallel computing cluster?
13:00   Finish

The actual schedule may vary slightly depending on the topics and exercises chosen by the instructor.