This blog post demonstrates how to use the mindaffectBCI software with Brain Products amplifiers. We first guide you through a specific example of how to use the c-VEP-based speller with our LiveAmp, and then illustrate how this can be extended to any of our amplifiers using LSL.
This tutorial demonstrates how the Lab Streaming Layer (LSL) can be used within a complete pipeline for recording EEG and other signals, adding event markers to the data stream, accessing and visualizing data online, and analyzing and converting data after it has been recorded.
To keep the advantages of using SNAP while offsetting the disadvantage of its stimulus-presentation delay, we used a Brain Products Photo Sensor to measure the onset times of visual stimuli more accurately and to correct the markers after the recording.
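The correction step can be sketched in plain Python: detect upward threshold crossings in the recorded photo sensor channel, then snap each software-sent marker to the first sensor onset that follows it within a tolerance window. The function names, threshold, and tolerance below are illustrative assumptions, not values from the post:

```python
# Hypothetical sketch of photo-sensor-based marker correction.
# Threshold and max_delay are placeholder values chosen for illustration.

def detect_onsets(sensor, fs, threshold=0.5):
    """Return times (s) where the photo sensor signal crosses threshold upward."""
    onsets = []
    for i in range(1, len(sensor)):
        if sensor[i - 1] < threshold <= sensor[i]:
            onsets.append(i / fs)
    return onsets

def correct_markers(marker_times, onsets, max_delay=0.1):
    """Replace each software marker time with the first sensor onset
    occurring within max_delay seconds after it; keep it unchanged otherwise."""
    corrected = []
    for t in marker_times:
        candidates = [o for o in onsets if t <= o <= t + max_delay]
        corrected.append(candidates[0] if candidates else t)
    return corrected

# Synthetic demo: a sensor sampled at 100 Hz that lights up at t = 0.5 s,
# and a software marker sent 40 ms too early.
fs = 100
sensor = [0.0] * 50 + [1.0] * 10
onsets = detect_onsets(sensor, fs)
corrected = correct_markers([0.46], onsets)
```

In the synthetic demo, the marker at 0.46 s is moved to the detected onset at 0.50 s, removing the presentation delay from the event timing.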
The M(eye)ndtris study introduces a completely hands-free version of Tetris as an example of BCI+. It uses eye tracking and passive brain-computer interfacing to replace existing game controls and to introduce new, neuroadaptive gameplay elements.