M(eye)ndtris

A GUEST BLOG POST BY THORSTEN O. ZANDER

At a glance

Description


The M(eye)ndtris study introduces a completely hands-free version of Tetris as an example of BCI+. It uses eye tracking and passive brain-computer interfacing to replace existing game elements and to introduce new, neuroadaptive controls. In M(eye)ndtris, gaze-based eye tracking is used to move the tetromino. Passive BCIs assess two mental states of the player to influence the game in real time. The player's relaxation is used to change the speed of the game and the corresponding music (see video). An incorrectly placed tetromino can induce a state of error perception which, if detected by the passive BCI, deletes the tetromino and frees the space. Together, this results in a multimodal, hands-free version of the classic Tetris game.

Tetris + Eye Tracking

In Tetris, eye tracking provides a uniquely intuitive method of control. The M(eye)ndtris game is fully programmed in Python-based SNAP code. The tetrominos move horizontally, following the player's gaze. "Flicking" the block upwards with the eyes, by looking at the top of the screen, rotates the block. The game field is entirely block-based; block size can easily be adjusted depending on the accuracy of the calibration.
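To make this concrete, here is a minimal sketch of how a gaze sample could be turned into a game command. This is not the original SNAP code: it assumes a hypothetical LSL stream of type "Gaze" delivering normalized (x, y) screen coordinates, and the names GRID_COLUMNS and ROTATE_Y_THRESHOLD are illustrative.

# Illustrative sketch of gaze-based tetromino control (not the original SNAP code).
# Assumes a hypothetical LSL stream of type "Gaze" whose samples are normalized
# (x, y) screen coordinates with (0, 0) at the top-left corner of the screen.
from pylsl import StreamInlet, resolve_byprop

GRID_COLUMNS = 10          # width of the Tetris playfield in blocks
ROTATE_Y_THRESHOLD = 0.1   # a gaze this close to the top of the screen rotates the block

def gaze_to_command(x, y):
    """Translate one gaze sample into a game command."""
    if y < ROTATE_Y_THRESHOLD:
        return ("rotate", None)                        # "flicking" the block upwards
    column = min(int(x * GRID_COLUMNS), GRID_COLUMNS - 1)
    return ("move_to_column", column)                  # follow the horizontal gaze position

streams = resolve_byprop("type", "Gaze", timeout=5.0)
inlet = StreamInlet(streams[0])

while True:
    sample, _timestamp = inlet.pull_sample(timeout=1.0)
    if sample is None:
        continue
    command, argument = gaze_to_command(sample[0], sample[1])
    # hand the command to the game loop here, e.g. game.apply(command, argument)

Because the playfield is block-based, the accuracy required of the eye tracker is only "one column wide", which is why the block size can simply be enlarged when calibration is less precise.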

Tetris + Brain-Computer Interfacing (BCI)

Allowing the player's brain activity to influence the game adds novel game mechanics and challenges. To achieve this, EEG is recorded with a portable LiveAmp with 32 dry electrodes and interpreted by BCILAB in real time. First, the player's state of relaxation influences game speed: the more relaxed the player, the slower the game. Don't let your mistakes upset you, or the game will become more difficult! Second, if you do make mistakes, a state of error perception may be recognised by the BCI, allowing erroneously placed blocks to be automatically removed. This works best when you are properly focused on the game, so don't be too relaxed!
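A minimal sketch of how these two classifier outputs might drive the game is given below. It assumes two hypothetical LSL streams published by the BCI, named "Relaxation" and "ErrorPerception"; the stream names, thresholds and the speed mapping are illustrative and not taken from the actual BCILAB setup.

# Illustrative sketch: two passive-BCI outputs influence the game.
# Assumed streams: "Relaxation" delivers a value in [0, 1], "ErrorPerception"
# delivers the probability that the player perceived an error.
from pylsl import StreamInlet, resolve_byprop

MIN_FALL_INTERVAL = 0.2    # seconds per row at maximum speed
MAX_FALL_INTERVAL = 1.0    # seconds per row when the player is fully relaxed
ERROR_THRESHOLD = 0.7      # classifier output above this removes the last block

relax_inlet = StreamInlet(resolve_byprop("name", "Relaxation", timeout=5.0)[0])
error_inlet = StreamInlet(resolve_byprop("name", "ErrorPerception", timeout=5.0)[0])

def fall_interval(relaxation):
    """More relaxation -> a longer interval between drops -> a slower game."""
    return MIN_FALL_INTERVAL + relaxation * (MAX_FALL_INTERVAL - MIN_FALL_INTERVAL)

def should_remove_last_block(error_probability):
    """Remove the erroneously placed tetromino if error perception is detected."""
    return error_probability > ERROR_THRESHOLD

In the game loop, fall_interval() would be re-evaluated continuously from the relaxation stream, while should_remove_last_block() would only be consulted in the short window after a tetromino has landed.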

Tetris + Eye Tracking + Brain-Computer Interfacing

All communication between the LiveAmp, the eye tracker, BCILAB and SNAP is coordinated through LSL, utilizing the full BCI+ framework. Automatic BCI-based error correction can compensate for potential inaccuracies resulting from the eye tracking. The bottleneck of manual input is removed and replaced by intuitive gaze control. The game's difficulty now depends on the player's mental state. The battle in the game becomes a battle against oneself: balancing mental states throughout the game.
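As a sketch of this coordination, the game could first check that all required LSL streams are present before starting. The stream types used here ("EEG", "Gaze", "Prediction") are assumptions about how the LiveAmp, the eye tracker and the BCILAB outputs might be labelled, not the actual names used in the study.

# Illustrative sketch: verify that all LSL streams needed by the game are online.
from pylsl import resolve_streams

REQUIRED_TYPES = {"EEG", "Gaze", "Prediction"}   # LiveAmp, eye tracker, BCILAB outputs

def missing_streams():
    """Return the set of required LSL stream types that are not yet available."""
    available = {info.type() for info in resolve_streams(wait_time=2.0)}
    return REQUIRED_TYPES - available

missing = missing_streams()
if missing:
    raise RuntimeError("Missing LSL streams: " + ", ".join(sorted(missing)))
print("All required streams found; starting M(eye)ndtris.")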