dataset-ssvep-exoskeleton

Licence: CC0-1.0

Introduction

This dataset gathers SSVEP-based BCI recordings of 12 subjects operating an upper limb exoskeleton during a shared control task. The exoskeleton is controlled either with a touchless interface detecting hand poses or with an SSVEP-based BCI.

This dataset is part of the MOABB project and can be downloaded and used directly via the SSVEPExo() class.
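
As a minimal sketch, assuming MOABB is installed (the exact return structure may vary across MOABB versions):

    from moabb.datasets import SSVEPExo

    # Instantiate the MOABB wrapper for this dataset
    dataset = SSVEPExo()

    # get_data returns a nested dictionary of MNE Raw objects,
    # indexed by subject, session and run
    data = dataset.get_data(subjects=[1])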

Related publications:

This dataset is used in the following publications:

  • Emmanuel K. Kalunga, Sylvain Chevallier, Olivier Rabreau, Eric Monacelli. Hybrid interface: Integrating BCI in Multimodal Human-Machine Interfaces. IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM), 2014, Besancon, France.
  • Emmanuel Kalunga, Sylvain Chevallier, Quentin Barthelemy. Data augmentation in Riemannian space for Brain-Computer Interfaces, STAMLINS (ICML workshop), 2015, Lille, France.
  • Emmanuel K. Kalunga, Sylvain Chevallier, Quentin Barthelemy. Online SSVEP-based BCI using Riemannian Geometry. Neurocomputing, 2016. Research report available as arXiv:1501.03227.

Note that for these publications, the authors excluded the following sessions due to protocol issues (synchronization problems, hardware failures, etc.): subject 08 session record-[2013.04.06-16.22.32]; subject 10 sessions record-[2014.02.26-15.10.48] and record-[2014.02.26-15.50.09]; subject 11 sessions record-[2014.02.24-17.56.37] and record-[2014.02.24-18.02.40].

Experimental setup

  1. Arm exoskeleton: The exoskeleton used here is the ESTA robotic arm [1]. ESTA is designed to compensate for muscular dystrophy in the shoulder and elbow muscles occurring in several degenerative diseases, which affect the large muscles but spare the motor capacities of the wrists and hands.

  2. Touchless interface: Our touchless interface embeds 5 IR sensors, which can be set up in different spatial positions according to the user's requirements. The control system relies on an iterative kNN scheme to learn the hand poses of each user. The details of the algorithm are provided in [2].

  3. Steady-state visually evoked potentials: The g.Mobilab+ device is used for recording EEG at 256 Hz on 8 channels. For SSVEP stimulation, a flash stimulus technique was chosen. To avoid the limitations imposed by the refresh rate of computer screens, a microcontroller flashes stimuli with light-emitting diodes (LEDs) at frequencies F = {13, 17, 21} Hz; the LED blinking is controlled with millisecond precision. The eight electrodes are placed according to the 10/20 system on Oz, O1, O2, POz, PO3, PO4, PO7 and PO8. The ground was placed on Fz and the reference was located on the right (or left) ear mastoid.
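
For illustration, this montage can be declared in MNE along the following lines (a sketch, assuming MNE is installed; it is not part of the dataset itself):

    import mne

    # Recording parameters taken from the description above
    CHANNELS = ['Oz', 'O1', 'O2', 'POz', 'PO3', 'PO4', 'PO7', 'PO8']
    SFREQ = 256  # Hz

    info = mne.create_info(ch_names=CHANNELS, sfreq=SFREQ, ch_types='eeg')
    info.set_montage('standard_1020')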

Data description

The dataset contains 12 directories, one per subject, with recordings from 12 male and female subjects aged between 20 and 28 years. Informed consent was obtained from all subjects; each one signed a form attesting to her or his consent. The subject sits in an electric wheelchair, with the right upper limb resting on the exoskeleton. The exoskeleton is functional but is not used during the recording of this experiment.

A panel of size 20x30 cm is attached on the left side of the chair, with 3 groups of 4 LEDs blinking at different frequencies. Although the panel is on the left side, the user can see it without moving their head. The subjects were asked to sit comfortably in the wheelchair and to follow the auditory instructions; they could move and blink freely.

A sequence of trials is proposed to the user. A trial begins with an audio cue indicating which LED to focus on, or, for the reject class, to focus on a fixation point set at an equal distance from all LEDs. A trial lasts 5 seconds and there is a 3-second pause between trials. The evaluation is conducted during a session consisting of 32 trials, with 8 trials for each frequency (13 Hz, 17 Hz and 21 Hz) and 8 trials for the reject class, i.e. when the subject is not focusing on any specific blinking LED.

The recordings are saved in GDF format [3]; the stimulation codes for each class are available as time events. There are between 2 and 5 sessions for each user, recorded on different days, by the same operators, on the same hardware and under the same conditions.
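
As a sketch, a single GDF session can be opened with MNE as follows (the file name below is hypothetical):

    import mne

    # Hypothetical path to one recorded session
    fname = 'subject01/record-[2012.07.06-19.06.14].gdf'

    raw = mne.io.read_raw_gdf(fname, preload=True)
    # The stimulation codes are stored as annotations; convert them
    # to an events array for further processing
    events, event_id = mne.events_from_annotations(raw)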

To allow direct Python processing, a gzipped pickle version of the data is also available as '.pz' files. Sample code to use these files is provided below.
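
A minimal loading sketch, assuming each pickled object is a dictionary holding the signal and event arrays under keys such as 'raw_signal', 'event_pos' and 'event_type' (assumed names; check them against the actual files):

    import gzip
    import pickle

    # Hypothetical path to one '.pz' session file
    fname = 'subject01/record-[2012.07.06-19.06.14].pz'

    with gzip.open(fname, 'rb') as f:
        o = pickle.load(f, encoding='latin1')

    raw_signal = o['raw_signal']  # EEG samples, one column per channel
    event_pos = o['event_pos']    # sample index of each stimulation code
    event_type = o['event_type']  # OpenViBE stimulation codes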

The stimulation codes used in the GDF files are those defined by OpenViBE:

  • ExperimentStart: 32769, 0x00008001
  • ExperimentStop: 32770, 0x00008002
  • VisualStimulationStart: 32779, 0x0000800b
  • VisualStimulationStop: 32780, 0x0000800c
  • Label_00: 33024, 0x00008100
  • Label_01: 33025, 0x00008101
  • Label_02: 33026, 0x00008102
  • Label_03: 33027, 0x00008103

The stimulation codes are used as follows: ExperimentStart and ExperimentStop indicate the beginning and the end of the session. A trial starts with a Label_XX stimulation code indicating the class of the example; there is a 3 s pause before the audio cue indicating the stimulus to focus on. The audio cue onset is marked by VisualStimulationStart, which is the start of the trial. The end of the trial occurs 5 s later and is marked by VisualStimulationStop. Label_00 is the resting class, Label_01 the 13 Hz stimulation, Label_02 the 21 Hz stimulation and Label_03 the 17 Hz stimulation.
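
Putting the codes and timing together, here is a sketch of trial extraction from the arrays loaded above (assuming raw_signal is shaped samples x channels and sampled at 256 Hz):

    import numpy as np

    FS = 256            # sampling rate (Hz)
    TRIAL_LEN = 5 * FS  # trials last 5 seconds

    VISUAL_START = 32779  # VisualStimulationStart: trial onset
    LABELS = {33024: 'rest', 33025: '13Hz', 33026: '21Hz', 33027: '17Hz'}

    def extract_trials(raw_signal, event_pos, event_type):
        """Slice one 5 s epoch per trial, labelled by the preceding Label_XX code."""
        epochs, labels = [], []
        current = None
        for pos, typ in zip(np.ravel(event_pos), np.ravel(event_type)):
            if typ in LABELS:
                current = LABELS[typ]
            elif typ == VISUAL_START and current is not None:
                epochs.append(raw_signal[pos:pos + TRIAL_LEN])
                labels.append(current)
        return np.array(epochs), np.array(labels)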

Example code

Two example notebooks explain how to extract covariance matrices from the dataset based on the stimulation codes and how to use a Riemannian-based classification scheme with the extracted spatial covariance matrices. It is also possible to rely on MNE to open and process this dataset.
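
For reference, one possible pipeline of this kind, built with pyriemann and scikit-learn (a sketch under the assumption that X holds the trials as an n_trials x n_channels x n_samples array and y the class labels; it is not necessarily the exact approach of the notebooks):

    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from pyriemann.estimation import Covariances
    from pyriemann.classification import MDM

    # Estimate one spatial covariance matrix per trial, then classify
    # with minimum distance to the Riemannian mean of each class
    clf = make_pipeline(Covariances(estimator='lwf'),
                        MDM(metric='riemann'))

    # X: epochs as (n_trials, n_channels, n_samples), e.g. the output of
    # extract_trials() above with each epoch transposed to channels x samples
    scores = cross_val_score(clf, X, y, cv=5)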

Bibliography

[1] M. Baklouti, P. A. Guyot, E. Monacelli, and S. Couvet. Force controlled upper-limb powered exoskeleton for rehabilitation, in Intelligent Robots and Systems (IROS), 2008, p. 4202.

[2] H. Martin, S. Chevallier, and E. Monacelli. Fast calibration of hand movement-based interface for arm exoskeleton control, in European Symposium on Artificial Neural Networks (ESANN), 2012, pp. 573–578.

[3] A. Schlögl. GDF - A general dataformat for biosignals, http://arxiv.org/abs/cs/0608052
