
NTX-McGill / NeuroTechX-McGill-2019

Licence: other
A wheelchair controlled by EEG brain signals and enhanced with assisted driving

Programming Languages

• python: 139,335 projects (#7 most used programming language)
• CSS: 56,736 projects
• javascript: 184,084 projects (#8 most used programming language)
• HTML: 75,241 projects
• C++: 36,643 projects (#6 most used programming language)

Projects that are alternatives to, or similar to, NeuroTechX-McGill-2019

• EEG-Motor-Imagery-Classification-CNNs-TensorFlow: EEG Motor Imagery Tasks Classification (by Channels) via Convolutional Neural Networks (CNNs) based on TensorFlow. Stars: ✭ 125 (+95.31%). Mutual labels: eeg, brain-computer-interface
• neurosky-android-sdk: Android SDK for the NeuroSky MindWave Mobile Brainwave Sensing Headset. Stars: ✭ 39 (-39.06%). Mutual labels: eeg, brain-computer-interface
• CereLink: Blackrock Microsystems Cerebus Link for Neural Signal Processing. Stars: ✭ 33 (-48.44%). Mutual labels: eeg, brain-computer-interface
• pyRiemann: Python machine learning package based on the sklearn API for multivariate data processing and statistical analysis of symmetric positive definite matrices via Riemannian geometry. Stars: ✭ 470 (+634.38%). Mutual labels: eeg, brain-computer-interface
• sigviewer: SigViewer is a viewing application for biosignals. Stars: ✭ 109 (+70.31%). Mutual labels: eeg
• brain-monitor: A terminal app written in Node.js to monitor brain signals in real time. Stars: ✭ 119 (+85.94%). Mutual labels: eeg
• EEGReader: EEG Reader is an Android mobile application, which reads EEG signals from a NeuroSky mobile device connected to a smartphone via Bluetooth. Stars: ✭ 36 (-43.75%). Mutual labels: eeg
• hnn-core: Simulation and optimization of neural circuits for MEG/EEG source estimates. Stars: ✭ 24 (-62.5%). Mutual labels: eeg
• eeg-rsenet: Motor Imagery EEG Signal Classification Using Random Subspace Ensemble Network. Stars: ✭ 24 (-62.5%). Mutual labels: eeg
• Neuroimaging.jl: Neuroimaging in Julia. Stars: ✭ 39 (-39.06%). Mutual labels: eeg
• eeg-gcnn: Resources for the paper "EEG-GCNN: Augmenting Electroencephalogram-based Neurological Disease Diagnosis using a Domain-guided Graph Convolutional Neural Network", accepted for publication (with an oral spotlight) at the ML4H Workshop, NeurIPS 2020. Stars: ✭ 50 (-21.87%). Mutual labels: eeg
• python-meegkit: 🔧🧠 MEEGkit: MEG & EEG processing toolkit in Python 🧠🔧. Stars: ✭ 99 (+54.69%). Mutual labels: eeg
• antropy: AntroPy: entropy and complexity of (EEG) time series in Python. Stars: ✭ 111 (+73.44%). Mutual labels: eeg
• eeg-adapt: Source code for "Adaptive Transfer Learning with Deep CNN for EEG Motor Imagery Classification". Stars: ✭ 32 (-50%). Mutual labels: eeg
• candock: A time-series signal analysis and classification framework. Stars: ✭ 56 (-12.5%). Mutual labels: eeg
• Deep-Learning-for-BCI: Resources for the book "Deep Learning for EEG-Based Brain-Computer Interface: Representations, Algorithms and Applications". Stars: ✭ 63 (-1.56%). Mutual labels: eeg
• BioAmp-v1.5: Upside Down Lab's Biopotential Amplifier v1.5. Buy on Tindie at https://bit.ly/BioAmp-v1_5. Stars: ✭ 27 (-57.81%). Mutual labels: eeg
• mne-bids: MNE-BIDS is a Python package that allows you to read and write BIDS-compatible datasets with the help of MNE-Python. Stars: ✭ 88 (+37.5%). Mutual labels: eeg
• hnn: The Human Neocortical Neurosolver (HNN) is a software tool that gives researchers and clinicians the ability to develop and test hypotheses on the circuit mechanisms underlying EEG/MEG data. Stars: ✭ 62 (-3.12%). Mutual labels: eeg
• qEEG feature set: NEURAL, a neonatal EEG feature set in Matlab. Stars: ✭ 29 (-54.69%). Mutual labels: eeg

Milo: The Brain-Controlled Wheelchair

Milo helps people navigate without the use of hands or limbs. We think it could be especially useful for people with ALS, locked-in syndrome, or other forms of paralysis.

Our brain-computer interface uses electroencephalography (EEG), an affordable, accessible, and non-invasive technique for detecting brain activity. Specifically, Milo turns using motor imagery signals: when users imagine moving, the mu rhythm (7-13 Hz) over the sensorimotor cortex (the brain area associated with movement) is suppressed, and Milo detects this suppression. In addition to motor imagery, eye-blink signals and jaw artifacts are used to initiate starts and stops and to indicate the desire to turn. With Milo, users toggle between moving forward and stopping by blinking their eyes or clenching their jaw, and they turn left or right simply by thinking about left- and right-hand movements.

We also designed a web application for caregivers, from which they can view the wheelchair user's location in real time to ensure their safety. A text message is sent to the caregiver if the user's heart rate is abnormal or a crash occurs. In addition, we implemented assisted-driving features for wall following and object avoidance.

GitHub Navigation

  • \offline contains raw EEG data and scripts for offline analysis and visualization
    • \offline\data contains raw EEG recordings from consenting, anonymized participants; each folder holds the recordings for a single participant, and the data collection paradigm is specified in that folder's README.md
    • \offline\signal-processing contains scripts for signal processing
    • \offline\visualization contains scripts to visualize the data
    • \offline\ML contains scripts for feature classification
  • \robotics contains scripts to interface with the Arduino hardware connected to the wheelchair
  • \src contains software for the dashboard, the assisted-driving features, the caregiver app, and the real-time rendering
    • \src\dashboard contains the dashboard software, as well as the instructions to set it up and launch it; this is a user interface for testing the wheelchair
    • \src\real-time contains scripts to classify EEG signals acquired in real time, to send/receive data from the wheelchair, and for assisted driving
    • \src\caregiver-app contains the web app and text messaging for the caregiver

Project Pipeline

[Figure: project pipeline]

Data Collection

Medical-grade Ten20 EEG conductive paste was used to secure four passive gold cup electrodes directly onto the scalp of the user. The four electrodes used to collect motor imagery data were placed over the sensorimotor cortex according to the 10/20 system (channels C1, C2, C3, and C4), and two reference electrodes were placed on the subject's earlobes. For heart-rate detection, an additional electrode was placed on the left wrist. The reference and heart-rate electrodes were secured with electrical tape. Raw EEG data were acquired with OpenBCI's 8-channel, 32-bit Cyton biosensing board connected to a laptop.
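
As a rough illustration, a two-second window of raw data could be pulled from the Cyton board with the BrainFlow library. This is a sketch under our own assumptions (BrainFlow itself and the serial port are not confirmed by the project; its actual acquisition code lives in \offline and \src\real-time):

```python
# Hypothetical acquisition sketch using BrainFlow; the library choice and
# serial port are assumptions, not the project's own code.
import time

from brainflow.board_shim import BoardIds, BoardShim, BrainFlowInputParams

params = BrainFlowInputParams()
params.serial_port = "/dev/ttyUSB0"  # adjust for your OpenBCI dongle

board = BoardShim(BoardIds.CYTON_BOARD, params)
board.prepare_session()
board.start_stream()

time.sleep(2)                               # let the ring buffer fill
window = board.get_current_board_data(500)  # 2 s at 250 Hz, channels x samples

board.stop_stream()
board.release_session()
```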

To collect training data, users were presented with a cue (right, left, or rest), during which they were instructed to imagine moving their right hand or left hand, or to relax. Neurofeedback was provided in the form of bar plots indicating the strength of their motor imagery.
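
A minimal sketch of such a cue loop is shown below; the trial count, timing, and logging format are illustrative, not the project's actual paradigm:

```python
import random
import time

CUES = ["left", "right", "rest"]

def run_session(n_trials=30, cue_seconds=4):
    """Present randomized cues and log (timestamp, label) pairs for later training."""
    log = []
    for _ in range(n_trials):
        cue = random.choice(CUES)
        print(f"Imagine: {cue}")  # the real dashboard also rendered neurofeedback bars
        log.append((time.time(), cue))
        time.sleep(cue_seconds)
    return log
```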

Signal Processing

Our target frequency ranges are the mu rhythm (7-13 Hz), monitored when the subjects are at rest, and the beta rhythm (13-36 Hz), monitored when the subjects blink their eyes. To process real-time data, we sampled at 250 Hz with a two-second time window. The signal was first notch-filtered at 60 Hz and 120 Hz with a Butterworth filter to remove power-line noise. After pre-processing, we estimated the power spectral density (PSD) with Welch's method and extracted the power in the mu and beta bands.
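
A sketch of this processing chain with SciPy, using the parameters stated above (the filter order and notch width are our assumptions):

```python
import numpy as np
from scipy import signal

FS = 250         # sampling rate (Hz)
MU = (7, 13)     # mu band (Hz)
BETA = (13, 36)  # beta band (Hz)

def remove_line_noise(x, fs=FS):
    """Band-stop Butterworth filters at 60 Hz and 120 Hz (order and width assumed)."""
    for f0 in (60, 120):
        b, a = signal.butter(4, [f0 - 1, f0 + 1], btype="bandstop", fs=fs)
        x = signal.filtfilt(b, a, x)
    return x

def band_power(x, band, fs=FS):
    """Average Welch PSD within a frequency band."""
    freqs, psd = signal.welch(x, fs=fs, nperseg=fs)  # 1 s segments
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

# Example on one two-second channel window (random data as a stand-in):
window = remove_line_noise(np.random.randn(2 * FS))
mu_power, beta_power = band_power(window, MU), band_power(window, BETA)
```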

Machine Learning

The paradigm used to move, turn, and stop the wheelchair alternates between three states: Rest, Stop, and Intermediate. Motor imagery classification takes place within the Intermediate state, which outputs either a full stop or a command to turn the wheelchair in the appropriate direction. Artifacts, such as jaw clenches or eye blinks, are used to switch from one state to another: a sustained artifact signal commands the wheelchair to move to the next state.
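
A minimal sketch of this state toggling follows; the sustain length and the transition order are assumptions, and the real Intermediate state additionally dispatches turn commands from the motor imagery classifier:

```python
STATES = ["Rest", "Intermediate", "Stop"]

class StateToggle:
    """Advance one state per sustained artifact (jaw clench or eye blink)."""

    def __init__(self, sustain_needed=3):
        self.index = 0                       # start in Rest (assumed)
        self.sustain_needed = sustain_needed
        self.run = 0                         # consecutive windows with an artifact

    def update(self, artifact_detected):
        self.run = self.run + 1 if artifact_detected else 0
        if self.run >= self.sustain_needed:  # artifact sustained long enough
            self.index = (self.index + 1) % len(STATES)
            self.run = 0
        return STATES[self.index]
```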

A linear regression is used to classify the motor imagery state of the user in real time. The feature used in the regression is the average mu band power, computed as the average over the frequencies of interest at each time point. The linear regression thus yields a motor imagery state for every time point; the direction that occurs most often within a 3-second window is the final decision and is fed to the wheelchair.
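
A hedged sketch of this classify-then-vote scheme, with synthetic training data and an assumed label coding of -1 = left, 0 = rest, +1 = right:

```python
from collections import Counter

import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X_train = rng.random((300, 1))      # stand-in feature: average mu band power
y_train = rng.integers(-1, 2, 300)  # -1 = left, 0 = rest, +1 = right (assumed coding)

model = LinearRegression().fit(X_train, y_train)

def decide(window_features):
    """Predict a state per time point, then majority-vote over the window."""
    raw = model.predict(window_features)
    states = np.rint(raw).clip(-1, 1).astype(int)  # snap regression output to a class
    return Counter(states.tolist()).most_common(1)[0][0]

print(decide(rng.random((750, 1))))  # 3 s of time points at 250 Hz
```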

If no motor imagery signals are detected and a jaw clench or eye blink is sustained, the wheelchair comes to a stop. Sustaining these artifacts again sets the wheelchair moving forward once more.

Dashboard

Our dashboard acts as the hub for both data collection and real-time control of the wheelchair.

During data collection, the experimenter can create a queue of thoughts ("left", "right", "rest") for the subject to think about. After each trial, a CSV file is written with the collected EEG data and its corresponding thought label. Spectrograms are displayed in real time to assist the experimenter, and neurofeedback is integrated by displaying the mu bands.
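
The trial files might resemble the sketch below; the actual CSV schema is documented in the \offline\data READMEs, so the column layout here is an assumption:

```python
import csv

def save_trial(path, eeg_window, label):
    """Append one trial: one row per sample (channel values) plus the thought label."""
    with open(path, "a", newline="") as f:
        writer = csv.writer(f)
        for sample in eeg_window:  # eeg_window: iterable of per-sample channel values
            writer.writerow(list(sample) + [label])

save_trial("trial.csv", [[1.2, 0.8, 1.1, 0.9]], "left")
```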

During real-time control of the wheelchair, bar graphs and charts display the machine learning confidence, and the sensor readings are shown on screen.

Caregiver Web App

An application capable of sending the wheelchair's location to the caregiver in real time was designed as a safety measure for wheelchair users. A notification feature is implemented so that the caregiver receives a text via Twilio, a cloud communication platform, when the user of the wheelchair experiences trouble or distress (e.g., obstacles, trauma, high stress, or a malfunction). The location information is obtained through the location services of the user's smartphone. The measure of stress that dictates whether to send an alert is currently based on heart-rate monitoring: once the heart rate exceeds a pre-established threshold customized to the user's resting heart rate, the caregiver is alerted that the user might require assistance.
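
The alert logic likely reduces to something like the following Twilio sketch; the credentials, phone numbers, and threshold rule are placeholders, not the project's values:

```python
from twilio.rest import Client

ACCOUNT_SID = "ACxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"  # placeholder credentials
AUTH_TOKEN = "your_auth_token"
client = Client(ACCOUNT_SID, AUTH_TOKEN)

def check_heart_rate(bpm, resting_bpm, caregiver_number, factor=1.5):
    """Text the caregiver when heart rate exceeds a user-specific threshold (assumed rule)."""
    if bpm > factor * resting_bpm:
        client.messages.create(
            body=f"Alert: wheelchair user's heart rate is {bpm} bpm.",
            from_="+15550001111",  # placeholder Twilio number
            to=caregiver_number,
        )
```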

Hardware

The commercially available Orthofab Oasis 2008 wheelchair was modified and customized to fit the needs of the project. The motor controller of the wheelchair was replaced with two commercial-grade 40A, 12V PWM controllers connected to an Arduino Uno. Furthermore, the seat of the wheelchair was reupholstered and custom-built footrests were installed. Four motion sensors were installed around the circumference of the footrest for the implementation of the assisted-driving feature.
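
Commands from the real-time pipeline presumably reach the Arduino over a serial link. Below is a pySerial sketch under an assumed single-character protocol; the port, baud rate, and command set are hypothetical, and the actual interface is defined in \robotics:

```python
import serial

# Assumed single-character command protocol; the real one lives in \robotics.
COMMANDS = {"forward": b"F", "left": b"L", "right": b"R", "stop": b"S"}

ser = serial.Serial("/dev/ttyACM0", 9600, timeout=1)  # port and baud are assumptions

def send(command):
    """Forward a high-level command to the Arduino motor controller."""
    ser.write(COMMANDS[command])

send("forward")
```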

Assisted Driving

Relying on motor imagery for finer navigation is challenging, if not impossible. We therefore created an assisted-driving model that refines the movements involved in straight navigation. The model has two primary functions: wall following and object avoidance.

To detect whether the user is following a wall, two ultrasonic sensors (one on the left and one on the right) continuously monitor the wheelchair's position relative to a potential wall. To determine whether a wall is present, a linear regression model is fit to the last 5 seconds of sensor data collected from each side. A threshold on the standard error determines whether the wheelchair is approaching a wall from the side or is parallel to a wall. If a wall is detected, the optimal distance to the wall is calculated as the median of the data collected 1 to 5 seconds earlier. If the difference between the current and optimal distances to the wall is large, a slight turn is executed to correct it.
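
A sketch of this wall-detection step with NumPy; the sensor polling rate and error threshold are assumptions:

```python
import numpy as np

FS_SENSOR = 10  # assumed sensor polling rate (Hz)

def wall_correction(distances, stderr_threshold=5.0):
    """Fit a line to the last 5 s of one side sensor's distances (cm).

    A low residual error suggests a wall; the correction is the gap between
    the current distance and the median of the readings from 1-5 s ago.
    """
    t = np.arange(len(distances)) / FS_SENSOR
    slope, intercept = np.polyfit(t, distances, 1)
    stderr = (distances - (slope * t + intercept)).std(ddof=2)
    if stderr > stderr_threshold:                # too scattered: no wall detected
        return None
    optimal = np.median(distances[:-FS_SENSOR])  # readings from 1-5 s ago
    return distances[-1] - optimal               # positive: drifting away from the wall

readings = 50 + np.random.randn(5 * FS_SENSOR)   # 5 s of noisy readings near 50 cm
print(wall_correction(readings))
```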

The second function of the assisted-driving paradigm is obstacle avoidance. The two sensors used in wall following are combined with a forward-facing sensor and a sensor pointing at 45° from the vertical towards the ground. As the wheelchair approaches a small obstacle, the algorithm uses the chair's distance from the obstacle to determine whether there is room to navigate around it. Once the obstacle has been cleared, the wheelchair continues on the straight path it had initially set out on. If there is not room to navigate around the obstacle, the wheelchair comes to a complete stop and the user decides what action to take next. The system uses the 45° ultrasonic sensor to detect stairs or steep hills in the upcoming path and stops if either is detected.
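
The decision logic might be summarized as in the sketch below; all distance thresholds are illustrative, not the project's calibrated values:

```python
def obstacle_action(front_cm, left_cm, right_cm, drop_ahead):
    """Choose a maneuver from sensor distances (cm); thresholds are illustrative."""
    if drop_ahead:          # 45-degree sensor sees stairs or a steep hill
        return "stop"
    if front_cm > 100:      # nothing close ahead
        return "continue"
    if left_cm > 120:       # enough clearance to navigate around on the left
        return "swerve_left"
    if right_cm > 120:
        return "swerve_right"
    return "stop"           # no room: stop and let the user decide
```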

Future Directions

Our current methodology uses the jaw clench as a means of navigation, allowing the user to toggle between states. This was the most robust signal we could acquire and proved to work quite well, but it is far from ideal, considering that locked-in patients cannot produce jaw clenches. An obvious next step is to implement eye blinking, a signal that can be produced by the targeted patient populations. The small number of electrodes used in the current experiment also limits temporal and spatial resolution, which in turn limits the sensitivity of the signals we can pick up and work with.

To prevent the wheelchair from changing states too quickly, we set a minimum time of four seconds per state: the wheelchair must stay in a state for at least four seconds before it can change to another. However, this lockout also makes the wheelchair less flexible to control. For example, if the wheelchair is moving forward, it cannot stop even if it would collide with an obstacle within those four seconds. To solve this problem, we will integrate our assisted-driving features into the state commands, and we will also try to increase the real-time sensitivity to brain signals.

We are currently developing a doorway detection and navigation algorithm that would enable autonomous navigation through doorways. We chose this feature because doorways are a major hindrance to wheelchair users wishing to gain autonomy. Our current model uses an Xbox Kinect sensor for doorway mapping, planning, and navigation, as outlined in CoPilot.

Partners

A special thank you to Dr. Georgios Mitsis, our faculty advisor, and to Dr. Stefanie Blain-Moraes for lending equipment.

The Team

We are an interdisciplinary group of dedicated undergraduate students from McGill University, and our mission is to raise awareness and interest in neurotechnology. For more information, see our Facebook page or our website.
