bartkowiaktomasz / har-wisdm-lstm-rnns

Licence: other

Human Activity Recognition on the Wireless Sensor Data Mining (WISDM) dataset using LSTM Recurrent Neural Networks

This repository contains code used to recognize human activity on the Wireless Sensor Data Mining (WISDM) dataset using LSTM (Long Short-Term Memory) networks, and is heavily based on the article by Venelin Valkov.

This repository also contains an improved version of the model, in which a Bidirectional LSTM is combined with Bayesian optimisation to find an optimal architecture.

Dataset

The data used for classification is provided by the Wireless Sensor Data Mining (WISDM) Lab and can be downloaded here. It consists of 1,098,207 examples of various physical activities (sampled at 20 Hz), each with 6 attributes: user, activity, timestamp, x-acceleration, y-acceleration, z-acceleration. The activities are: Walking, Jogging, Upstairs, Downstairs, Sitting, Standing.
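Each record in the raw file holds the six attributes above as a comma-separated line (records in the raw WISDM file are terminated with a semicolon). A minimal parser might look like this; the function name is illustrative, not the repository's:

```python
def parse_record(line):
    """Parse one raw WISDM line into (user, activity, timestamp, x, y, z).

    Field order follows the dataset description; the trailing semicolon
    present in the raw file is stripped defensively.
    """
    fields = line.strip().rstrip(';').split(',')
    user, activity, timestamp, x, y, z = fields
    return int(user), activity, int(timestamp), float(x), float(y), float(z)

record = parse_record("33,Jogging,49105962326000,-0.69,12.68,0.50;")
```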

Original research done on this dataset can be found here.

Data preprocessing

The following graph shows how the x-acceleration changes with time (i.e. at each timestep) for Jogging.

(Figure: x-acceleration for Jogging)

To feed the network with such temporal dependencies, a sliding time window is used to extract separate data segments. Both the window width and the step size can be adjusted and optimised for better accuracy. Each time step is associated with an activity label, so each segment is assigned the label that appears most frequently within it. Here, the window width is set to 200 time steps and the step size to 100.
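The segmentation step described above can be sketched as follows (a minimal NumPy version with a majority vote per window; the function name and default values mirror the text but are illustrative, not the repository's exact code):

```python
import numpy as np
from collections import Counter

def segment(signal, labels, window=200, step=100):
    """Slide a fixed-width window over the signal.

    Each segment is paired with the most frequently occurring label
    inside its window (majority vote), as described in the text.
    """
    segments, seg_labels = [], []
    for start in range(0, len(signal) - window + 1, step):
        segments.append(signal[start:start + window])
        seg_labels.append(Counter(labels[start:start + window]).most_common(1)[0][0])
    return np.array(segments), np.array(seg_labels)
```

With step (100) equal to half the window (200), consecutive segments overlap by 50%, which increases the number of training examples without collecting more data.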

Input:

  • data (data/WISDM_ar_v1.1_raw.txt)

The data needs to be separated into features and labels and then further into training and test sets. Labels need to be one-hot encoded before feeding into the classifier.
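These two steps can be sketched in plain NumPy (in practice scikit-learn's `train_test_split` and `OneHotEncoder` are the usual tools; the helper names below are illustrative):

```python
import numpy as np

def one_hot(labels, classes):
    """One-hot encode string labels given a fixed class ordering."""
    index = {c: i for i, c in enumerate(classes)}
    encoded = np.zeros((len(labels), len(classes)))
    for row, lab in enumerate(labels):
        encoded[row, index[lab]] = 1.0
    return encoded

def split(X, y, test_ratio=0.2, seed=42):
    """Shuffle the examples and split them into training and test sets."""
    rng = np.random.default_rng(seed)
    order = rng.permutation(len(X))
    cut = int(len(X) * (1 - test_ratio))
    return X[order[:cut]], X[order[cut:]], y[order[:cut]], y[order[cut:]]
```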

Output:

  • Trained classifier
  • Confusion matrix graph
  • Error/Accuracy rate graph

LSTMs

Long Short-Term Memory (LSTM) Recurrent Neural Networks (RNNs) are used to model temporal data (e.g. speech recognition, NLP, human activity recognition), where some state information needs to be carried across time steps. More info on LSTMs can be found here.
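To make the "state information" concrete, a single LSTM time step can be written out directly. This is a plain-NumPy sketch of the standard LSTM cell equations, not the repository's TensorFlow implementation; the parameter layout (four gates stacked row-wise) is one common convention:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step.

    W (4*hidden, input), U (4*hidden, hidden) and b (4*hidden,) hold the
    stacked input, forget, candidate and output gate parameters.
    """
    hidden = h_prev.shape[0]
    z = W @ x + U @ h_prev + b
    i = sigmoid(z[:hidden])                 # input gate
    f = sigmoid(z[hidden:2 * hidden])       # forget gate
    g = np.tanh(z[2 * hidden:3 * hidden])   # candidate cell state
    o = sigmoid(z[3 * hidden:])             # output gate
    c = f * c_prev + i * g                  # new cell state (the "memory")
    h = o * np.tanh(c)                      # new hidden state
    return h, c
```

The cell state `c` is what lets the network remember patterns across many accelerometer samples, which is why LSTMs suit this kind of sequence classification.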

Results

The classifier achieves an accuracy of 94%, which might be improved slightly by decreasing the step size of the sliding window. The following graphs show the train/test error and accuracy for each epoch, and the final confusion matrix (normalised so that each row sums to one).

(Figures: train/test set accuracy/error per epoch; confusion matrix)
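The row normalisation mentioned above is a one-liner: each row (true class) of the raw confusion matrix is divided by its sum, so each cell becomes the fraction of that class's examples assigned to each prediction. A minimal sketch:

```python
import numpy as np

def normalize_rows(cm):
    """Normalise a confusion matrix so each row (true class) sums to one.

    Rows that sum to zero (classes absent from the test set) are left as
    zeros instead of producing a division-by-zero warning.
    """
    cm = np.asarray(cm, dtype=float)
    row_sums = cm.sum(axis=1, keepdims=True)
    return cm / np.where(row_sums == 0, 1, row_sums)
```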

Dependencies

  • matplotlib 1.5.3
  • seaborn 0.8.1
  • numpy 1.14
  • pandas 0.20.3
  • scikit-learn 0.19.1
  • tensorflow 1.5.0

Use

  1. Run the script with python3 HAR_Recognition.py