IRC-SPHERE / Sphere Challenge

Licence: MIT

SPHERE Challenge: Activity Recognition with Multimodal Sensor Data

All use of the data must cite the following:

Niall Twomey, Tom Diethe, Meelis Kull, Hao Song, Massimo Camplani, Sion Hannuna, Xenofon Fafoutis, Ni Zhu, Pete Woznowski, Peter Flach, and Ian Craddock. The SPHERE Challenge: Activity Recognition with Multimodal Sensor Data. 2016.

BibTeX:

@article{twomey2016sphere,
  title={The {SPHERE} Challenge: Activity Recognition with Multimodal Sensor Data},
  author={Twomey, Niall and Diethe, Tom and Kull, Meelis and Song, Hao and Camplani, Massimo and Hannuna, Sion and Fafoutis, Xenofon and Zhu, Ni and Woznowski, Pete and Flach, Peter and others},
  journal={arXiv preprint arXiv:1603.00797},
  year={2016}
}

This dataset has an associated homepage, and a number of processing and visualisation scripts can be found in the challenge GitHub repository.

INTRODUCTION

The task of this challenge is to predict aspects of the activities of residents within a smart home based only on observed sensor data. Sensor data are obtained from the following three sensing modalities:

  • A wrist-worn accelerometer
  • Video + Depth (RGB-D)
  • Passive environmental presence sensors

The data and file formats are described in the following section.

DATA

Training data and testing data can be found in the ‘train’ and ‘test’ subdirectories respectively. The recorded data are collected under unique codes (each recording will be referred to as a ‘data sequence’). Timestamps are rebased to be relative to the start of the sequences, i.e. for a sequence of length 10 seconds, all timestamps will be within the range 0-10 seconds.

Each data sequence contains the following files:

  • targets.csv (available only with training data)
  • pir.csv
  • acceleration.csv
  • video_hallway.csv
  • video_living_room.csv
  • video_kitchen.csv

The following files are also available within the training directory:

  • annotations_*.csv
  • locations_*.csv

The data from annotations_*.csv is used to create the targets.csv file, and the locations_*.csv files are available for participants that want to model indoor localisation. These are only available for the training set.

The dataset may be downloaded from data.bris; the row titled 'Compressed Dataset' in the table there provides a link to download the full dataset.
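
As a rough loading sketch (assuming the directory layout described above and the pandas library; the load_sequence helper and the 'train' path are illustrative, not part of the official scripts):

from pathlib import Path
import pandas as pd

def load_sequence(seq_dir):
    """Load every CSV file in one data sequence into a dict of DataFrames."""
    return {csv.stem: pd.read_csv(csv) for csv in Path(seq_dir).glob('*.csv')}

# One entry per data sequence code found under the training directory.
train_root = Path('train')
sequences = {d.name: load_sequence(d)
             for d in sorted(train_root.iterdir()) if d.is_dir()}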

targets.csv (available in train only)

This file contains the probabilistic targets for classification. Multiple annotators may have annotated each sequence, and this file aggregates all of the annotations over one second windows. The mean duration of each label within this window is used as the target variable.

The following 20 activities are labelled:

annotation_names = ('a_ascend', 'a_descend', 'a_jump', 'a_loadwalk', 'a_walk', 'p_bent', 'p_kneel', 'p_lie', 'p_sit', 'p_squat', 'p_stand', 't_bend', 't_kneel_stand', 't_lie_sit', 't_sit_lie', 't_sit_stand', 't_stand_kneel', 't_stand_sit', 't_straighten', 't_turn')

The prefix ‘a_’ indicates an ambulation activity (i.e. an activity consisting of continuing movement), ‘p_’ annotations indicate static postures (i.e. times when the participants are stationary), and ‘t_’ annotations indicate posture-to-posture transitions.

This file contains 22 columns:

  • start: The starting time of the window
  • end: The ending time of the window
  • targets: columns 3-22, the 20 probabilistic targets (one per activity)
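
For illustration, a minimal sketch (assuming pandas; the sequence code '00001' is hypothetical) that recovers the single most probable activity in each one-second window from the 20 target columns:

import pandas as pd

targets = pd.read_csv('train/00001/targets.csv')  # hypothetical sequence code
probs = targets.iloc[:, 2:22]                     # columns 3-22: the 20 targets
targets['dominant'] = probs.idxmax(axis=1)        # most probable label per window
print(targets[['start', 'end', 'dominant']].head())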

pir.csv (available for train and test)

This file contains the start time and duration for all PIR sensors in the smart environment. A PIR sensor is located in every room:

pir_locations = ('bath', 'bed1', 'bed2', 'hall', 'kitchen', 'living', 'stairs', 'study', 'toilet')

The columns of this CSV file are:

  • start: the start time of the PIR sensor activation (relative to the start of the sequence)
  • end: the end time of the PIR sensor activation (relative to the start of the sequence)
  • name: the name of the PIR sensor being activated (from the above list)
  • index: the index of the activated sensor from the pir_locations list starting at 0
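
For example, the activation intervals can be rasterised into one-second binary features, one column per sensor (a sketch assuming numpy and pandas; the sequence code is hypothetical):

import numpy as np
import pandas as pd

pir_locations = ('bath', 'bed1', 'bed2', 'hall', 'kitchen', 'living',
                 'stairs', 'study', 'toilet')

pir = pd.read_csv('train/00001/pir.csv')        # hypothetical sequence code
n_windows = int(np.ceil(pir['end'].max()))      # one row per one-second window
features = np.zeros((n_windows, len(pir_locations)))
for _, row in pir.iterrows():
    lo, hi = int(row['start']), int(np.ceil(row['end']))
    features[lo:hi, int(row['index'])] = 1.0    # sensor active during window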

acceleration.csv (available for train and test)

The acceleration file consists of eight columns:

  • t: this is the time of the recording (relative to the start of the sequence)
  • x/y/z: these are the acceleration values recorded on the x/y/z axes of the accelerometer.
  • Kitchen_AP/Lounge_AP/Upstairs_AP/Study_AP: these specify the received signal strength indication (RSSI) of the acceleration signal as received by the kitchen/lounge/upstairs/study access points. Empty values indicate that the access point did not receive the packet.
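
A simple per-second feature extraction might look like this (a sketch assuming numpy and pandas; the sequence code is hypothetical):

import numpy as np
import pandas as pd

acc = pd.read_csv('train/00001/acceleration.csv')   # hypothetical sequence code
acc['magnitude'] = np.sqrt(acc['x']**2 + acc['y']**2 + acc['z']**2)
acc['window'] = acc['t'].astype(int)                # assign one-second windows
summary = acc.groupby('window')[['x', 'y', 'z', 'magnitude']].agg(['mean', 'std'])

# Missing RSSI values mean the packet was not received by that access point,
# so presence/absence is itself a coarse location feature.
seen_by_kitchen = acc['Kitchen_AP'].notna()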

video_*.csv (available for train and test)

The following columns are found in the video_hallway.csv, video_kitchen.csv and video_living_room.csv files:

  • t: the current time (relative to the start of the sequence)
  • centre_2d_x/centre_2d_y: the x and y coordinates of the centre of the 2D bounding box
  • bb_2d_br_x/bb_2d_br_y: the x and y coordinates of the bottom right (br) corner of the 2D bounding box
  • bb_2d_tl_x/bb_2d_tl_y: the x and y coordinates of the top left (tl) corner of the 2D bounding box
  • centre_3d_x/centre_3d_y/centre_3d_z: the x, y and z coordinates of the centre of the 3D bounding box
  • bb_3d_brb_x/bb_3d_brb_y/bb_3d_brb_z: the x, y and z coordinates of the bottom right back corner of the 3D bounding box
  • bb_3d_flt_x/bb_3d_flt_y/bb_3d_flt_z: the x, y and z coordinates of the front left top corner of the 3D bounding box
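
For instance, a per-detection 3D bounding-box volume can be derived from the two corner columns (a sketch assuming pandas; the sequence code is hypothetical):

import pandas as pd

video = pd.read_csv('train/00001/video_living_room.csv')  # hypothetical sequence code
dx = (video['bb_3d_flt_x'] - video['bb_3d_brb_x']).abs()
dy = (video['bb_3d_flt_y'] - video['bb_3d_brb_y']).abs()
dz = (video['bb_3d_flt_z'] - video['bb_3d_brb_z']).abs()
video['bb_volume'] = dx * dy * dz   # rough proxy for posture (e.g. standing vs lying)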

SUPPLEMENTARY FILES

The following two sets of files need not be used for the challenge, but are included for participants who wish to perform additional modelling of the sensor environment.

locations_*.csv (available in train only)

These files label the room occupied by the recruited participant over time. The following rooms are labelled:

location_names = ('bath', 'bed1', 'bed2', 'hall', 'kitchen', 'living', 'stairs', 'study', 'toilet')

Each locations_*.csv file contains the following four columns:

  • start: the time a participant entered a room (relative to the start of the sequence)
  • end: the time the participant left the room (relative to the start of the sequence)
  • name: the name of the room (from the above list)
  • index: the index of the room name starting at 0
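
Looking up the occupied room at a given time is then a simple interval query (a sketch assuming pandas; the file name and timestamp are illustrative):

import pandas as pd

def room_at(locations, t):
    """Return the name of the room whose [start, end) interval contains t."""
    hit = locations[(locations['start'] <= t) & (t < locations['end'])]
    return hit['name'].iloc[0] if len(hit) else None

locations = pd.read_csv('train/00001/locations_1.csv')  # hypothetical file name
print(room_at(locations, 12.5))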

annotations_*.csv (available in train only)

The annotations_*.csv files contain the annotations that were provided by the annotators. Each file contains the following columns:

  • start: the start time of the activity (relative to the start of the sequence)
  • end: the end time of the activity (relative to the start of the sequence)
  • name: the name of the label (from the list of annotation_names)
  • index: the index of the label name starting at 0
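
As a sketch of how targets.csv can be reconstructed from these files (assuming numpy and pandas; file names are illustrative), accumulate each annotation's overlap with every one-second window, then average across annotators:

import numpy as np
import pandas as pd

ann = pd.read_csv('train/00001/annotations_1.csv')  # hypothetical file name
n_windows = int(np.ceil(ann['end'].max()))
targets = np.zeros((n_windows, 20))                 # 20 labelled activities
for _, row in ann.iterrows():
    for w in range(int(row['start']), int(np.ceil(row['end']))):
        # duration of this annotation inside the window [w, w + 1)
        overlap = min(row['end'], w + 1) - max(row['start'], w)
        targets[w, int(row['index'])] += overlap
# With several annotators, average the per-annotator matrices to obtain
# the probabilistic targets.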