
Aminoid / React Native Activity Recognition

License: GPL-2.0
React Native wrapper for the Activity Recognition API.

Programming Languages

java

Projects that are alternatives of or similar to React Native Activity Recognition

Charades Algorithms
Activity Recognition Algorithms for the Charades Dataset
Stars: ✭ 181 (+162.32%)
Mutual labels:  activity-recognition
awesome-egocentric-vision
A curated list of egocentric (first-person) vision and related area resources
Stars: ✭ 103 (+49.28%)
Mutual labels:  activity-recognition
Lstm Human Activity Recognition
Human Activity Recognition example using TensorFlow on smartphone sensors dataset and an LSTM RNN. Classifying the type of movement amongst six activity categories - Guillaume Chevalier
Stars: ✭ 2,943 (+4165.22%)
Mutual labels:  activity-recognition
Gait-Recognition-Using-Smartphones
Deep Learning-Based Gait Recognition Using Smartphones in the Wild
Stars: ✭ 77 (+11.59%)
Mutual labels:  activity-recognition
Awesome-Human-Activity-Recognition
An up-to-date & curated list of Awesome IMU-based Human Activity Recognition(Ubiquitous Computing) papers, methods & resources. Please note that most of the collections of researches are mainly based on IMU data.
Stars: ✭ 72 (+4.35%)
Mutual labels:  activity-recognition
hamnet
PyTorch implementation of AAAI 2021 paper: A Hybrid Attention Mechanism for Weakly-Supervised Temporal Action Localization
Stars: ✭ 30 (-56.52%)
Mutual labels:  activity-recognition
C3d Keras
C3D for Keras + TensorFlow
Stars: ✭ 171 (+147.83%)
Mutual labels:  activity-recognition
Sense
Enhance your application with the ability to see and interact with humans using any RGB camera.
Stars: ✭ 522 (+656.52%)
Mutual labels:  activity-recognition
Squeeze-and-Recursion-Temporal-Gates
Code for : [Pattern Recognit. Lett. 2021] "Learn to cycle: Time-consistent feature discovery for action recognition" and [IJCNN 2021] "Multi-Temporal Convolutions for Human Action Recognition in Videos".
Stars: ✭ 62 (-10.14%)
Mutual labels:  activity-recognition
On-device-activity-recognition
Personalized machine learning on the smartphone
Stars: ✭ 46 (-33.33%)
Mutual labels:  activity-recognition
R2Plus1D-C3D
A PyTorch implementation of R2Plus1D and C3D based on CVPR 2017 paper "A Closer Look at Spatiotemporal Convolutions for Action Recognition" and CVPR 2014 paper "Learning Spatiotemporal Features with 3D Convolutional Networks"
Stars: ✭ 54 (-21.74%)
Mutual labels:  activity-recognition
glimpse clouds
Pytorch implementation of the paper "Glimpse Clouds: Human Activity Recognition from Unstructured Feature Points", F. Baradel, C. Wolf, J. Mille , G.W. Taylor, CVPR 2018
Stars: ✭ 30 (-56.52%)
Mutual labels:  activity-recognition
Robust-Deep-Learning-Pipeline
Deep Convolutional Bidirectional LSTM for Complex Activity Recognition with Missing Data. Human Activity Recognition Challenge. Springer SIST (2020)
Stars: ✭ 20 (-71.01%)
Mutual labels:  activity-recognition
Step
STEP: Spatio-Temporal Progressive Learning for Video Action Detection. CVPR'19 (Oral)
Stars: ✭ 196 (+184.06%)
Mutual labels:  activity-recognition
Awesome Action Recognition
A curated list of action recognition and related area resources
Stars: ✭ 3,202 (+4540.58%)
Mutual labels:  activity-recognition
Video Caffe
Video-friendly caffe -- comes with the most recent version of Caffe (as of Jan 2019), a video reader, 3D(ND) pooling layer, and an example training script for C3D network and UCF-101 data
Stars: ✭ 172 (+149.28%)
Mutual labels:  activity-recognition
dana
DANA: Dimension-Adaptive Neural Architecture (UbiComp'21)( ACM IMWUT)
Stars: ✭ 28 (-59.42%)
Mutual labels:  activity-recognition
Wdk
The Wearables Development Toolkit - a development environment for activity recognition applications with sensor signals
Stars: ✭ 68 (-1.45%)
Mutual labels:  activity-recognition
Activity Recognition With Cnn And Rnn
Temporal Segments LSTM and Temporal-Inception for Activity Recognition
Stars: ✭ 415 (+501.45%)
Mutual labels:  activity-recognition
Activity-Recognition-CovMIJ
Skeleton-based method for activity recognition problem
Stars: ✭ 13 (-81.16%)
Mutual labels:  activity-recognition

react-native-activity-recognition


React Native wrapper for the Android Activity Recognition API and iOS CMMotionActivity. It attempts to determine the user's current activity, such as driving, walking, running, or cycling. Possible detected activities for Android are listed here and for iOS are listed here.
Updated January 7th and tested with react-native v0.57.5

Installation

npm i -S react-native-activity-recognition

or with Yarn:

yarn add react-native-activity-recognition

Linking

Automatic

react-native link react-native-activity-recognition

IMPORTANT NOTE: You'll still need to follow Step 4 of the manual linking instructions below, for both iOS and Android.

Manual

Make alterations to the following files in your project:

Android

  1. Add the following lines to android/settings.gradle
...
include ':react-native-activity-recognition'
project(':react-native-activity-recognition').projectDir = new File(rootProject.projectDir, '../node_modules/react-native-activity-recognition/android')
...
  2. Add the compile line to dependencies in android/app/build.gradle
...
dependencies {
    ...
    compile project(':react-native-activity-recognition')
    ...
}
  3. Add the import and register the package in android/app/src/.../MainApplication.java
import com.xebia.activityrecognition.RNActivityRecognitionPackage;  // <--- add import

public class MainApplication extends Application implements ReactApplication {
    // ...
    @Override
    protected List<ReactPackage> getPackages() {
        return Arrays.<ReactPackage>asList(
            new MainReactPackage(),
            // ...
            new RNActivityRecognitionPackage()                      // <--- add package
        );
    }
}
  4. Add the activityrecognition service in android/app/src/main/AndroidManifest.xml
...
<application ...>
    ...
    <service android:name="com.xebia.activityrecognition.DetectionService"/>
    ...
</application>
...
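Depending on your Play Services version and target SDK, activity detection may also require a permission declaration in the same manifest. Older Google Play services releases use a Play-services-specific permission, while Android 10 (API 29) and above use the platform runtime permission:

```xml
<!-- Older Google Play services releases -->
<uses-permission android:name="com.google.android.gms.permission.ACTIVITY_RECOGNITION" />
<!-- Android 10 (API 29) and above: a runtime permission you must also request at runtime -->
<uses-permission android:name="android.permission.ACTIVITY_RECOGNITION" />
```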

iOS

  1. In Xcode's "Project navigator", right click on your project's Libraries folder ➜ Add Files to <...>
  2. Go to node_modules ➜ react-native-activity-recognition ➜ ios ➜ select RNActivityRecognition.xcodeproj
  3. Add RNActivityRecognition.a to Build Phases -> Link Binary With Libraries
  4. Add NSMotionUsageDescription key to your Info.plist with strings describing why your app needs this permission
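For step 4, the Info.plist entry looks like the following (the description string is your own wording and is shown to the user in the permission prompt):

```xml
<key>NSMotionUsageDescription</key>
<string>This app uses motion data to detect your current activity.</string>
```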

Usage

Class based implementation

import ActivityRecognition from 'react-native-activity-recognition'

...

// Subscribe to updates
this.unsubscribe = ActivityRecognition.subscribe(detectedActivities => {
  const mostProbableActivity = detectedActivities.sorted[0]
})

...

// Start activity detection
const detectionIntervalMillis = 1000
ActivityRecognition.start(detectionIntervalMillis)

...

// Stop activity detection and remove the listener
ActivityRecognition.stop()
this.unsubscribe()
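The flow above relies on `subscribe` returning an unsubscribe function. A minimal sketch of that listener lifecycle, using a hypothetical in-memory stub in place of the native module (the stub's `emit` is a test hook, not part of the library's API):

```javascript
// Hypothetical stub mirroring the library's subscribe/start/stop surface.
// The real module forwards events from the native Activity Recognition API.
function createActivityRecognitionStub() {
  const listeners = new Set();
  return {
    subscribe(handler) {
      listeners.add(handler);
      return () => listeners.delete(handler); // unsubscribe function
    },
    start(intervalMillis) {
      this.intervalMillis = intervalMillis;
    },
    stop() {
      this.intervalMillis = null;
    },
    // Test hook: simulate a native detection event.
    emit(detectedActivities) {
      listeners.forEach(fn => fn(detectedActivities));
    },
  };
}

const ActivityRecognition = createActivityRecognitionStub();

let mostProbableActivity = null;
const unsubscribe = ActivityRecognition.subscribe(detectedActivities => {
  mostProbableActivity = detectedActivities.sorted[0];
});

ActivityRecognition.start(1000);
ActivityRecognition.emit({ sorted: [{ type: 'STILL', confidence: 77 }] });

// Stop detection and remove the listener, as in the class-based example.
ActivityRecognition.stop();
unsubscribe();
```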

Hooks based implementation

import { useEffect } from 'react'
import ActivityRecognition from 'react-native-activity-recognition'

...

// Subscribe to updates on mount

  useEffect(() => {
    const unsubscribe = ActivityRecognition.subscribe(detectedActivities => {
      const mostProbableActivity = detectedActivities.sorted[0];
      console.log(mostProbableActivity);
    });

    // Stop activity detection and remove the listener on unmount
    return () => {
      ActivityRecognition.stop();
      unsubscribe();
    };
  }, []);

...

// Start activity detection

const detectionIntervalMillis = 1000
ActivityRecognition.start(detectionIntervalMillis)

Android

detectedActivities is an object with a key for each detected activity, each of which has an integer percentage (0-100) indicating the likelihood that the user is performing this activity. For example:

{
  ON_FOOT: 8,
  IN_VEHICLE: 15,
  WALKING: 8,
  STILL: 77
}

Additionally, the detectedActivities.sorted getter is provided which returns an array of activities, ordered by their confidence value:

[
  { type: 'STILL', confidence: 77 },
  { type: 'IN_VEHICLE', confidence: 15 },
  { type: 'ON_FOOT', confidence: 8 },
  { type: 'WALKING', confidence: 8 },
]

Because the activities are sorted by confidence level, the first value will be the one with the highest probability. Note that ON_FOOT and WALKING are related but won't always have the same value. I have never seen WALKING with a higher confidence than ON_FOOT, but WALKING may come before ON_FOOT in the array if they have the same value.
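The sorted getter behaves as if it flattened the confidence map into a descending array. A rough sketch of the equivalent logic (illustrative only, not the library's actual implementation):

```javascript
// Turn a { TYPE: confidence } map into an array of { type, confidence }
// objects sorted by confidence, descending. Ties (e.g. ON_FOOT vs WALKING
// both at 8) have no guaranteed relative order.
function sortedActivities(detectedActivities) {
  return Object.entries(detectedActivities)
    .map(([type, confidence]) => ({ type, confidence }))
    .sort((a, b) => b.confidence - a.confidence);
}

const result = sortedActivities({ ON_FOOT: 8, IN_VEHICLE: 15, WALKING: 8, STILL: 77 });
// result[0] is { type: 'STILL', confidence: 77 }
```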

The following activity types are supported:

  • IN_VEHICLE
  • ON_BICYCLE
  • ON_FOOT
  • RUNNING
  • WALKING
  • STILL
  • TILTING
  • UNKNOWN

iOS

detectedActivities is an object keyed by the detected activity, with a confidence value for that activity given by CMMotionActivityManager (0 = low, 1 = medium, 2 = high). For example:

{
    WALKING: 2
}

The detectedActivities.sorted getter returns the same data in the form of an array:

[
    {type: "WALKING", confidence: 2}
]

The following activity types are supported:

  • RUNNING
  • WALKING
  • STATIONARY
  • AUTOMOTIVE
  • CYCLING
  • UNKNOWN

Credits / prior art

The following projects were very helpful in developing this library:
