
EduSense: Practical Classroom Sensing at Scale

hero image

EduSense is the first real-time, practically deployable classroom sensing system evaluated in the wild at scale. It produces a wide range of theoretically motivated visual and audio features correlated with effective instruction.

Our getting-started guide is a good entry point if you are interested in building, developing, or deploying EduSense. More information about the team can be found on the EduSense website.

News

  • Oct 2019 We open-source our EduSense code!
  • Sep 2019 We presented our paper, "EduSense: Practical Classroom Sensing at Scale", at UbiComp '19.

Features for Students and Instructors

features

  • Visual Features:
    • Body Segmentation, Keypoints and Inter-frame tracking:
      • Hand Raise Detection
      • Upper Body Pose Estimation
      • Sit vs Stand Detection
      • Synthetic Accelerometer
      • Classroom Topology
    • Facial Landmarks and Attributes:
      • Smile Detection
      • Mouth State Detection
      • Gaze Estimation
  • Audio Features:
    • Speech Detection:
      • Student vs Instructor Speech
      • Speech Act Delimitation
  • Classroom Digital Twins
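Several of the visual features above are derived from per-frame body keypoints. As a minimal sketch of how a feature like hand-raise detection can be computed from pose output, the snippet below flags a raise when a wrist appears above the corresponding shoulder in image coordinates. This is an illustrative heuristic, not EduSense's actual classifier, and the joint names and dictionary format are assumptions rather than the project's real keypoint schema.

```python
# Hedged sketch: hand-raise detection from 2D body keypoints.
# Keypoints are (x, y) pixel coordinates with y increasing downward.
# The joint names below are illustrative, not EduSense's schema.

def is_hand_raised(kp, margin=0.0):
    """Flag a raise when either wrist sits above its shoulder by `margin` pixels."""
    for side in ("left", "right"):
        wrist = kp.get(f"{side}_wrist")
        shoulder = kp.get(f"{side}_shoulder")
        if wrist is None or shoulder is None:
            continue  # joint not detected in this frame
        if wrist[1] < shoulder[1] - margin:  # smaller y = higher in the image
            return True
    return False

raised = {"left_wrist": (120, 80), "left_shoulder": (110, 200),
          "right_wrist": (300, 260), "right_shoulder": (290, 210)}
lowered = {"left_wrist": (120, 320), "left_shoulder": (110, 200)}
print(is_hand_raised(raised))   # True
print(is_hand_raised(lowered))  # False
```

A production pipeline would additionally smooth the decision over several frames of inter-frame tracking to suppress spurious single-frame detections.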

Visualization Dashboard

viz dashboard

System Architecture

system architecture

Related Links

Citation

Karan Ahuja, Dohyun Kim, Franceska Xhakaj, Virag Varga, Anne Xie, Stanley Zhang, Jay Eric Townsend, Chris Harrison, Amy Ogan, and Yuvraj Agarwal. 2019. EduSense: Practical Classroom Sensing at Scale. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 3, 3, Article 71 (September 2019), 26 pages. DOI: https://doi.org/10.1145/3351229

@article{Ahuja:2019:EPC:3361560.3351229,
 author = {Ahuja, Karan and Kim, Dohyun and Xhakaj, Franceska and Varga, Virag and Xie, Anne and Zhang, Stanley and Townsend, Jay Eric and Harrison, Chris and Ogan, Amy and Agarwal, Yuvraj},
 title = {EduSense: Practical Classroom Sensing at Scale},
 journal = {Proc. ACM Interact. Mob. Wearable Ubiquitous Technol.},
 issue_date = {September 2019},
 volume = {3},
 number = {3},
 month = sep,
 year = {2019},
 issn = {2474-9567},
 pages = {71:1--71:26},
 articleno = {71},
 numpages = {26},
 url = {http://doi.acm.org/10.1145/3351229},
 doi = {10.1145/3351229},
 acmid = {3351229},
 publisher = {ACM},
 address = {New York, NY, USA},
 keywords = {Audio, Classroom, Computer Vision, Instructor, Machine Learning, Pedagogy, Sensing, Speech Detection, Teacher},
}

Karan Ahuja, Deval Shah, Sujeath Pareddy, Franceska Xhakaj, Amy Ogan, Yuvraj Agarwal, and Chris Harrison. 2021. Classroom Digital Twins with Instrumentation-Free Gaze Tracking. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems (CHI '21). Association for Computing Machinery, New York, NY, USA, Article 484, 1–9. DOI:https://doi.org/10.1145/3411764.3445711

@inproceedings{10.1145/3411764.3445711,
author = {Ahuja, Karan and Shah, Deval and Pareddy, Sujeath and Xhakaj, Franceska and Ogan, Amy and Agarwal, Yuvraj and Harrison, Chris},
title = {Classroom Digital Twins with Instrumentation-Free Gaze Tracking},
year = {2021},
isbn = {9781450380966},
publisher = {Association for Computing Machinery},
address = {New York, NY, USA},
url = {https://doi.org/10.1145/3411764.3445711},
doi = {10.1145/3411764.3445711},
articleno = {484},
numpages = {9},
keywords = {digital twins, classroom sensing, gaze tracking},
location = {Yokohama, Japan},
series = {CHI '21}
}

License

The source code in this directory and its subdirectories is governed by the BSD 3-Clause License unless otherwise noted in the source code. Once compiled or packaged, it is the user's responsibility to ensure that any use of the resulting binary or image complies with all relevant licenses for the software packaged together.