
aws-samples / amazon-rekognition-engagement-meter

Licence: other
The Engagement Meter calculates and shows engagement levels of an audience participating in a meeting

Programming Languages

JavaScript
184084 projects - #8 most used programming language
HTML
75241 projects
CSS
56736 projects

Projects that are alternatives to or similar to amazon-rekognition-engagement-meter

Amazon Rekognition Video Analyzer
A working prototype for capturing frames from a live MJPEG video stream, identifying objects in near real time using deep learning, and triggering actions based on an object watch list.
Stars: ✭ 309 (+530.61%)
Mutual labels:  amazon-web-services, image-analysis
terraform-aws-s3-bucket
A Terraform module to create a Simple Storage Service (S3) Bucket on Amazon Web Services (AWS). https://aws.amazon.com/s3/
Stars: ✭ 47 (-4.08%)
Mutual labels:  amazon-web-services
face-attendence
Face Attendance (AWS rekognition)
Stars: ✭ 39 (-20.41%)
Mutual labels:  amazon-web-services
kesci-urdu-sentiment-analysis
sentiment-analysis
Stars: ✭ 70 (+42.86%)
Mutual labels:  sentiment-analysis
oracdc
Oracle database CDC (Change Data Capture)
Stars: ✭ 51 (+4.08%)
Mutual labels:  amazon-web-services
vista-net
Code for the paper "VistaNet: Visual Aspect Attention Network for Multimodal Sentiment Analysis", AAAI'19
Stars: ✭ 67 (+36.73%)
Mutual labels:  sentiment-analysis
Stock-Prediction
LSTM RNN for sentiment-based stock prediction
Stars: ✭ 50 (+2.04%)
Mutual labels:  sentiment-analysis
image-checker
Provides image optimisation information within the browser
Stars: ✭ 14 (-71.43%)
Mutual labels:  image-analysis
sentiment-thermometer
Measure the sentiment towards a word, name or sentence on social networks
Stars: ✭ 56 (+14.29%)
Mutual labels:  sentiment-analysis
objectfit-focalpoint
Generate the object-position value to capture an image's focal point given a custom aspect-ratio.
Stars: ✭ 14 (-71.43%)
Mutual labels:  image-analysis
billboard
🎤 Lyrics/associated NLP data for Billboard's Top 100, 1950-2015.
Stars: ✭ 53 (+8.16%)
Mutual labels:  sentiment-analysis
Persian-Sentiment-Analyzer
Persian sentiment analysis ( آناکاوی سهش های فارسی | تحلیل احساسات فارسی )
Stars: ✭ 30 (-38.78%)
Mutual labels:  sentiment-analysis
DeepSentiPers
Repository for the experiments described in the paper named "DeepSentiPers: Novel Deep Learning Models Trained Over Proposed Augmented Persian Sentiment Corpus"
Stars: ✭ 17 (-65.31%)
Mutual labels:  sentiment-analysis
Movie-Recommendation-System-with-Sentiment-Analysis
Content based movie recommendation system with sentiment analysis
Stars: ✭ 44 (-10.2%)
Mutual labels:  sentiment-analysis
pyslide
Digital Pathology Whole Slide Image Analysis Toolbox
Stars: ✭ 38 (-22.45%)
Mutual labels:  image-analysis
FinBERT
A Pretrained BERT Model for Financial Communications. https://arxiv.org/abs/2006.08097
Stars: ✭ 193 (+293.88%)
Mutual labels:  sentiment-analysis
serverless-data-pipeline-sam
Serverless Data Pipeline powered by Kinesis Firehose, API Gateway, Lambda, S3, and Athena
Stars: ✭ 78 (+59.18%)
Mutual labels:  amazon-web-services
Sentiment-Analysis-Play-Store-Reviews
Sentiment Analyser: Sentiment Analysis of user feedback in Play Store to improve app quality!
Stars: ✭ 22 (-55.1%)
Mutual labels:  sentiment-analysis
ML2017FALL
Machine Learning (EE 5184) in NTU
Stars: ✭ 66 (+34.69%)
Mutual labels:  sentiment-analysis
TwEater
A Python Bot for Scraping Conversations from Twitter
Stars: ✭ 16 (-67.35%)
Mutual labels:  sentiment-analysis

Amazon Rekognition Engagement Meter

The Engagement Meter is a web application that calculates and displays engagement levels of an audience observed by a webcam. It can also recognize attendees by associating their faces with individual user profiles.

Index

  • Architecture
  • Usage
  • Accessing the Application
  • Remove the application
  • Making changes to the code and customization
  • Contributing
  • License Summary

Architecture

The Engagement Meter uses Amazon Rekognition for image and sentiment analysis, Amazon DynamoDB for storage, Amazon API Gateway and Amazon Cognito for the API, and Amazon S3, AWS Amplify, and React for the front-end layer.

Architecture Diagram

There are three main user flows:

  • the "add user" flow (A) is triggered when clicking the "Add user" button
  • the "added users recognition" flow (B) and the "sentiment analysis" flow (C) are both triggered when clicking the "Start Rekognition" button and repeat until the "Stop Rekognition" button is clicked.

The diagram below represents the API calls performed by Amplify, which takes care of authenticating all the calls to the API Gateway using Cognito.

User flow
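Although the exact wiring lives in the project's source, the pattern is the standard Amplify one: a Cognito identity pool supplies credentials, and the REST endpoint is registered so that Amplify signs each request before it reaches the API Gateway. A minimal sketch follows; the endpoint name, URL, region, and pool id are placeholders (not values from this project), and the shape of the configuration object varies with the Amplify version (this follows the classic aws-amplify v3 style):

```js
// Minimal Amplify configuration sketch: Cognito supplies the credentials that
// Amplify uses to sign every request sent to the API Gateway endpoint.
// All identifiers below are placeholders, not values from this repository.
import Amplify from "aws-amplify";

Amplify.configure({
  Auth: {
    identityPoolId: "us-east-1:00000000-0000-0000-0000-000000000000", // placeholder
    region: "us-east-1",
  },
  API: {
    endpoints: [
      {
        name: "RekogDemoApi", // hypothetical endpoint name reused in the sketches below
        endpoint: "https://example.execute-api.us-east-1.amazonaws.com/prod", // placeholder
        region: "us-east-1",
      },
    ],
  },
});
```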

The "add user" flow (A)

Amplify makes a POST /faces/add request to the API Gateway, including the uploaded picture and an autogenerated unique identifier (known as ExternalImageId); the API Gateway then calls the IndexFaces action in Amazon Rekognition. After that, Amplify makes a POST /people request to the API Gateway, including the ExternalImageId and some extra metadata (Name and Job Title); the API Gateway then writes that data to the Faces table in Amazon DynamoDB. To learn more about IndexFaces, see the Rekognition documentation.
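As a rough illustration of what flow A looks like from the front end, here is a hedged sketch using Amplify's API module. The endpoint name ("RekogDemoApi"), the request body shapes, and the helper function are illustrative assumptions, not the project's actual code:

```js
// Sketch of flow A: index the uploaded picture, then store the user's metadata.
// Endpoint name and body shapes are assumptions for illustration only.
import { API } from "aws-amplify";
import { v4 as uuid } from "uuid";

async function addUser(imageBase64, name, jobTitle) {
  const externalImageId = uuid(); // autogenerated unique identifier

  // POST /faces/add -> API Gateway -> Rekognition IndexFaces
  await API.post("RekogDemoApi", "/faces/add", {
    body: { image: imageBase64, externalImageId },
  });

  // POST /people -> API Gateway -> DynamoDB Faces table
  await API.post("RekogDemoApi", "/people", {
    body: { externalImageId, name, jobTitle },
  });

  return externalImageId;
}
```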

The "added users recognition" flow (B)

Amplify makes a GET /people request to the API Gateway, which then queries the Faces table in Amazon DynamoDB. If any people have been registered, Amplify makes another call to POST /faces/search, including a screenshot captured from the webcam; the API Gateway then calls the SearchFacesByImage action in Amazon Rekognition. If any previously registered person is recognized, the service provides details about the matches, including each face's coordinates and confidence, and the UI displays a welcome message showing the recognized users' names. To learn more about SearchFacesByImage, see the Rekognition documentation.
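A similar hedged sketch of flow B follows; the response field names (faceMatches, externalImageId, and so on) are assumptions chosen to mirror the SearchFacesByImage response, not the project's actual contract:

```js
// Sketch of flow B: list registered people, then search the current webcam
// frame against the Rekognition collection. Field names are illustrative.
import { API } from "aws-amplify";

async function recognizeRegisteredUsers(frameBase64) {
  const people = await API.get("RekogDemoApi", "/people", {});
  if (!people || people.length === 0) return [];

  // POST /faces/search -> API Gateway -> Rekognition SearchFacesByImage
  const result = await API.post("RekogDemoApi", "/faces/search", {
    body: { image: frameBase64 },
  });

  // Join the face matches with the stored metadata so the UI can greet
  // recognized users by name.
  return (result.faceMatches || []).map((match) => ({
    confidence: match.confidence,
    boundingBox: match.boundingBox,
    person: people.find((p) => p.externalImageId === match.externalImageId),
  }));
}
```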

The "sentiment analysis" flow (C)

Amplify makes two parallel calls to the API Gateway (represented sequentially here for simplicity). First, Amplify makes a POST /faces/detect request with a screenshot captured from the webcam to the API Gateway, which then calls the DetectFaces action in Amazon Rekognition. If any face is detected, the service provides details about the matches, including physical characteristics and sentiments, and a brief summary is shown in the UI for each recognized person. Amplify then makes a POST /engagement request with some of the recognized sentiments (Angry, Calm, Happy, Sad, Surprised) to the API Gateway, which writes that data to the Sentiment table in DynamoDB. In parallel, Amplify makes a GET /engagement request to the API Gateway, which then queries the Sentiment table in DynamoDB to retrieve an aggregate of all the sentiments recorded during the last hour, in order to calibrate and draw the meter. To learn more about DetectFaces, see the Rekognition documentation.
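The same caveats apply to this sketch of flow C: the endpoint name, field names, and aggregation shape are assumptions used only to make the sequence of calls concrete:

```js
// Sketch of flow C: detect faces and emotions, record the tracked sentiments,
// and read back the last hour's aggregate to drive the meter.
import { API } from "aws-amplify";

const TRACKED_SENTIMENTS = ["ANGRY", "CALM", "HAPPY", "SAD", "SURPRISED"];

async function updateEngagement(frameBase64) {
  // POST /faces/detect -> API Gateway -> Rekognition DetectFaces
  const detection = await API.post("RekogDemoApi", "/faces/detect", {
    body: { image: frameBase64 },
  });

  const faces = detection.faceDetails || [];
  if (faces.length > 0) {
    // Keep only the sentiments the meter tracks and record them.
    const sentiments = faces.flatMap((face) =>
      (face.emotions || []).filter((e) => TRACKED_SENTIMENTS.includes(e.type))
    );
    // POST /engagement -> API Gateway -> DynamoDB Sentiment table
    await API.post("RekogDemoApi", "/engagement", { body: { sentiments } });
  }

  // GET /engagement -> aggregate of the last hour, used to calibrate the meter
  return API.get("RekogDemoApi", "/engagement", {});
}
```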

Usage

Prerequisites

To deploy the sample application you will require an AWS account. If you don’t already have an AWS account, create one at https://aws.amazon.com by following the on-screen instructions. Your access to the AWS account must have IAM permissions to launch AWS CloudFormation templates that create IAM roles.

To use the sample application you will require a modern browser and a webcam.

Deployment

The demo application is deployed as an AWS CloudFormation template.

Note
You are responsible for the cost of the AWS services used while running this sample deployment. There is no additional cost for using this sample. For full details, see the pricing pages for each AWS service you will be using in this sample. Prices are subject to change.

  1. Deploy the latest CloudFormation template by following the link below for your preferred AWS region (a scripted alternative using the AWS SDK for JavaScript is sketched after these steps):

    • US East (N. Virginia) (us-east-1): Launch the EngagementMeter Stack with CloudFormation
    • US East (Ohio) (us-east-2): Launch the EngagementMeter Stack with CloudFormation
    • US West (Oregon) (us-west-2): Launch the EngagementMeter Stack with CloudFormation
    • Asia Pacific (Seoul) (ap-northeast-2): Launch the EngagementMeter Stack with CloudFormation
    • Asia Pacific (Sydney) (ap-southeast-2): Launch the EngagementMeter Stack with CloudFormation
    • Asia Pacific (Tokyo) (ap-northeast-1): Launch the EngagementMeter Stack with CloudFormation
    • EU (Ireland) (eu-west-1): Launch the EngagementMeter Stack with CloudFormation
  2. If prompted, log in using your AWS account credentials.

  3. You should see a screen titled "Create Stack" at the "Specify template" step. The fields specifying the CloudFormation template are pre-populated. Click the Next button at the bottom of the page.

  4. On the "Specify stack details" screen you may customize the following parameters of the CloudFormation stack:

    • Stack Name: (Default: EngagementMeter) This is the name that is used to refer to this stack in CloudFormation once deployed. The value must be 15 characters or less.
    • CollectionId: (Default: RekogDemo) AWS Resources are named based on the value of this parameter. You must customise this if you are launching more than one instance of the stack within the same account.
    • CreateCloudFrontDistribution (Default: false) Creates a CloudFront distribution for accessing the web interface of the demo. This must be enabled if S3 Block Public Access is enabled at an account level. Note: Creating a CloudFront distribution may significantly increase the deploy time (from approximately 5 minutes to over 30 minutes)

    When completed, click Next

  5. Configure stack options if desired, then click Next.

  6. On the review screen, you must check the boxes for:

    • "I acknowledge that AWS CloudFormation might create IAM resources"
    • "I acknowledge that AWS CloudFormation might create IAM resources with custom names"

    These are required so that CloudFormation can create an IAM role granting access to the resources the stack needs and name those resources dynamically.

  7. Click Create Change Set.

  8. On the Change Set screen, click Execute to launch your stack.

    • You may need to wait for the Execution status of the change set to become "AVAILABLE" before the "Execute" button becomes available.
  9. Wait for the CloudFormation stack to launch. Completion is indicated when the "Stack status" is "CREATE_COMPLETE".

    • You can monitor the stack creation progress in the "Events" tab.
  10. Note the URL displayed in the Outputs tab for the stack. This is used to access the application.
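For those who prefer to script the deployment instead of clicking through the console, the steps above roughly correspond to a single CreateStack call. The sketch below uses the AWS SDK for JavaScript v3; the template URL is a placeholder (use the template behind the launch link for your region), and the parameter values are the documented defaults:

```js
// Hedged sketch of a scripted deployment equivalent to the console steps above.
// The TemplateURL is a placeholder; the parameters are the documented defaults.
import {
  CloudFormationClient,
  CreateStackCommand,
} from "@aws-sdk/client-cloudformation";

const client = new CloudFormationClient({ region: "us-east-1" });

await client.send(
  new CreateStackCommand({
    StackName: "EngagementMeter",
    TemplateURL: "https://example-bucket.s3.amazonaws.com/engagement-meter.template", // placeholder
    Parameters: [
      { ParameterKey: "CollectionId", ParameterValue: "RekogDemo" },
      { ParameterKey: "CreateCloudFrontDistribution", ParameterValue: "false" },
    ],
    // Equivalent to ticking the two IAM acknowledgement boxes in the console.
    Capabilities: ["CAPABILITY_IAM", "CAPABILITY_NAMED_IAM"],
  })
);
```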

Accessing the Application

The application is accessed using a web browser. The address is the URL output from the CloudFormation stack created during the Deployment steps.

  • When accessing the application, the browser will ask for permission to use your camera. You must click "Allow" for the application to work (see the sketch after this list).
  • Click "Add a new user" if you wish to add new profiles.
  • Click "Start Rekognition" to start the engine. The app will start displaying information about the recognized faces and will calibrate the meter.

Remove the application

To remove the application, open the AWS CloudFormation console, select the Engagement Meter stack, and choose "Delete Stack". The stack will take some time to delete; you can track its progress in the "Events" tab. When it is done, the status will change from "DELETE_IN_PROGRESS" to "DELETE_COMPLETE", and the stack will disappear from the list.

Making changes to the code and customization

The contributing guidelines contain instructions on how to run the front end locally and make changes to the back-end stack.

Contributing

Contributions are more than welcome. Please read the code of conduct and the contributing guidelines.

License Summary

This sample code is made available under a modified MIT license. See the LICENSE file.

Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].