data-at-hand-mobile

License: MIT
Mobile application for exploring fitness data using both speech and touch interaction.

Programming Languages

TypeScript, Swift, Java, Objective-C, JavaScript, Ruby, Starlark

Projects that are alternatives of or similar to data-at-hand-mobile

IoT-iBeacon
An Ionic app for indoor localization and navigation using BLE iBeacons.
Stars: ✭ 39 (-22%)
Mutual labels:  navigation, mobile-app
organicmaps
🍃 Organic Maps is a free Android & iOS offline maps app for travelers, tourists, hikers, and cyclists. It uses crowd-sourced OpenStreetMap data and is developed with love by MapsWithMe (MapsMe) founders and our community. No ads, no tracking, no data collection, no crapware. Your donations and positive reviews motivate and inspire our small team!
Stars: ✭ 3,689 (+7278%)
Mutual labels:  navigation, mobile-app
Router-deprecated
🛣 Simple Navigation for iOS - ⚠️ Deprecated
Stars: ✭ 458 (+816%)
Mutual labels:  navigation
laravel-breadcrumbs
Simple breadcrumbs package for your Laravel project.
Stars: ✭ 23 (-54%)
Mutual labels:  navigation
MajorDomo-Scenarios
Scenarios for the Majordomo home automation system
Stars: ✭ 12 (-76%)
Mutual labels:  speech
angular-translate-loader
"angular-translate" loader for webpack
Stars: ✭ 15 (-70%)
Mutual labels:  fitbit
AdaSpeech
AdaSpeech: Adaptive Text to Speech for Custom Voice
Stars: ✭ 108 (+116%)
Mutual labels:  speech
navigator.lua
Source code analysis & navigation plugin for Neovim. Navigate codes like a breeze🎐. Exploring LSP and 🌲Treesitter symbols a piece of 🍰. Take control like a boss 🦍.
Stars: ✭ 781 (+1462%)
Mutual labels:  navigation
nice intro
Get your users to know your app with ease
Stars: ✭ 17 (-66%)
Mutual labels:  mobile-app
sdk-design-assets
Downloadable design assets for the Fitbit SDK.
Stars: ✭ 75 (+50%)
Mutual labels:  fitbit
voice-based-email-for-blind
Emailing System for visually impaired persons
Stars: ✭ 35 (-30%)
Mutual labels:  speech
aframe-speech-controls-component
alternative form of inputs for in-VR interaction with the content of a scene
Stars: ✭ 13 (-74%)
Mutual labels:  speech
PhotoFeed
🛵 Instagram in Swift 4
Stars: ✭ 43 (-14%)
Mutual labels:  navigation
angular-sticky-navigation-directive
Angular directive to make a sticky element, quick demo here: http://ng-milk.github.io/angular-sticky-navigation-directive/
Stars: ✭ 20 (-60%)
Mutual labels:  navigation
flutter redux navigation
Navigation Middleware for Flutter's redux library.
Stars: ✭ 43 (-14%)
Mutual labels:  navigation
android-clean-code
Writing Clean Code in Android
Stars: ✭ 22 (-56%)
Mutual labels:  mobile-app
navigator
🧿 Build navigation or menu for Laravel and Awes.io. Unlimited complexity and depth, with permissions and sorting support.
Stars: ✭ 47 (-6%)
Mutual labels:  navigation
sink
Verify that you're spending more than you can afford
Stars: ✭ 78 (+56%)
Mutual labels:  mobile-app
CVC
CVC: Contrastive Learning for Non-parallel Voice Conversion (INTERSPEECH 2021, in PyTorch)
Stars: ✭ 45 (-10%)
Mutual labels:  speech
nextjs-breadcrumbs
A dynamic, highly customizable breadcrumbs component for Next.js
Stars: ✭ 70 (+40%)
Mutual labels:  navigation

Data@Hand

Data@Hand is a cross-platform smartphone app that facilitates visual data exploration leveraging both speech and touch interactions. Data visualization is a common way that mobile health apps enable people to explore their data on smartphones. However, due to smartphones’ limitations such as small screen size and lack of precise pointing input, they provide limited support for visual data exploration, with over-simplified time navigation, even though time is a primary dimension of self-tracking data. Data@Hand leverages the synergy of speech and touch: speech-based interaction takes little screen space, and natural language is flexible enough to cover different ways of specifying dates and their ranges (e.g., “October 7th”, “Last Sunday”, “This month”). Currently, Data@Hand supports displaying Fitbit data (e.g., step count, heart rate, sleep, and weight) for navigation and temporal comparison tasks.

For more information about this project, please visit https://data-at-hand.github.io.

Related Research Paper (Describes the design and a user study)

Data@Hand: Fostering Visual Exploration of Personal Data on Smartphones Leveraging Speech and Touch Interaction
[Best Paper Honorable Mention Award]
Young-Ho Kim, Bongshin Lee, Arjun Srinivasan, and Eun Kyoung Choe
ACM CHI 2021 (PDF)

How to build & run

System Overview

Data@Hand is a stand-alone application that does not require a backend server. The app communicates with the Fitbit server and fetches the data locally on the device.

Acquire Fitbit API Key

  1. Register an app on the Fitbit developer page https://dev.fitbit.com/apps/new.

    1. Select Client for OAuth 2.0 Application Type.
    2. Use a URL similar to edu.umd.hcil.data-at-hand://oauth2/redirect for Callback URL. This URL will be used locally on your device.
  2. Data@Hand leverages Fitbit's Intraday API, for which you must explicitly obtain approval from Fitbit: https://dev.fitbit.com/build/reference/web-api/intraday-requests/.

  3. In the credentials directory of the repository, copy fitbit.example.json and rename it to fitbit.json.

  4. Fill in the information accordingly. You can find it under Manage My Apps on the Fitbit developer page.

{
  "client_id": "YOUR_FITBIT_ID", // <- OAuth 2.0 Client ID 
  "client_secret": "YOUR_FITBIT_SECRET", // <- Client Secret
  "redirect_uri": "YOUR_REDIRECT_URI" // <- Callback URL
}
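To illustrate how these credentials come together in the OAuth 2.0 flow, here is a minimal TypeScript sketch that builds Fitbit's authorization URL from a fitbit.json-shaped object. The endpoint and parameter names follow Fitbit's documented OAuth 2.0 Authorization Code flow; the `buildAuthorizeUrl` helper and the scope list are illustrative, not part of the Data@Hand codebase.

```typescript
// Shape of the fitbit.json credentials file described above.
interface FitbitCredentials {
  client_id: string;
  client_secret: string;
  redirect_uri: string;
}

// Build the Fitbit OAuth 2.0 authorization URL the app would open
// in a browser to let the user grant access.
function buildAuthorizeUrl(cred: FitbitCredentials, scopes: string[]): string {
  const params = new URLSearchParams({
    response_type: "code",
    client_id: cred.client_id,
    redirect_uri: cred.redirect_uri,
    scope: scopes.join(" "),
  });
  return `https://www.fitbit.com/oauth2/authorize?${params.toString()}`;
}

// Example usage with the placeholder values from fitbit.example.json:
const url = buildAuthorizeUrl(
  {
    client_id: "YOUR_FITBIT_ID",
    client_secret: "YOUR_FITBIT_SECRET",
    redirect_uri: "edu.umd.hcil.data-at-hand://oauth2/redirect",
  },
  ["activity", "heartrate", "sleep", "weight"],
);
```

After the user approves, Fitbit redirects to the custom-scheme Callback URL registered in step 1, which the app intercepts on the device.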

(Android Only) Acquire Microsoft Cognitive Speech API Key

  1. Register a Microsoft Cognitive Services Speech-to-Text resource (a free tier is available): https://azure.microsoft.com/en-us/services/cognitive-services/speech-to-text/.
  2. In the credentials directory of the repository, copy microsoft_cognitive_service_speech.example.json and rename it to microsoft_cognitive_service_speech.json.
  3. Fill in the information accordingly. You need the subscription ID and the region.
{
  "subscriptionId": "YOUR_SUBSCRIPTION_ID",
  "region": "YOUR_AZURE_REGION" // <- The region you selected, e.g., "eastus"
}
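The region value determines which regional Azure endpoint the app talks to. As a small sketch (assuming Azure's conventional regional token-endpoint pattern; the interface and function names here are illustrative):

```typescript
// Shape of the microsoft_cognitive_service_speech.json file above.
interface SpeechCredentials {
  subscriptionId: string;
  region: string;
}

// Derive the regional token endpoint used to exchange the subscription
// key for a short-lived access token.
function tokenEndpoint(cred: SpeechCredentials): string {
  return `https://${cred.region}.api.cognitive.microsoft.com/sts/v1.0/issueToken`;
}

const endpoint = tokenEndpoint({
  subscriptionId: "YOUR_SUBSCRIPTION_ID",
  region: "eastus",
});
// endpoint → "https://eastus.api.cognitive.microsoft.com/sts/v1.0/issueToken"
```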

(Optional) If you want to track exceptions, register Bugsnag.

  1. Create a Bugsnag project and get the API key: https://www.bugsnag.com/.
  2. In the credentials directory of the repository, copy bugsnag.example.json and rename it to bugsnag.json.
  3. Fill in the API key accordingly.
{
  "api_key": "YOUR_BUGSNAG_API_KEY"
}
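Since all three credential files share the same flat JSON shape, a simple startup check can catch a missing or empty field before the app fails mid-flow. A hypothetical helper (not part of the Data@Hand codebase; file and key names match the examples above):

```typescript
// Return the names of required keys that are absent or empty in a
// parsed credentials object.
function missingKeys(
  obj: Record<string, unknown>,
  required: string[],
): string[] {
  return required.filter((k) => typeof obj[k] !== "string" || obj[k] === "");
}

// Example: a fitbit.json with an empty client_secret.
const fitbit = { client_id: "id", client_secret: "", redirect_uri: "uri" };
const problems = missingKeys(fitbit, [
  "client_id",
  "client_secret",
  "redirect_uri",
]);
// problems → ["client_secret"]
```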

Compile Data@Hand

Install Node.js on your system.

Install the React Native CLI:

> npm install -g @react-native-community/cli

Install dependencies (in the repository root, where package.json is located):

> npm i

Run on iOS:

If you have not used Cocoapods before, install it once:

> sudo gem install cocoapods

Install iOS project dependencies:

> cd ios
> pod install

Run on iOS:

> react-native run-ios

Run on Android:

> react-native run-android

Third-party Services Used


Research Team Members

Young-Ho Kim (Website)
Postdoctoral Associate
University of Maryland, College Park
*Contact for code and implementation

Bongshin Lee (Website)
Sr. Principal Researcher
Microsoft Research

Arjun Srinivasan (Website)
Research Scientist
Tableau Research
*Arjun did this work while at Georgia Institute of Technology

Eun Kyoung Choe (Website)
Associate Professor
University of Maryland, College Park


Acknowledgment

This work was in part supported by National Science Foundation award #1753452 (CAREER: Advancing Personal Informatics through Semi-Automated and Collaborative Tracking).


License

Source Code

MIT License

Original Design Resources including Logos and Assets

CC BY 4.0
