
mida-project / eye-tracker-setup

License: MIT
👀 Tobii Eye Tracker 4C Setup

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to eye-tracker-setup

TobiiGlassesPyController
Tobii Pro Glasses 2 Python controller
Stars: ✭ 42 (+75%)
Mutual labels:  eye-tracking, tobii-eye-tracker
monai-deploy
MONAI Deploy aims to become the de-facto standard for developing, packaging, testing, deploying and running medical AI applications in clinical production.
Stars: ✭ 56 (+133.33%)
Mutual labels:  medical-imaging, radiology
Optikey
OptiKey - Full computer control and speech with your eyes
Stars: ✭ 3,906 (+16175%)
Mutual labels:  eye-tracking, eye-tracker
CheXbert
Combining Automatic Labelers and Expert Annotations for Accurate Radiology Report Labeling Using BERT
Stars: ✭ 51 (+112.5%)
Mutual labels:  medical-imaging, radiology
GMIC
An interpretable classifier for high-resolution breast cancer screening images utilizing weakly supervised localization
Stars: ✭ 106 (+341.67%)
Mutual labels:  medical-imaging, breast-cancer
mammography metarepository
Meta-repository of screening mammography classifiers
Stars: ✭ 44 (+83.33%)
Mutual labels:  medical-imaging, breast-cancer
chr247.com
An open source multi tenant cloud platform for small scale clinics
Stars: ✭ 56 (+133.33%)
Mutual labels:  doctor
Weeping-Angels
Minecraft Mod - Adds the terrifying Weeping Angels to the Game, Minecraft. Don't Blink!
Stars: ✭ 22 (-8.33%)
Mutual labels:  doctor
FusionMouse
Combines Tobii eye tracking with TrackIR head tracking for a fast hands-free mouse replacement, in Rust!
Stars: ✭ 33 (+37.5%)
Mutual labels:  eye-tracking
Dicom-Viewer
An application displaying 2D/3D Dicom
Stars: ✭ 37 (+54.17%)
Mutual labels:  medical-imaging
covid19.MIScnn
Robust Chest CT Image Segmentation of COVID-19 Lung Infection based on limited data
Stars: ✭ 77 (+220.83%)
Mutual labels:  medical-imaging
3d-prostate-segmentation
Segmentation of prostate from MRI scans
Stars: ✭ 36 (+50%)
Mutual labels:  medical-imaging
MouseView.js
Attentional mouse tracking. Alternative to online eye tracking. Eye tracking without the eyes!
Stars: ✭ 46 (+91.67%)
Mutual labels:  eye-tracking
AIML-in-Medicine-club
Repository for "AI/MD in Medicine" club in clubhouse
Stars: ✭ 42 (+75%)
Mutual labels:  medical-imaging
unet-pytorch
This is the example implementation of UNet model for semantic segmentations
Stars: ✭ 17 (-29.17%)
Mutual labels:  medical-imaging
FCN-CTSCAN
A small TensorFlow project created to test some machine learning problems
Stars: ✭ 17 (-29.17%)
Mutual labels:  medical-imaging
Brainy
Brainy is a virtual MRI analyzer. Just upload the MRI scan file and get 3 different classes of tumors detected and segmented. In Beta.
Stars: ✭ 29 (+20.83%)
Mutual labels:  medical-imaging
rt-utils
A minimal Python library to facilitate the creation and manipulation of DICOM RTStructs.
Stars: ✭ 89 (+270.83%)
Mutual labels:  medical-imaging
chestViewSplit
Automatically split the chest x-ray into two views
Stars: ✭ 17 (-29.17%)
Mutual labels:  medical-imaging
cornerstone widget
A jupyter widget for the cornerstone library to make showing flashy images with nice tools easier.
Stars: ✭ 25 (+4.17%)
Mutual labels:  medical-imaging

Eye Tracker Setup

Setup of the screen-based Tobii Eye Tracker 4C and its gaze information for usability testing purposes. The Tobii Eye Tracker 4C aims to provide an immersive reality without a headset; with this product nothing stands between the screen and the immersive experience, so our clinicians can work with no interference from the device. This repository includes functions for the setup of the eye-tracking routines, including: (i) calibration of the eye tracker; (ii) finding eye positions; and (iii) validation of the eye-tracker calibration settings. It contains functions for working with the new Tobii Pro SDK for Python, along with essential eye-tracking routines, in a TobiiHelper class. The repository is part of the work done by SIPg, an ISR-Lisboa research group, and M-ITI, two R&D Units of LARSyS. The project also involves the collaborative effort of INESC-ID. Both ISR-Lisboa and INESC-ID are Associate Laboratories of IST from ULisboa.
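For orientation, here is a minimal sketch of how connecting to the device and subscribing to gaze data look with the Tobii Pro SDK for Python (tobii_research). It is not the repository's TobiiHelper class; the callback, timing, and error handling are illustrative only and assume a connected, licensed eye tracker.

import time

import tobii_research as tr

def on_gaze_data(gaze_data):
    # Gaze points are normalized to the active display area (0.0 to 1.0).
    left = gaze_data.get("left_gaze_point_on_display_area")
    right = gaze_data.get("right_gaze_point_on_display_area")
    print("left: {}, right: {}".format(left, right))

trackers = tr.find_all_eyetrackers()  # discover connected devices
if not trackers:
    raise RuntimeError("No Tobii eye tracker found.")
eyetracker = trackers[0]

eyetracker.subscribe_to(tr.EYETRACKER_GAZE_DATA, on_gaze_data, as_dictionary=True)
time.sleep(5)  # stream gaze samples for five seconds
eyetracker.unsubscribe_from(tr.EYETRACKER_GAZE_DATA, on_gaze_data)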

Citing

We kindly ask that scientific works and studies making use of this repository cite it in their associated publications. Similarly, we ask that open-source and closed-source works making use of this repository inform us of such use.

You can cite our work using the following BibTeX entry:

@article{CALISTO2021102607,
title = {Introduction of human-centric AI assistant to aid radiologists for multimodal breast image classification},
journal = {International Journal of Human-Computer Studies},
volume = {150},
pages = {102607},
year = {2021},
issn = {1071-5819},
doi = {https://doi.org/10.1016/j.ijhcs.2021.102607},
url = {https://www.sciencedirect.com/science/article/pii/S1071581921000252},
author = {Francisco Maria Calisto and Carlos Santiago and Nuno Nunes and Jacinto C. Nascimento},
keywords = {Human-computer interaction, Artificial intelligence, Healthcare, Medical imaging, Breast cancer},
abstract = {In this research, we take an HCI perspective on the opportunities provided by AI techniques in medical imaging, focusing on workflow efficiency and quality, preventing errors and variability of diagnosis in Breast Cancer. Starting from a holistic understanding of the clinical context, we developed BreastScreening to support Multimodality and integrate AI techniques (using a deep neural network to support automatic and reliable classification) in the medical diagnosis workflow. This was assessed by using a significant number of clinical settings and radiologists. Here we present: i) user study findings of 45 physicians comprising nine clinical institutions; ii) list of design recommendations for visualization to support breast screening radiomics; iii) evaluation results of a proof-of-concept BreastScreening prototype for two conditions Current (without AI assistant) and AI-Assisted; and iv) evidence from the impact of a Multimodality and AI-Assisted strategy in diagnosing and severity classification of lesions. The above strategies will allow us to conclude about the behaviour of clinicians when an AI module is present in a diagnostic system. This behaviour will have a direct impact in the clinicians workflow that is thoroughly addressed herein. Our results show a high level of acceptance of AI techniques from radiologists and point to a significant reduction of cognitive workload and improvement in diagnosis execution.}
}

Pre-Requisites

The following list shows the set of dependencies for this project. Please install and build the recommended versions on your machine.

List of dependencies for this project (see the Install, Run, and Notebooks sections below):

Python 2 (the sample is run with python2)
pip and setuptools
tobii-research (Tobii Pro SDK for Python)
Jupyter Notebook (for the notebooks under src/notebooks/)

Analytical Use

Tobii's consumer eye trackers are primarily intended for personal interaction use and not for analytical purposes. Any application that stores or transfers eye-tracking data must have a special license from Tobii (Read more). Please apply for a license here.

Instructions

The instructions are as follows. We assume that you already have knowledge of Git and GitHub; if not, please follow this support information. If you need any support, just open a New issue.

Clone

To clone this repository, follow the guidelines below. It is as easy as that.

1.1. Please clone the repository by typing the command:

git clone https://github.com/mida-project/eye-tracker-setup.git

1.2. Get inside of the repository directory:

cd eye-tracker-setup/

1.3. To install and run the source code, follow the next steps.

Install

The installation guidelines are as follows. Please be sure to follow them correctly.

2.1. Run the following command to install the library using pip:

On Linux or OS X

pip install -U pip setuptools
pip install tobii-research

On Windows

python -m pip install -U pip setuptools
pip install tobii-research

2.2. Follow the next step.
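As an optional sanity check (our suggestion, not an official step), you can confirm that tobii-research is installed and detects the device before moving on:

import tobii_research as tr

# Lists every eye tracker the SDK can reach; an empty list usually means a
# driver, connection, or licensing issue.
for eyetracker in tr.find_all_eyetrackers():
    print("{} {} {}".format(eyetracker.model, eyetracker.serial_number, eyetracker.address))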

Run

The running guidelines are as follows. Please be sure to follow them correctly.

3.1. Run the sample using the following command:

python2 src/core/main.py

3.2. Enjoy our source code!
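For reference, a screen-based calibration routine with the Tobii Pro SDK looks roughly like the sketch below. This is illustrative only and is not the repository's src/core/main.py; in a real routine a target would be drawn at each calibration point and the participant given time to fixate on it.

import tobii_research as tr

eyetracker = tr.find_all_eyetrackers()[0]
calibration = tr.ScreenBasedCalibration(eyetracker)

calibration.enter_calibration_mode()
for x, y in [(0.5, 0.5), (0.1, 0.1), (0.1, 0.9), (0.9, 0.1), (0.9, 0.9)]:
    # Draw the calibration target at (x, y) here and wait for fixation.
    if calibration.collect_data(x, y) != tr.CALIBRATION_STATUS_SUCCESS:
        calibration.collect_data(x, y)  # retry the point once on failure
result = calibration.compute_and_apply()
calibration.leave_calibration_mode()

print("Calibration status: {}".format(result.status))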

Notebooks

You can also run a notebook to view some of our models' chart plots. For this, we use the well-known Jupyter Notebook web application. To run Jupyter Notebook, just follow these steps.

4.1. Get inside our project directory:

cd eye-tracker-setup/src/notebooks/

4.2. Run Jupyter Notebook application by typing:

jupyter notebook

If you have any questions regarding Jupyter Notebook, just follow their Documentation. You can also ask for help from the Community.
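As a purely hypothetical illustration (the repository's actual notebooks may differ), a notebook cell for plotting recorded gaze points could look like this, assuming samples normalized to the display area:

import matplotlib.pyplot as plt

# Stand-in data: (x, y) gaze points normalized to the active display area,
# with the origin at the top-left corner of the screen.
gaze_points = [(0.45, 0.52), (0.47, 0.50), (0.60, 0.31)]

xs, ys = zip(*gaze_points)
plt.scatter(xs, ys, s=10)
plt.xlim(0, 1)
plt.ylim(1, 0)  # flip the y-axis so the plot matches the screen orientation
plt.xlabel("x (normalized)")
plt.ylabel("y (normalized)")
plt.title("Gaze points on the display area")
plt.show()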

Information

To find out how to apply the Upgrade Key to a Tobii Eye Tracker 4C, follow the Tobii Pro Upgrade Key – User Instructions document. The Tobii Pro SDK Python API Documentation page is of chief importance to this repository, as is the Examples page for Python. For the first configurations, please follow both the Python - Getting started and Python - Step-by-step guide pages, or follow the steps presented above. For any questions regarding the eye-tracking topic, just follow the StackOverflow tag for the purpose.

Acknowledgements

This work is also based on, and draws heavily from, the tobii_pro_wrapper repository, developed by Olivia Guayasamin (oguayasa), whom we would like to thank. That repository covers pretty much everything we need to connect to a Tobii eye tracker, calibrate it, get gaze, eye, and time synchronization data from the device, and convert between the Tobii coordinate-system units.
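As a small illustration of the kind of unit conversion involved, gaze points arrive normalized to Tobii's Active Display Coordinate System and are often mapped to screen pixels. A minimal sketch, with the screen resolution below as an assumption:

SCREEN_WIDTH_PX, SCREEN_HEIGHT_PX = 1920, 1080  # assumed display resolution

def adcs_to_pixels(x, y):
    # Active Display Coordinate System points are normalized to 0.0-1.0,
    # with (0, 0) at the top-left corner of the screen.
    return int(round(x * SCREEN_WIDTH_PX)), int(round(y * SCREEN_HEIGHT_PX))

print(adcs_to_pixels(0.5, 0.5))  # (960, 540), the centre of the screen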

Authors

Sponsors

FCT, FCCN, ULisboa, IST, HFF

Departments

DEI

Laboratories

SIPg, ISR, LARSyS, INESC-ID

Domain

EU, PT