
BarnesLab / Patient2Vec

License: MIT
Patient2Vec: A Personalized Interpretable Deep Representation of the Longitudinal Electronic Health Record

Programming Languages

Python

Projects that are alternatives of or similar to Patient2Vec

ntua-slp-semeval2018
Deep-learning models of NTUA-SLP team submitted in SemEval 2018 tasks 1, 2 and 3.
Stars: ✭ 79 (-7.06%)
Mutual labels:  lstm, attention-mechanism
BadMedicine
Library and CLI for randomly generating medical data like you might get out of an Electronic Health Records (EHR) system
Stars: ✭ 18 (-78.82%)
Mutual labels:  ehr, electronic-health-records
awesome-multimodal-ml
Reading list for research topics in multimodal machine learning
Stars: ✭ 3,125 (+3576.47%)
Mutual labels:  healthcare, representation-learning
Lstm attention
attention-based LSTM/Dense implemented by Keras
Stars: ✭ 168 (+97.65%)
Mutual labels:  lstm, attention-mechanism
datastories-semeval2017-task6
Deep-learning model presented in "DataStories at SemEval-2017 Task 6: Siamese LSTM with Attention for Humorous Text Comparison".
Stars: ✭ 20 (-76.47%)
Mutual labels:  lstm, attention-mechanism
Multimodal Sentiment Analysis
Attention-based multimodal fusion for sentiment analysis
Stars: ✭ 172 (+102.35%)
Mutual labels:  lstm, attention-mechanism
Openemr
The most popular open source electronic health records and medical practice management solution.
Stars: ✭ 1,762 (+1972.94%)
Mutual labels:  healthcare, ehr
Abstractive Summarization
Implementation of abstractive summarization using LSTM in the encoder-decoder architecture with local attention.
Stars: ✭ 128 (+50.59%)
Mutual labels:  lstm, attention-mechanism
loinc2hpo
Java library to map LOINC-encoded test results to Human Phenotype Ontology
Stars: ✭ 19 (-77.65%)
Mutual labels:  healthcare, ehr
Hierarchical-Word-Sense-Disambiguation-using-WordNet-Senses
Word Sense Disambiguation using Word Specific models, All word models and Hierarchical models in Tensorflow
Stars: ✭ 33 (-61.18%)
Mutual labels:  lstm, attention-mechanism
Eeg Dl
A Deep Learning library for EEG Tasks (Signals) Classification, based on TensorFlow.
Stars: ✭ 165 (+94.12%)
Mutual labels:  lstm, attention-mechanism
automatic-personality-prediction
[AAAI 2020] Modeling Personality with Attentive Networks and Contextual Embeddings
Stars: ✭ 43 (-49.41%)
Mutual labels:  lstm, attention-mechanism
Poetry Seq2seq
Chinese Poetry Generation
Stars: ✭ 159 (+87.06%)
Mutual labels:  lstm, attention-mechanism
Datastories Semeval2017 Task4
Deep-learning model presented in "DataStories at SemEval-2017 Task 4: Deep LSTM with Attention for Message-level and Topic-based Sentiment Analysis".
Stars: ✭ 184 (+116.47%)
Mutual labels:  lstm, attention-mechanism
Document Classifier Lstm
A bidirectional LSTM with attention for multiclass/multilabel text classification.
Stars: ✭ 136 (+60%)
Mutual labels:  lstm, attention-mechanism
freehealth
Free and open source Electronic Health Record
Stars: ✭ 39 (-54.12%)
Mutual labels:  healthcare, ehr
Linear Attention Recurrent Neural Network
A recurrent attention module consisting of an LSTM cell which can query its own past cell states by the means of windowed multi-head attention. The formulas are derived from the BN-LSTM and the Transformer Network. The LARNN cell with attention can be easily used inside a loop on the cell state, just like any other RNN. (LARNN)
Stars: ✭ 119 (+40%)
Mutual labels:  lstm, attention-mechanism
Image Caption Generator
A neural network to generate captions for an image using CNN and RNN with BEAM Search.
Stars: ✭ 126 (+48.24%)
Mutual labels:  lstm, attention-mechanism
MTL-AQA
What and How Well You Performed? A Multitask Learning Approach to Action Quality Assessment [CVPR 2019]
Stars: ✭ 38 (-55.29%)
Mutual labels:  lstm, representation-learning
extkeras
Playground for implementing custom layers and other components compatible with keras, with the purpose to learn the framework better and perhaps in future offer some utils for others.
Stars: ✭ 18 (-78.82%)
Mutual labels:  lstm, attention-mechanism


Patient2Vec: A Personalized Interpretable Deep Representation of the Longitudinal Electronic Health Record

Referenced paper: Patient2Vec: A Personalized Interpretable Deep Representation of the Longitudinal Electronic Health Record


Documentation

The wide implementation of electronic health record (EHR) systems facilitates the collection of large-scale health data from real clinical settings. Despite the significant increase in adoption of EHR systems, these data remain largely unexplored, yet they present a rich source for knowledge discovery from patient health histories in tasks such as understanding disease correlations and predicting health outcomes. However, the heterogeneity, sparsity, noise, and bias in these data pose many complex challenges, making it difficult to translate potentially relevant information into machine learning algorithms. In this paper, we propose a computational framework, Patient2Vec, to learn an interpretable deep representation of longitudinal EHR data that is personalized for each patient. To evaluate this approach, we apply it to the prediction of future hospitalizations using real EHR data and compare its predictive performance with baseline methods. Patient2Vec produces a vector space with meaningful structure and achieves an AUC of approximately 0.799, outperforming the baseline methods. Finally, the learned feature importance can be visualized and interpreted at both the individual and population levels to provide clinical insights.
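To give a feel for the kind of model the abstract describes, the sketch below encodes a sequence of patient visits with a recurrent network and pools the hidden states with attention weights, which is what makes the learned representation inspectable per visit. This is an illustrative approximation, not the authors' implementation: the layer sizes, the GRU choice, the single attention level, and all names here are assumptions (the paper's full model is hierarchical and personalized per patient).

```python
import torch
import torch.nn as nn

class PatientEncoder(nn.Module):
    """Toy Patient2Vec-style encoder: RNN over visit vectors + attention pooling.
    Hyperparameters are illustrative assumptions, not the paper's settings."""

    def __init__(self, n_codes=100, emb_dim=32, hidden_dim=64):
        super().__init__()
        self.embed = nn.Linear(n_codes, emb_dim)   # multi-hot visit -> dense vector
        self.rnn = nn.GRU(emb_dim, hidden_dim, batch_first=True)
        self.attn = nn.Linear(hidden_dim, 1)       # one relevance score per visit
        self.out = nn.Linear(hidden_dim, 1)        # binary-outcome logit

    def forward(self, visits):
        # visits: (batch, n_visits, n_codes), each row a multi-hot code vector
        h, _ = self.rnn(self.embed(visits))        # (batch, n_visits, hidden_dim)
        w = torch.softmax(self.attn(h).squeeze(-1), dim=1)  # attention over visits
        patient_vec = (w.unsqueeze(-1) * h).sum(dim=1)      # weighted-sum representation
        return torch.sigmoid(self.out(patient_vec)), w

model = PatientEncoder()
x = torch.rand(4, 10, 100)       # 4 synthetic patients, 10 visits, 100 codes
prob, weights = model(x)         # prob: (4, 1), weights: (4, 10)
```

The attention weights `w` are what supports the interpretability claim: for each patient, they indicate which visits contributed most to the predicted outcome, and they can be aggregated across patients for population-level views.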

Citation:

@ARTICLE{Patient2Vec,
      author={Zhang, Jinghe and Kowsari, Kamran and Harrison, James H and Lobo, Jennifer M and Barnes, Laura E},
      journal={IEEE Access},
      title={Patient2Vec: A Personalized Interpretable Deep Representation of the Longitudinal Electronic Health Record},
      year={2018},
      volume={6},
      pages={65333-65346},
      doi={10.1109/ACCESS.2018.2875677},
      ISSN={2169-3536}
}