
A2Zadeh / TensorFusionNetwork

Licence: other
Code for the EMNLP 2017 oral paper "Tensor Fusion Network for Multimodal Sentiment Analysis"

Programming Languages

python

Projects that are alternatives to or similar to TensorFusionNetwork

hfusion
Multimodal sentiment analysis using hierarchical fusion with context modeling
Stars: ✭ 42 (-23.64%)
Mutual labels:  multimodal-interactions, multimodal-sentiment-analysis
data-at-hand-mobile
Mobile application for exploring fitness data using both speech and touch interaction.
Stars: ✭ 50 (-9.09%)
Mutual labels:  multimodal-interactions
MSAF
Official implementation of the paper "MSAF: Multimodal Split Attention Fusion"
Stars: ✭ 47 (-14.55%)
Mutual labels:  multimodal-sentiment-analysis
BBFN
This repository contains the implementation of the paper -- Bi-Bimodal Modality Fusion for Correlation-Controlled Multimodal Sentiment Analysis
Stars: ✭ 42 (-23.64%)
Mutual labels:  multimodal-sentiment-analysis
vista-net
Code for the paper "VistaNet: Visual Aspect Attention Network for Multimodal Sentiment Analysis", AAAI'19
Stars: ✭ 67 (+21.82%)
Mutual labels:  multimodal-sentiment-analysis
Self-Supervised-Embedding-Fusion-Transformer
The code for our IEEE ACCESS (2020) paper Multimodal Emotion Recognition with Transformer-Based Self Supervised Feature Fusion.
Stars: ✭ 57 (+3.64%)
Mutual labels:  multimodal-sentiment-analysis

The code in this repository is no longer maintained because Theano has been discontinued. Please refer to the CMU-Multimodal Data SDK, under mmmodelsdk and related_repos, for the newest implementations of Tensor Fusion Network. Another very nice implementation is available here: https://github.com/Justin1904/TensorFusionNetworks

TensorFusionNetwork

This is the code for "Tensor Fusion Network for Multimodal Sentiment Analysis", published at EMNLP 2017 and presented orally in the multimodal session. The code is quite straightforward. Please download the CMU-MOSI dataset using the CMU Multimodal Data SDK or from my website. data_loader.py helps you load the data in the correct format; however, I suggest loading through the CMU Multimodal Data SDK instead, since the directory structure of CMU-MOSI changes when new features are added. The code for the algorithm is in tf_mosi.py.
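For reference, the core idea of the paper is to build the multimodal representation as the 3-fold outer product of the language, visual, and acoustic embeddings, each extended with a constant 1 so that unimodal and bimodal terms are retained alongside the trimodal interactions. Below is a minimal NumPy sketch of just that fusion step; the embedding dimensions are illustrative and are not the ones used in tf_mosi.py.

```python
import numpy as np

# Hypothetical per-utterance embeddings from the three modality sub-networks
# (dimensions chosen for illustration only).
z_language = np.random.randn(128)   # language sub-network output
z_visual   = np.random.randn(32)    # visual sub-network output
z_acoustic = np.random.randn(32)    # acoustic sub-network output

# Append a constant 1 to each embedding so the outer product keeps
# the unimodal and bimodal terms in addition to the trimodal ones.
zl = np.concatenate([z_language, [1.0]])
zv = np.concatenate([z_visual,   [1.0]])
za = np.concatenate([z_acoustic, [1.0]])

# 3-fold Cartesian (outer) product: shape (129, 33, 33).
fusion_tensor = np.einsum('i,j,k->ijk', zl, zv, za)

# Flatten the tensor before feeding it to the sentiment-inference layers.
fusion_vector = fusion_tensor.reshape(-1)
print(fusion_vector.shape)  # (140481,)
```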

Please cite the following publication if you are using this code:

@inproceedings{tensoremnlp17,
title={Tensor Fusion Network for Multimodal Sentiment Analysis},
author={Zadeh, Amir and Chen, Minghai and Poria, Soujanya and Cambria, Erik and Morency, Louis-Philippe},
booktitle={Empirical Methods in Natural Language Processing, EMNLP},
year={2017}
}