
shunk031 / Multi-task-Conditional-Attention-Networks

Licence: other
A prototype implementation of our submitted paper, "Conversion Prediction Using Multi-task Conditional Attention Networks to Support the Creation of Effective Ad Creatives".

Programming Languages

Python
Dockerfile

Projects that are alternatives of or similar to Multi-task-Conditional-Attention-Networks

Attention is all you need
Chainer implementation of the Transformer from "Attention Is All You Need" (Vaswani et al., 2017).
Stars: ✭ 303 (+1342.86%)
Mutual labels:  chainer, attention-mechanism
amta-net
Asymmetric Multi-Task Attention Network for Prostate Bed Segmentation in CT Images
Stars: ✭ 26 (+23.81%)
Mutual labels:  attention-mechanism, multi-task-learning
Visual-Attention-Model
Chainer implementation of Deepmind's Visual Attention Model paper
Stars: ✭ 27 (+28.57%)
Mutual labels:  chainer, attention-mechanism
chainer-graph-cnn
Chainer implementation of 'Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering' (https://arxiv.org/abs/1606.09375)
Stars: ✭ 67 (+219.05%)
Mutual labels:  chainer
Brain-Tumor-Segmentation
Attention-Guided Version of 2D UNet for Automatic Brain Tumor Segmentation
Stars: ✭ 125 (+495.24%)
Mutual labels:  attention-mechanism
Attention mechanism-event-extraction
Attention mechanism in CNNs to extract events of interest
Stars: ✭ 17 (-19.05%)
Mutual labels:  attention-mechanism
chainer-ResDrop
Deep Networks with Stochastic Depth implementation by Chainer
Stars: ✭ 40 (+90.48%)
Mutual labels:  chainer
h-transformer-1d
Implementation of H-Transformer-1D, Hierarchical Attention for Sequence Learning
Stars: ✭ 121 (+476.19%)
Mutual labels:  attention-mechanism
long-short-transformer
Implementation of Long-Short Transformer, combining local and global inductive biases for attention over long sequences, in Pytorch
Stars: ✭ 103 (+390.48%)
Mutual labels:  attention-mechanism
superresolution gan
Chainer implementation of Photo-Realistic Single Image Super-Resolution Using a Generative Adversarial Network
Stars: ✭ 50 (+138.1%)
Mutual labels:  chainer
NLP-paper
NLP (natural language processing) tutorial: https://dataxujing.github.io/NLP-paper/
Stars: ✭ 23 (+9.52%)
Mutual labels:  attention-mechanism
campaign-manager
The Campaign Management UI for RTB4Free, the open source bidder / DSP.
Stars: ✭ 24 (+14.29%)
Mutual labels:  advertising
Linear-Attention-Mechanism
Attention mechanism
Stars: ✭ 27 (+28.57%)
Mutual labels:  attention-mechanism
SAMN
This is our implementation of SAMN: Social Attentional Memory Network
Stars: ✭ 45 (+114.29%)
Mutual labels:  attention-mechanism
discourse-adplugin
Official Discourse Advertising Plugin. Install & Start Serving Ads on Your Discourse Forum
Stars: ✭ 115 (+447.62%)
Mutual labels:  advertising
Machine-Translation-Hindi-to-english-
Machine translation is the task of converting text from one language to another. Unlike traditional phrase-based translation systems, which consist of many small sub-components tuned separately, neural machine translation builds and trains a single large neural network that reads a sentence and outputs a translation.
Stars: ✭ 19 (-9.52%)
Mutual labels:  attention-mechanism
resolutions-2019
A list of data mining and machine learning papers that I implemented in 2019.
Stars: ✭ 19 (-9.52%)
Mutual labels:  attention-mechanism
CrabNet
Predict materials properties using only the composition information!
Stars: ✭ 57 (+171.43%)
Mutual labels:  attention-mechanism
Compact-Global-Descriptor
Pytorch implementation of "Compact Global Descriptor for Neural Networks" (CGD).
Stars: ✭ 22 (+4.76%)
Mutual labels:  attention-mechanism
Hierarchical-Word-Sense-Disambiguation-using-WordNet-Senses
Word Sense Disambiguation using Word Specific models, All word models and Hierarchical models in Tensorflow
Stars: ✭ 33 (+57.14%)
Mutual labels:  attention-mechanism

Multi-task Conditional Attention Networks

A prototype implementation of our submitted paper, "Conversion Prediction Using Multi-task Conditional Attention Networks to Support the Creation of Effective Ad Creatives".
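The "conditional attention" in the title can be pictured as additive attention over word states whose scores also depend on a condition vector (e.g., an ad-genre embedding). A minimal NumPy sketch under assumed shapes; the function name and parameterization here are illustrative, not the authors' exact formulation:

```python
import numpy as np

def conditional_attention(H, c, W_h, W_c, v):
    """Attend over word states H (T, d) conditioned on a vector c (k,).

    Hypothetical shapes and parameter names -- a sketch of conditional
    attention, not the paper's exact architecture.
    """
    # score_t = v^T tanh(W_h h_t + W_c c): the condition shifts every score
    scores = np.tanh(H @ W_h + c @ W_c) @ v   # (T,)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                  # softmax over time steps
    return weights @ H, weights               # context (d,), weights (T,)

rng = np.random.default_rng(0)
T, d, k, a = 5, 8, 3, 6                       # time steps, hidden, cond, attn dims
H = rng.standard_normal((T, d))
c = rng.standard_normal(k)
W_h = rng.standard_normal((d, a))
W_c = rng.standard_normal((k, a))
v = rng.standard_normal(a)
context, weights = conditional_attention(H, c, W_h, W_c, v)
assert context.shape == (d,) and np.isclose(weights.sum(), 1.0)
```

Changing `c` changes the attention weights, so different conditions (genres) can emphasize different words in the same ad text; in the multi-task setting, the pooled `context` vector would feed several prediction heads.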

Setup using Docker

# Build the image from the repository's Dockerfile
$ docker build -t multi-task-cond-net-env .
# Create a data-volume container backed by /data on the host
$ docker create -it -v /data:/data --name datavolume busybox
# Run the container with GPU support, attaching the data volume
$ docker run -it -p 8888:8888 --runtime=nvidia --volumes-from datavolume --rm --name multi-task-cond-net multi-task-cond-net-env
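The `--runtime=nvidia` flag comes from the legacy nvidia-docker 2 setup. On Docker 19.03 or later with the NVIDIA Container Toolkit installed, the equivalent run command uses `--gpus` instead (same ports, volumes, and names as above):

```shell
# Modern equivalent of the run command above (Docker >= 19.03)
docker run -it -p 8888:8888 --gpus all --volumes-from datavolume \
    --rm --name multi-task-cond-net multi-task-cond-net-env
```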