177 open source projects that are alternatives to or similar to Pen Net For Inpainting
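
Every project listed below shares the `attention` label. For quick orientation, here is a minimal sketch of scaled dot-product attention, the core operation most of these repositories build on. It is written in PyTorch (the framework most entries use); the function name and the random tensors are illustrative, not taken from any project in the list.

```python
# Minimal sketch of scaled dot-product attention, assuming PyTorch is installed.
import math
import torch

def scaled_dot_product_attention(q, k, v, mask=None):
    """q, k, v: (batch, seq_len, d_model). Returns attended values and attention weights."""
    d_k = q.size(-1)
    # Similarity between queries and keys, scaled by sqrt(d_k)
    scores = torch.matmul(q, k.transpose(-2, -1)) / math.sqrt(d_k)  # (batch, seq_q, seq_k)
    if mask is not None:
        # Positions where mask == 0 are excluded from attention
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = torch.softmax(scores, dim=-1)
    return torch.matmul(weights, v), weights

# Illustrative usage with random queries/keys/values of dimension 64
q = k = v = torch.randn(2, 10, 64)
out, attn = scaled_dot_product_attention(q, k, v)
print(out.shape, attn.shape)  # torch.Size([2, 10, 64]) torch.Size([2, 10, 10])
```

The same computation underlies the self-attention, multi-head attention, and encoder-decoder attention variants used by the translation, captioning, and classification projects below.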

Attention Transfer
Improving Convolutional Networks via Attention Transfer (ICLR 2017)
Stars: ✭ 1,231 (+497.57%)
Mutual labels:  attention
Attention Over Attention Tf Qa
Implementation of the AoA model from the paper "Attention-over-Attention Neural Networks for Reading Comprehension".
Stars: ✭ 58 (-71.84%)
Mutual labels:  attention
Ccnet Pure Pytorch
Criss-Cross Attention for Semantic Segmentation in pure PyTorch, with a faster and more precise implementation.
Stars: ✭ 124 (-39.81%)
Mutual labels:  attention
Nlp Journey
Documents, papers, and code related to Natural Language Processing, including topic models, word embeddings, named entity recognition, text classification, text generation, text similarity, machine translation, etc. All code is implemented in TensorFlow 2.0.
Stars: ✭ 1,290 (+526.21%)
Mutual labels:  attention
Attentions
PyTorch implementation of some attentions for Deep Learning Researchers.
Stars: ✭ 39 (-81.07%)
Mutual labels:  attention
Image Caption Generator
A neural network that generates captions for an image using a CNN and an RNN with beam search.
Stars: ✭ 126 (-38.83%)
Mutual labels:  attention
Absa Pytorch
Aspect Based Sentiment Analysis, PyTorch implementations.
Stars: ✭ 1,181 (+473.3%)
Mutual labels:  attention
Attentionn
All about attention in neural networks. Soft attention, attention maps, local and global attention and multi-head attention.
Stars: ✭ 175 (-15.05%)
Mutual labels:  attention
Text Classification Keras
📚 Text classification library with Keras
Stars: ✭ 53 (-74.27%)
Mutual labels:  attention
Leader Line
Draw a leader line in your web page.
Stars: ✭ 1,872 (+808.74%)
Mutual labels:  attention
Njunmt Tf
An open-source neural machine translation system developed by Natural Language Processing Group, Nanjing University.
Stars: ✭ 97 (-52.91%)
Mutual labels:  attention
Banglatranslator
Bangla Machine Translator
Stars: ✭ 21 (-89.81%)
Mutual labels:  attention
Prediction Flow
Deep learning based CTR models implemented in PyTorch.
Stars: ✭ 138 (-33.01%)
Mutual labels:  attention
Self Attention Classification
Document classification using LSTM + self-attention.
Stars: ✭ 84 (-59.22%)
Mutual labels:  attention
Pyramid Attention Networks Pytorch
Implementation of Pyramid Attention Networks for Semantic Segmentation.
Stars: ✭ 182 (-11.65%)
Mutual labels:  attention
Nlp Tutorial
Natural Language Processing Tutorial for Deep Learning Researchers
Stars: ✭ 9,895 (+4703.4%)
Mutual labels:  attention
Asr syllable
Research on convolutional neural network based acoustic models for speech recognition.
Stars: ✭ 127 (-38.35%)
Mutual labels:  attention
Deeplearning Nlp Models
A small, interpretable codebase containing the re-implementation of a few "deep" NLP models in PyTorch. Colab notebooks to run with GPUs. Models: word2vec, CNNs, transformer, gpt.
Stars: ✭ 64 (-68.93%)
Mutual labels:  attention
Self Attentive Tensorflow
Tensorflow implementation of "A Structured Self-Attentive Sentence Embedding"
Stars: ✭ 189 (-8.25%)
Mutual labels:  attention
Fluence
A deep learning library based on PyTorch, focused on low-resource language research and robustness.
Stars: ✭ 54 (-73.79%)
Mutual labels:  attention
Sightseq
Computer vision tools for fairseq, containing PyTorch implementation of text recognition and object detection
Stars: ✭ 116 (-43.69%)
Mutual labels:  attention
Sentences pair similarity calculation siamese lstm
A Keras implementation of an attention-based Siamese Manhattan LSTM.
Stars: ✭ 48 (-76.7%)
Mutual labels:  attention
Multimodal Sentiment Analysis
Attention-based multimodal fusion for sentiment analysis
Stars: ✭ 172 (-16.5%)
Mutual labels:  attention
Attentive Neural Processes
Implementation of "Recurrent Attentive Neural Processes" to forecast power usage (with an LSTM baseline and MC Dropout).
Stars: ✭ 33 (-83.98%)
Mutual labels:  attention
Numpy Ml
Machine learning, in numpy
Stars: ✭ 11,100 (+5288.35%)
Mutual labels:  attention
Captcharecognition
End-to-end variable-length captcha recognition using CNN + RNN + Attention/CTC (PyTorch implementation).
Stars: ✭ 97 (-52.91%)
Mutual labels:  attention
Isab Pytorch
An implementation of the (Induced) Set Attention Block from the Set Transformer paper.
Stars: ✭ 21 (-89.81%)
Mutual labels:  attention
Multihead Siamese Nets
Implementation of Siamese Neural Networks built upon multihead attention mechanism for text semantic similarity task.
Stars: ✭ 144 (-30.1%)
Mutual labels:  attention
Cnn lstm for text classify
Chinese text classification with CNN, LSTM, NBOW, and fastText.
Stars: ✭ 90 (-56.31%)
Mutual labels:  attention
Deep Time Series Prediction
Seq2Seq, Bert, Transformer, WaveNet for time series prediction.
Stars: ✭ 183 (-11.17%)
Mutual labels:  attention
Eval On Nn Of Rc
Empirical Evaluation on Current Neural Networks on Cloze-style Reading Comprehension
Stars: ✭ 84 (-59.22%)
Mutual labels:  attention
Vqa regat
Research Code for ICCV 2019 paper "Relation-aware Graph Attention Network for Visual Question Answering"
Stars: ✭ 129 (-37.38%)
Mutual labels:  attention
Attend infer repeat
A TensorFlow implementation of Attend, Infer, Repeat.
Stars: ✭ 82 (-60.19%)
Mutual labels:  attention
Hnatt
Train and visualize Hierarchical Attention Networks
Stars: ✭ 192 (-6.8%)
Mutual labels:  attention
Machine Learning
My Attempt(s) In The World Of ML/DL....
Stars: ✭ 78 (-62.14%)
Mutual labels:  attention
Chinese Chatbot
A Chinese chatbot trained on 100,000 dialogue pairs using an attention mechanism; it generates a meaningful reply to most general questions. The trained model is included and can be run directly.
Stars: ✭ 124 (-39.81%)
Mutual labels:  attention
Hatn
Hierarchical Attention Transfer Network for Cross-domain Sentiment Classification (AAAI'18)
Stars: ✭ 73 (-64.56%)
Mutual labels:  attention
Rnn For Joint Nlu
Pytorch implementation of "Attention-Based Recurrent Neural Network Models for Joint Intent Detection and Slot Filling" (https://arxiv.org/abs/1609.01454)
Stars: ✭ 176 (-14.56%)
Mutual labels:  attention
Enjoy Hamburger
[ICLR 2021] Is Attention Better Than Matrix Decomposition?
Stars: ✭ 69 (-66.5%)
Mutual labels:  attention
Absa keras
Keras implementation of aspect-based sentiment analysis.
Stars: ✭ 126 (-38.83%)
Mutual labels:  attention
Global Self Attention Network
A PyTorch implementation of Global Self-Attention Network, a fully-attentional backbone for vision tasks.
Stars: ✭ 64 (-68.93%)
Mutual labels:  attention
Guided Attention Inference Network
Contains an implementation of the Guided Attention Inference Network (GAIN) presented in Tell Me Where to Look (CVPR 2018). This repository applies GAIN to the FCN-8 architecture used for segmentation.
Stars: ✭ 204 (-0.97%)
Mutual labels:  attention
Yolov4 Pytorch
A PyTorch repository of YOLOv4, attentive YOLOv4, and MobileNet YOLOv4, with PASCAL VOC and COCO support.
Stars: ✭ 1,070 (+419.42%)
Mutual labels:  attention
Fastpunct
Punctuation restoration and spell correction experiments.
Stars: ✭ 121 (-41.26%)
Mutual labels:  attention
Pointer Networks Experiments
Sorting numbers with pointer networks
Stars: ✭ 53 (-74.27%)
Mutual labels:  attention
Transformers.jl
Julia Implementation of Transformer models
Stars: ✭ 173 (-16.02%)
Mutual labels:  attention
Time Attention
Implementation of RNN for Time Series prediction from the paper https://arxiv.org/abs/1704.02971
Stars: ✭ 52 (-74.76%)
Mutual labels:  attention
Nlp Models Tensorflow
Gathers machine learning and TensorFlow deep learning models for NLP problems (1.13 < TensorFlow < 2.0).
Stars: ✭ 1,603 (+678.16%)
Mutual labels:  attention
Biblosa Pytorch
Re-implementation of Bi-Directional Block Self-Attention for Fast and Memory-Efficient Sequence Modeling (T. Shen et al., ICLR 2018) in PyTorch.
Stars: ✭ 43 (-79.13%)
Mutual labels:  attention
Graph attention pool
Attention over nodes in Graph Neural Networks using PyTorch (NeurIPS 2019)
Stars: ✭ 186 (-9.71%)
Mutual labels:  attention
Attentioncluster
TensorFlow Implementation of "Attention Clusters: Purely Attention Based Local Feature Integration for Video Classification"
Stars: ✭ 33 (-83.98%)
Mutual labels:  attention
Bertqa Attention On Steroids
BertQA - Attention on Steroids
Stars: ✭ 112 (-45.63%)
Mutual labels:  attention
Defactonlp
DeFactoNLP: An Automated Fact-checking System that uses Named Entity Recognition, TF-IDF vector comparison and Decomposable Attention models.
Stars: ✭ 30 (-85.44%)
Mutual labels:  attention
Hey Jetson
Deep Learning based Automatic Speech Recognition with attention for the Nvidia Jetson.
Stars: ✭ 161 (-21.84%)
Mutual labels:  attention
Lambda Networks
Implementation of LambdaNetworks, a new approach to image recognition that reaches SOTA with less compute
Stars: ✭ 1,497 (+626.7%)
Mutual labels:  attention
Doc Han Att
Hierarchical Attention Networks for Chinese Sentiment Classification
Stars: ✭ 206 (+0%)
Mutual labels:  attention
Graphtransformer
Graph Transformer Architecture. Source code for "A Generalization of Transformer Networks to Graphs", DLG-AAAI'21.
Stars: ✭ 187 (-9.22%)
Mutual labels:  attention
Datastories Semeval2017 Task4
Deep-learning model presented in "DataStories at SemEval-2017 Task 4: Deep LSTM with Attention for Message-level and Topic-based Sentiment Analysis".
Stars: ✭ 184 (-10.68%)
Mutual labels:  attention
Medical Transformer
PyTorch code for "Medical Transformer: Gated Axial-Attention for Medical Image Segmentation".
Stars: ✭ 153 (-25.73%)
Mutual labels:  attention
Multiturndialogzoo
Multi-turn dialogue baselines written in PyTorch
Stars: ✭ 106 (-48.54%)
Mutual labels:  attention
1-60 of 177 similar projects