DCAN: [AAAI 2020] Code release for "Domain Conditioned Adaptation Network" https://arxiv.org/abs/2005.06717
Stars: ✭ 27 (-55%)
amta-net: Asymmetric Multi-Task Attention Network for Prostate Bed Segmentation in CT Images
Stars: ✭ 26 (-56.67%)
TS3000 TheChatBOT: A social-networking chatbot trained on a Reddit dataset. It supports open-ended queries and is built on the concept of neural machine translation. Beware of its being sarcastic, just like its creator 😝 BTW, it uses the PyTorch framework and Python 3.
Stars: ✭ 20 (-66.67%)
DocTr: The official code for "DocTr: Document Image Transformer for Geometric Unwarping and Illumination Correction" (ACM MM 2021, oral paper).
Stars: ✭ 202 (+236.67%)
TailCalibX: PyTorch implementation of "Feature Generation for Long-Tail Classification" by Rahul Vigneswaran, Marc T. Law, Vineeth N Balasubramaniam, and Makarand Tapaswi
Stars: ✭ 32 (-46.67%)
datastories-semeval2017-task6: Deep-learning model presented in "DataStories at SemEval-2017 Task 6: Siamese LSTM with Attention for Humorous Text Comparison".
Stars: ✭ 20 (-66.67%)
3HAN: An original implementation of "3HAN: A Deep Neural Network for Fake News Detection" (ICONIP 2017)
Stars: ✭ 29 (-51.67%)
Compact-Global-Descriptor: PyTorch implementation of "Compact Global Descriptor for Neural Networks" (CGD).
Stars: ✭ 22 (-63.33%)
SAMN: Our implementation of SAMN (Social Attentional Memory Network)
Stars: ✭ 45 (-25%)
Deep-MVLM: A tool for precisely placing 3D landmarks on 3D facial scans, based on the paper "Multi-view Consensus CNN for 3D Facial Landmark Placement"
Stars: ✭ 71 (+18.33%)
resolutions-2019: A list of data mining and machine learning papers that I implemented in 2019.
Stars: ✭ 19 (-68.33%)
minimal-nmt: A minimal NMT example to serve as a seq2seq + attention reference.
Stars: ✭ 36 (-40%)
SpinNet: [CVPR 2021] SpinNet: Learning a General Surface Descriptor for 3D Point Cloud Registration
Stars: ✭ 181 (+201.67%)
NLP-paper: 🎨🎨 NLP (natural language processing) tutorial 🎨🎨 https://dataxujing.github.io/NLP-paper/
Stars: ✭ 23 (-61.67%)
tldr: TLDR is an unsupervised dimensionality reduction method that combines neighborhood embedding learning with the simplicity and effectiveness of recent self-supervised learning losses.
Stars: ✭ 95 (+58.33%)
depth-map-prediction: PyTorch implementation of "Depth Map Prediction from a Single Image using a Multi-Scale Deep Network"
Stars: ✭ 78 (+30%)
SPAN: Semantics-guided Part Attention Network (ECCV 2020 Oral)
Stars: ✭ 19 (-68.33%)
abcnn pytorch: Implementation of ABCNN (Attention-Based Convolutional Neural Network) in PyTorch
Stars: ✭ 35 (-41.67%)
Machine-Translation-Hindi-to-english-: Machine translation is the task of converting text from one language to another. Unlike traditional phrase-based translation systems, which consist of many small sub-components tuned separately, neural machine translation attempts to build and train a single large neural network that reads a sentence and outputs a correct translation.
Stars: ✭ 19 (-68.33%)
Multi-task-Conditional-Attention-Networks: A prototype version of our submitted paper "Conversion Prediction Using Multi-task Conditional Attention Networks to Support the Creation of Effective Ad Creatives".
Stars: ✭ 21 (-65%)
DOSMA: An AI-powered open-source medical image analysis toolbox
Stars: ✭ 45 (-25%)
ResNet-50-CBAM-PyTorch: Implementation of ResNet-50 with and without CBAM in PyTorch v1.8, tested on the Intel Image Classification dataset from https://www.kaggle.com/puneet6060/intel-image-classification.
Stars: ✭ 31 (-48.33%)
extkeras: Playground for implementing custom layers and other components compatible with Keras, with the aim of learning the framework better and perhaps, in the future, offering some utilities for others.
Stars: ✭ 18 (-70%)
visdial: Visual Dialog: Light-weight Transformer for Many Inputs (ECCV 2020)
Stars: ✭ 27 (-55%)
hamnet: PyTorch implementation of the AAAI 2021 paper "A Hybrid Attention Mechanism for Weakly-Supervised Temporal Action Localization"
Stars: ✭ 30 (-50%)
SANET: Arbitrary Style Transfer with Style-Attentional Networks
Stars: ✭ 105 (+75%)
covid19.MIScnn: Robust Chest CT Image Segmentation of COVID-19 Lung Infection based on limited data
Stars: ✭ 77 (+28.33%)
LMFD-PAD: Learnable Multi-level Frequency Decomposition and Hierarchical Attention Mechanism for Generalized Face Presentation Attack Detection
Stars: ✭ 27 (-55%)
CrabNet: Predict materials properties using only the composition information!
Stars: ✭ 57 (-5%)
DeepAtlas: Joint Semi-supervised Learning of Image Registration and Segmentation
Stars: ✭ 38 (-36.67%)
PyTorch: An open-source deep learning platform that provides a seamless path from research prototyping to production deployment
Stars: ✭ 17 (-71.67%)
Brain-Tumor-Segmentation: Attention-Guided Version of 2D UNet for Automatic Brain Tumor Segmentation
Stars: ✭ 125 (+108.33%)
visualization: A collection of visualization functions
Stars: ✭ 189 (+215%)
Magic-VNet: VNet for 3D volume segmentation
Stars: ✭ 45 (-25%)
DualStudent: Code for the paper "Dual Student: Breaking the Limits of the Teacher in Semi-Supervised Learning" [ICCV 2019]
Stars: ✭ 106 (+76.67%)
nvae: An unofficial toy implementation of NVAE, "A Deep Hierarchical Variational Autoencoder"
Stars: ✭ 83 (+38.33%)
domain-attention: Code for the paper "Domain Attention Model for Multi-Domain Sentiment Classification"
Stars: ✭ 22 (-63.33%)
tfvaegan: [ECCV 2020] Official PyTorch implementation of "Latent Embedding Feedback and Discriminative Features for Zero-Shot Classification". SOTA results for ZSL and GZSL.
Stars: ✭ 107 (+78.33%)
dodrio: Exploring attention weights in transformer-based models with linguistic knowledge.
Stars: ✭ 233 (+288.33%)
loc2vec: PyTorch implementation of Loc2Vec with some modifications for speed
Stars: ✭ 40 (-33.33%)
halonet-pytorch: Implementation of the 😇 attention layer from the paper "Scaling Local Self-Attention for Parameter Efficient Visual Backbones"
Stars: ✭ 181 (+201.67%)
h-transformer-1d: Implementation of H-Transformer-1D, Hierarchical Attention for Sequence Learning
Stars: ✭ 121 (+101.67%)
kg one2set: Code for our ACL 2021 paper "One2Set: Generating Diverse Keyphrases as a Set"
Stars: ✭ 58 (-3.33%)
AdaSpeech: Adaptive Text to Speech for Custom Voice
Stars: ✭ 108 (+80%)
efficient-attention: An implementation of the efficient attention module.
Stars: ✭ 191 (+218.33%)
SequenceToSequence: A seq2seq-with-attention dialogue/MT model implemented in TensorFlow.
Stars: ✭ 11 (-81.67%)
stanford-cs231n-assignments-2020: This repository contains my solutions to the assignments for Stanford's CS231n "Convolutional Neural Networks for Visual Recognition" (Spring 2020).
Stars: ✭ 84 (+40%)
OverlapPredator: [CVPR 2021, Oral] PREDATOR: Registration of 3D Point Clouds with Low Overlap.
Stars: ✭ 293 (+388.33%)
egfr-att: Drug effect prediction using a neural network
Stars: ✭ 17 (-71.67%)