ntua-slp-semeval2018 - Deep-learning models of the NTUA-SLP team submitted to SemEval 2018 Tasks 1, 2 and 3.
Stars: ✭ 79 (-42.34%)
knowledge-neurons - A library for finding knowledge neurons in pretrained transformer models.
Stars: ✭ 72 (-47.45%)
DAF3D - Deep Attentive Features for Prostate Segmentation in 3D Transrectal Ultrasound
Stars: ✭ 60 (-56.2%)
trapper - State-of-the-art NLP through transformer models, in a modular design with consistent APIs.
Stars: ✭ 28 (-79.56%)
BERT-NER - Using pre-trained BERT models for Chinese and English NER with 🤗 Transformers (see the sketch below).
Stars: ✭ 114 (-16.79%)
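Token-level NER with 🤗 Transformers typically takes only a few lines. A minimal sketch using the generic pipeline API, with the public English checkpoint dslim/bert-base-NER standing in for this repo's own fine-tuned models:

```python
from transformers import pipeline

# Generic 🤗 Transformers NER pipeline; "dslim/bert-base-NER" is a public
# English checkpoint standing in for the repo's own fine-tuned models.
ner = pipeline("ner", model="dslim/bert-base-NER", aggregation_strategy="simple")
print(ner("Hugging Face is based in New York City."))
```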
TorchBlocks - A PyTorch-based toolkit for natural language processing
Stars: ✭ 85 (-37.96%)
remixer-pytorch - Implementation of the Remixer block from the Remixer paper, in PyTorch
Stars: ✭ 37 (-72.99%)
optimum - 🏎️ Accelerate training and inference of 🤗 Transformers with easy-to-use hardware optimization tools (see the sketch below).
Stars: ✭ 567 (+313.87%)
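As a rough sketch of what this hardware-optimization tooling looks like in practice, here is ONNX Runtime inference through optimum; the exact export flag has changed name across optimum releases, so treat the arguments as version-dependent:

```python
from transformers import AutoTokenizer, pipeline
from optimum.onnxruntime import ORTModelForSequenceClassification

name = "distilbert-base-uncased-finetuned-sst-2-english"
# Export the PyTorch checkpoint to ONNX and run it with ONNX Runtime.
# (Older optimum releases used from_transformers=True instead of export=True.)
model = ORTModelForSequenceClassification.from_pretrained(name, export=True)
tokenizer = AutoTokenizer.from_pretrained(name)
clf = pipeline("text-classification", model=model, tokenizer=tokenizer)
print(clf("Hardware-accelerated inference with the same pipeline API."))
```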
SelfAttentive - Implementation of "A Structured Self-attentive Sentence Embedding" (see the sketch below).
Stars: ✭ 107 (-21.9%)
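The paper's core idea fits in a few lines of PyTorch: attention weights A = softmax(W_s2 tanh(W_s1 Hᵀ)) over the LSTM states H yield a fixed-size sentence embedding matrix M = AH. An illustrative sketch, not this repo's code (dimension names d_a and r follow the paper):

```python
import torch
import torch.nn as nn

class StructuredSelfAttention(nn.Module):
    """Sketch of Lin et al. (2017): M = A @ H with
    A = softmax(W_s2 tanh(W_s1 H^T)), giving r weighted views of H."""
    def __init__(self, hidden_dim: int, d_a: int = 64, r: int = 8):
        super().__init__()
        self.w_s1 = nn.Linear(hidden_dim, d_a, bias=False)
        self.w_s2 = nn.Linear(d_a, r, bias=False)

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        # h: (batch, seq_len, hidden_dim) LSTM hidden states;
        # softmax over seq_len so each of the r heads sums to 1 over tokens.
        a = torch.softmax(self.w_s2(torch.tanh(self.w_s1(h))), dim=1)
        return torch.einsum("btr,bth->brh", a, h)  # (batch, r, hidden_dim)

m = StructuredSelfAttention(hidden_dim=256)(torch.randn(4, 20, 256))
```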
iPerceive - Applying Common-Sense Reasoning to Multi-Modal Dense Video Captioning and Video Question Answering | Python3 | PyTorch | CNNs | Causality | Reasoning | LSTMs | Transformers | Multi-Head Self Attention | Published in the IEEE Winter Conference on Applications of Computer Vision (WACV) 2021
Stars: ✭ 52 (-62.04%)
vista-net - Code for the paper "VistaNet: Visual Aspect Attention Network for Multimodal Sentiment Analysis", AAAI'19
Stars: ✭ 67 (-51.09%)
SANET - Arbitrary Style Transfer with Style-Attentional Networks
Stars: ✭ 105 (-23.36%)
halonet-pytorch - Implementation of the 😇 attention layer from the paper "Scaling Local Self-Attention for Parameter Efficient Visual Backbones"
Stars: ✭ 181 (+32.12%)
TermiNetwork - 🌏 A zero-dependency networking solution for building modern and secure iOS, watchOS, macOS and tvOS applications.
Stars: ✭ 80 (-41.61%)
TransCenter - Official implementation of TransCenter; code and pretrained models are available at https://gitlab.inria.fr/yixu/TransCenter_official.
Stars: ✭ 82 (-40.15%)
PAM - [TPAMI 2020] Parallax Attention for Unsupervised Stereo Correspondence Learning
Stars: ✭ 62 (-54.74%)
pynmt - A simple and complete PyTorch implementation of a neural machine translation system
Stars: ✭ 13 (-90.51%)
stylegan-v - [CVPR 2022] StyleGAN-V: A Continuous Video Generator with the Price, Image Quality and Perks of StyleGAN2
Stars: ✭ 136 (-0.73%)
spark-transformers - Library for exporting Apache Spark MLlib models for use in any Java application with no other dependencies.
Stars: ✭ 39 (-71.53%)
OpenDialog - An open-source package for Chinese open-domain conversational chatbots (a Chinese chit-chat dialogue system with one-click deployment of a WeChat chatbot)
Stars: ✭ 94 (-31.39%)
deepfrog - An NLP suite powered by deep learning
Stars: ✭ 16 (-88.32%)
pytorch-vit - "An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale" (patch-embedding sketch below).
Stars: ✭ 250 (+82.48%)
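The title says it all: the image is cut into 16x16 patches and each patch becomes a token; everything downstream is a plain transformer encoder. A minimal sketch of that patch-embedding step, using the standard strided-convolution trick:

```python
import torch
import torch.nn as nn

# Patch embedding for ViT: a 16x16 convolution with stride 16 turns a
# 224x224 image into 14*14 = 196 patch tokens of width 768.
patch_embed = nn.Conv2d(in_channels=3, out_channels=768, kernel_size=16, stride=16)
img = torch.randn(1, 3, 224, 224)
tokens = patch_embed(img).flatten(2).transpose(1, 2)  # shape: (1, 196, 768)
```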
BangalASR - Transformer-based Bangla speech recognition
Stars: ✭ 20 (-85.4%)
domain-attention - Code for the paper "Domain Attention Model for Multi-Domain Sentiment Classification"
Stars: ✭ 22 (-83.94%)
abcnn pytorch - Implementation of ABCNN (Attention-Based Convolutional Neural Network) in PyTorch
Stars: ✭ 35 (-74.45%)
C-Tran - General Multi-label Image Classification with Transformers
Stars: ✭ 106 (-22.63%)
converse - Conversational text analysis using various NLP techniques
Stars: ✭ 147 (+7.3%)
SentimentAnalysis - Sentiment analysis with a deep Bi-LSTM + attention model
Stars: ✭ 32 (-76.64%)
simple transformers - Simple transformer implementations that I can understand
Stars: ✭ 18 (-86.86%)
modules - Official repository for the paper "Are Neural Nets Modular? Inspecting Functional Modularity Through Differentiable Weight Masks", which develops a method for analyzing emergent functional modularity in neural networks based on differentiable weight masks and uses it to point out important issues in current-day neural networks.
Stars: ✭ 25 (-81.75%)
FragmentVC - Any-to-any voice conversion by end-to-end extraction and fusion of fine-grained voice fragments with attention
Stars: ✭ 134 (-2.19%)
extkeras - A playground for implementing custom layers and other components compatible with Keras, with the aim of learning the framework better and perhaps eventually offering some utilities to others.
Stars: ✭ 18 (-86.86%)
text2text - Text2Text: Cross-lingual natural language processing and generation toolkit
Stars: ✭ 188 (+37.23%)
enformer-pytorch - Implementation of Enformer, DeepMind's attention network for predicting gene expression, in PyTorch
Stars: ✭ 146 (+6.57%)
hamnet - PyTorch implementation of the AAAI 2021 paper "A Hybrid Attention Mechanism for Weakly-Supervised Temporal Action Localization"
Stars: ✭ 30 (-78.1%)
HugsVision - An easy-to-use Hugging Face wrapper for state-of-the-art computer vision
Stars: ✭ 154 (+12.41%)
datastories-semeval2017-task6 - Deep-learning model presented in "DataStories at SemEval-2017 Task 6: Siamese LSTM with Attention for Humorous Text Comparison".
Stars: ✭ 20 (-85.4%)
text2keywords - Trained T5 and T5-large models for generating keywords from text (see the sketch below).
Stars: ✭ 53 (-61.31%)
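Keyword generation with T5 is ordinary seq2seq decoding. This sketch uses the stock t5-base weights as a stand-in for the repo's fine-tuned checkpoints, whose exact names and input prefix are not shown here:

```python
from transformers import T5ForConditionalGeneration, T5Tokenizer

# Stock "t5-base" stands in for the repo's fine-tuned keyword models.
tokenizer = T5Tokenizer.from_pretrained("t5-base")
model = T5ForConditionalGeneration.from_pretrained("t5-base")

inputs = tokenizer("summarize: Transformers map text to text.", return_tensors="pt")
out = model.generate(**inputs, max_new_tokens=16)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```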
xpandas - Universal 1D/2D data containers with Transformers functionality for data analysis.
Stars: ✭ 25 (-81.75%)
NTUA-slp-nlp - 💻 Speech and Natural Language Processing (SLP & NLP) lab assignments for ECE NTUA
Stars: ✭ 19 (-86.13%)
WellcomeML - Repository for machine learning utilities at the Wellcome Trust
Stars: ✭ 31 (-77.37%)
bert-squeeze - 🛠️ Tools for Transformers compression using PyTorch Lightning ⚡ (see the sketch below).
Stars: ✭ 56 (-59.12%)
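For a taste of what transformer compression involves (independent of this repo's own Lightning-based API, which is not shown here), plain PyTorch dynamic quantization already shrinks a BERT-style model's linear layers to int8:

```python
import torch
from transformers import AutoModelForSequenceClassification

# Generic compression example, not bert-squeeze's API: dynamically
# quantize all nn.Linear weights of a DistilBERT classifier to int8.
model = AutoModelForSequenceClassification.from_pretrained("distilbert-base-uncased")
quantized = torch.quantization.quantize_dynamic(model, {torch.nn.Linear}, dtype=torch.qint8)
```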
fusion gan - Code for the paper "Learning to Fuse Music Genres with Generative Adversarial Dual Learning", ICDM 2017
Stars: ✭ 18 (-86.86%)
text2class - Multi-class text categorization using state-of-the-art pre-trained contextualized language models, e.g. BERT
Stars: ✭ 15 (-89.05%)
eve-bot - EVE bot, a customer-service chatbot to enhance virtual engagement for Twitter Apple Support
Stars: ✭ 31 (-77.37%)
lightning-transformers - Flexible components pairing 🤗 Transformers with PyTorch Lightning
Stars: ✭ 551 (+302.19%)
dodrio - Exploring attention weights in transformer-based models with linguistic knowledge.
Stars: ✭ 233 (+70.07%)
GAN-Project-2018 - A GAN in TensorFlow, run from the Linux command line
Stars: ✭ 21 (-84.67%)
Patient2Vec - "Patient2Vec: A Personalized Interpretable Deep Representation of the Longitudinal Electronic Health Record"
Stars: ✭ 85 (-37.96%)