The Story Of Heads - Code for the ACL 2019 paper "Analyzing Multi-Head Self-Attention: Specialized Heads Do the Heavy Lifting, the Rest Can Be Pruned" and the paper "Analyzing Source and Target Contributions to NMT Predictions".
Stars: ✭ 146 (-40.41%)
ViTs-vs-CNNs - [NeurIPS 2021] Are Transformers More Robust Than CNNs? (PyTorch implementation & checkpoints)
Stars: ✭ 145 (-40.82%)
R-MeN - Transformer-based Memory Networks for Knowledge Graph Embeddings (ACL 2020) (PyTorch and TensorFlow)
Stars: ✭ 74 (-69.8%)
Symfony Jsonapi - JSON API Transformer Bundle for Symfony 2 and Symfony 3
Stars: ✭ 114 (-53.47%)
udacity-cvnd-projects - My solutions to the projects assigned for the Udacity Computer Vision Nanodegree
Stars: ✭ 36 (-85.31%)
Overlappredator - [CVPR 2021, Oral] PREDATOR: Registration of 3D Point Clouds with Low Overlap
Stars: ✭ 106 (-56.73%)
wisdomify - A BERT-based reverse dictionary of Korean proverbs
Stars: ✭ 95 (-61.22%)
Kiss - Code for the paper "KISS: Keeping It Simple for Scene Text Recognition"
Stars: ✭ 108 (-55.92%)
Word-Level-Eng-Mar-NMT - Translating English sentences to Marathi using neural machine translation
Stars: ✭ 37 (-84.9%)
awesome-transformer-search - A curated list of resources combining Transformers with Neural Architecture Search
Stars: ✭ 194 (-20.82%)
transformer - Build an English-Vietnamese machine translation system with the ProtonX Transformer. :D
Stars: ✭ 41 (-83.27%)
avsr-tf1 - Audio-visual speech recognition using sequence-to-sequence models
Stars: ✭ 76 (-68.98%)
Esbuild Jest - A Jest transformer using esbuild
Stars: ✭ 100 (-59.18%)
tensorflow-chatbot-chinese - Web chatbot | TensorFlow implementation of a seq2seq model with Bahdanau attention and pretrained Word2Vec embeddings
Stars: ✭ 50 (-79.59%)
question generator - An NLP system for generating reading comprehension questions
Stars: ✭ 188 (-23.27%)
Njunmt Tf - An open-source neural machine translation system developed by the Natural Language Processing Group at Nanjing University
Stars: ✭ 97 (-60.41%)
Laravel Oh Generators - Extends the core file generators included with Laravel 5 and later
Stars: ✭ 96 (-60.82%)
imdb-transformer - A simple neural network for sentiment analysis that embeds sentences using a Transformer network
Stars: ✭ 26 (-89.39%)
Eqtransformer - EQTransformer, a Python package for earthquake signal detection and phase picking using AI
Stars: ✭ 95 (-61.22%)
DiscEval - Discourse-Based Evaluation of Language Understanding
Stars: ✭ 18 (-92.65%)
DrFAQ - A plug-and-play question-answering NLP chatbot that can be applied to any organisation's text corpora
Stars: ✭ 29 (-88.16%)
Smiles Transformer - Original implementation of the paper "SMILES Transformer: Pre-trained Molecular Fingerprint for Low Data Drug Discovery" by Shion Honda et al.
Stars: ✭ 86 (-64.9%)
bytekit - A Java library for byte manipulation (not a bytecode manipulation library)
Stars: ✭ 40 (-83.67%)
Pytorch Openai Transformer Lm - 🐥 A PyTorch implementation of OpenAI's finetuned transformer language model, with a script to import the weights pre-trained by OpenAI
Stars: ✭ 1,268 (+417.55%)
catacomb - The simplest machine learning library for launching UIs, running evaluations, and comparing model performance
Stars: ✭ 13 (-94.69%)
Gpt2 Chitchat - A GPT-2 model for Chinese chitchat (implements the MMI idea from DialoGPT)
Stars: ✭ 1,230 (+402.04%)
VideoTransformer-pytorch - PyTorch implementation of a collection of scalable Video Transformer benchmarks
Stars: ✭ 159 (-35.1%)
npo classifier - Automated coding using machine learning and remapping the U.S. nonprofit sector: a guide and benchmark
Stars: ✭ 18 (-92.65%)
KB-ALBERT - A Korean ALBERT model specialized for the economics/finance domain, provided by KB Kookmin Bank
Stars: ✭ 215 (-12.24%)
Dialogpt - Large-scale pretraining for dialogue
Stars: ✭ 1,177 (+380.41%)
ExpBERT - Code for the ACL 2020 paper "Representation Engineering with Natural Language Explanations"
Stars: ✭ 28 (-88.57%)
Presento - Transformer & Presenter package for PHP
Stars: ✭ 71 (-71.02%)
TextSumma - A reimplementation of "Neural Summarization by Extracting Sentences and Words"
Stars: ✭ 16 (-93.47%)
Mixture Of Experts - A PyTorch implementation of Sparsely-Gated Mixture of Experts, for massively increasing the parameter count of language models
Stars: ✭ 68 (-72.24%)
SkillNER - A (smart) rule-based NLP module to extract job skills from text
Stars: ✭ 69 (-71.84%)
nemesyst - A generalised and highly customisable hybrid-parallelism, database-based deep learning framework
Stars: ✭ 17 (-93.06%)
catr - Image captioning using Transformer
Stars: ✭ 206 (-15.92%)
Transformer Dynet - An implementation of the Transformer ("Attention Is All You Need") in DyNet
Stars: ✭ 57 (-76.73%)
MASTER-pytorch - Code for the paper "MASTER: Multi-Aspect Non-local Network for Scene Text Recognition" (Pattern Recognition 2021)
Stars: ✭ 263 (+7.35%)
transformer-ls - Official PyTorch implementation of Long-Short Transformer (NeurIPS 2021)
Stars: ✭ 201 (-17.96%)
Selected Stories - An experimental web text editor that runs an LSTM model while you write to suggest new lines
Stars: ✭ 39 (-84.08%)
molminer - A Python library and command-line tool for extracting compounds from scientific literature
Stars: ✭ 38 (-84.49%)
jse - TF2 Jump Server Essentials
Stars: ✭ 16 (-93.47%)
danifojo-2018-repeatrnn - Comparing Fixed and Adaptive Computation Time for Recurrent Neural Networks
Stars: ✭ 32 (-86.94%)
KoBERT-NER - NER task with KoBERT (using the Naver NLP Challenge dataset)
Stars: ✭ 76 (-68.98%)
AiSpace - Better practices for deep learning model development and deployment, for TensorFlow 2.0
Stars: ✭ 28 (-88.57%)