deep-learning: Projects include the application of transfer learning to build a convolutional neural network (CNN) that identifies the artist of a painting, the building of predictive models for Bitcoin price data using Long Short-Term Memory recurrent neural networks (LSTMs), and a tutorial explaining how to build two types of neural network using as input the…
Stars: ✭ 43 (+22.86%)
ProteinLM: Protein Language Model
Stars: ✭ 76 (+117.14%)
SEFR CUT: Domain Adaptation of Thai Word Segmentation Models using Stacked Ensemble (EMNLP 2020)
Stars: ✭ 18 (-48.57%)
MoeFlow: Repository for an anime character recognition website, powered by TensorFlow
Stars: ✭ 113 (+222.86%)
sister: SImple SenTence EmbeddeR
Stars: ✭ 66 (+88.57%)
NLP-paper: 🎨 Natural language processing (NLP) tutorial 🎨 https://dataxujing.github.io/NLP-paper/
Stars: ✭ 23 (-34.29%)
question generator: An NLP system for generating reading comprehension questions
Stars: ✭ 188 (+437.14%)
AB distillation: Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons (AAAI 2019)
Stars: ✭ 105 (+200%)
neural-ranking-kd: Improving Efficient Neural Ranking Models with Cross-Architecture Knowledge Distillation
Stars: ✭ 74 (+111.43%)
NAG-BERT: [EACL'21] Non-Autoregressive Text Generation with Pre-trained Language Models
Stars: ✭ 47 (+34.29%)
neuro-evolution: A project on improving neural network performance using genetic algorithms.
Stars: ✭ 25 (-28.57%)
paper annotations: A place to keep track of all the annotated papers.
Stars: ✭ 96 (+174.29%)
nlp workshop odsc europe20: Extensive tutorials for the Advanced NLP Workshop at the Open Data Science Conference Europe 2020. We leverage machine learning, deep learning, and deep transfer learning to learn and solve popular NLP tasks including NER, classification, recommendation / information retrieval, summarization, language translation, Q&A, and t…
Stars: ✭ 127 (+262.86%)
FasterTransformer: Transformer-related optimizations, including BERT and GPT
Stars: ✭ 1,571 (+4388.57%)
NLPDataAugmentation: Chinese NLP data augmentation, BERT contextual augmentation
Stars: ✭ 94 (+168.57%)
BERT-QE: Code and resources for the paper "BERT-QE: Contextualized Query Expansion for Document Re-ranking".
Stars: ✭ 43 (+22.86%)
korpatbert: KorPatBERT, a Korean AI language model specialized for the patent domain
Stars: ✭ 48 (+37.14%)
sparsezoo: Neural network model repository for highly sparse and sparse-quantized models with matching sparsification recipes
Stars: ✭ 264 (+654.29%)
TransTQA: Transfer Learning for Technical Question Answering. Author: Wenhao Yu ([email protected]). EMNLP'20.
Stars: ✭ 12 (-65.71%)
DE-LIMIT: DeEpLearning models for MultIlingual haTespeech (DELIMIT): Benchmarking multilingual models across 9 languages and 16 datasets.
Stars: ✭ 90 (+157.14%)
cups-rl: Customisable Unified Physical Simulations (CUPS) for Reinforcement Learning. Experiments run on the ai2thor environment (http://ai2thor.allenai.org/), e.g. using A3C, RainbowDQN, and A3C_GA (Gated Attention multi-modal fusion) for Task-Oriented Language Grounding (tasks specified by natural language instructions), e.g. "Pick up the Cup or else"
Stars: ✭ 38 (+8.57%)
TabFormer: Code & data for "Tabular Transformers for Modeling Multivariate Time Series" (ICASSP 2021)
Stars: ✭ 209 (+497.14%)
KitanaQA: Adversarial training and data augmentation for neural question-answering models
Stars: ✭ 58 (+65.71%)
sign2text: Real-time AI-powered translation of American Sign Language to text
Stars: ✭ 132 (+277.14%)
rasa milktea chatbot: Chatbot using a Chinese BERT model, based on the Rasa framework (a Chinese chatbot combining BERT intent analysis, built on Rasa)
Stars: ✭ 97 (+177.14%)
consistency: Implementation of models from our EMNLP 2019 paper "A Logic-Driven Framework for Consistency of Neural Models"
Stars: ✭ 26 (-25.71%)
ulm-basenet: Implementation of the ULMFiT algorithm for text classification via transfer learning
Stars: ✭ 94 (+168.57%)
image-background-remove-tool: ✂️ Automated high-quality image background removal framework using neural networks ✂️
Stars: ✭ 767 (+2091.43%)
favorite-research-papers: Listing my favorite research papers 📝 from different fields as I read them.
Stars: ✭ 12 (-65.71%)
contextualLSTM: Contextual LSTM for NLP tasks like word prediction and word embedding creation for deep learning
Stars: ✭ 28 (-20%)
bert attn viz: Visualize BERT's self-attention layers on text classification tasks
Stars: ✭ 41 (+17.14%)
LAMB Optimizer TF: LAMB Optimizer for Large Batch Training (TensorFlow version)
Stars: ✭ 119 (+240%)
trove: Weakly supervised medical named entity classification
Stars: ✭ 55 (+57.14%)
NeuralNetworks: Implementation of a neural network that detects whether a video is in-game or not
Stars: ✭ 64 (+82.86%)
TransferSeg: Unseen Object Segmentation in Videos via Transferable Representations, ACCV 2018 (oral)
Stars: ✭ 25 (-28.57%)
Context-Transformer: Tackling Object Confusion for Few-Shot Detection, AAAI 2020
Stars: ✭ 89 (+154.29%)
Fill-the-GAP: [ACL-WS] 4th-place solution to the gendered pronoun resolution challenge on Kaggle
Stars: ✭ 13 (-62.86%)
roberta-wwm-base-distill: A RoBERTa-wwm-base model distilled from RoBERTa-wwm-large
Stars: ✭ 61 (+74.29%)
ganbert: Enhancing BERT training with semi-supervised Generative Adversarial Networks
Stars: ✭ 205 (+485.71%)