charformer-pytorch - Implementation of the GBST block from the Charformer paper, in Pytorch
Stars: ✭ 74 (+131.25%)
Transformer-in-Transformer - An implementation of Transformer in Transformer in TensorFlow for image classification, with attention inside local patches
Stars: ✭ 40 (+25%)
trapper - State-of-the-art NLP through transformer models, with a modular design and consistent APIs.
Stars: ✭ 28 (-12.5%)
galerkin-transformer - [NeurIPS 2021] Galerkin Transformer: linear attention without softmax
Stars: ✭ 111 (+246.88%)
tutel - Tutel MoE: An Optimized Mixture-of-Experts Implementation
Stars: ✭ 183 (+471.88%)
linformer - Implementation of Linformer for Pytorch
Stars: ✭ 119 (+271.88%)
mcQA - 🔮 Answering multiple choice questions with Language Models.
Stars: ✭ 23 (-28.12%)
KAREN - Unifying Hate Speech Detection and Benchmarking
Stars: ✭ 18 (-43.75%)
T3 - [EMNLP 2020] "T3: Tree-Autoencoder Constrained Adversarial Text Generation for Targeted Attack" by Boxin Wang, Hengzhi Pei, Boyuan Pan, Qian Chen, Shuohang Wang, Bo Li
Stars: ✭ 25 (-21.87%)
bern - A neural named entity recognition and multi-type normalization tool for biomedical text mining
Stars: ✭ 151 (+371.88%)
KLUE - 📖 Korean NLU Benchmark
Stars: ✭ 420 (+1212.5%)
WSDM-Cup-2019 - [ACM-WSDM] 3rd place solution at WSDM Cup 2019, Fake News Classification on Kaggle.
Stars: ✭ 62 (+93.75%)
kwx - BERT-, LDA-, and TF-IDF-based keyword extraction in Python
Stars: ✭ 33 (+3.13%)
robo-vln - Pytorch code for the ICRA'21 paper "Hierarchical Cross-Modal Agent for Robotics Vision-and-Language Navigation"
Stars: ✭ 34 (+6.25%)
FinBERT - A Pretrained BERT Model for Financial Communications. https://arxiv.org/abs/2006.08097
Stars: ✭ 193 (+503.13%)
Restormer - [CVPR 2022, Oral] Restormer: Efficient Transformer for High-Resolution Image Restoration. SOTA for motion deblurring, image deraining, denoising (Gaussian/real data), and defocus deblurring.
Stars: ✭ 586 (+1731.25%)
TextPruner - A PyTorch-based model pruning toolkit for pre-trained language models
Stars: ✭ 94 (+193.75%)
Learning-Lab-C-Library - A library providing a set of basic functions for different types of deep learning (and other) algorithms in C. The library is constantly updated.
Stars: ✭ 20 (-37.5%)
cdQA-ui - ⛔ [NOT MAINTAINED] A web interface for cdQA and other question answering systems.
Stars: ✭ 19 (-40.62%)
Walk-Transformer - From Random Walks to Transformer for Learning Node Embeddings (ECML-PKDD 2020), in Pytorch and Tensorflow
Stars: ✭ 26 (-18.75%)
policy-data-analyzer - Building a model to recognize incentives for landscape restoration in environmental policies from Latin America, the US and India. Bringing NLP to the world of policy analysis through an extensible framework that includes scraping, preprocessing, active learning and text analysis pipelines.
Stars: ✭ 22 (-31.25%)
FragmentVC - Any-to-any voice conversion by end-to-end extracting and fusing fine-grained voice fragments with attention
Stars: ✭ 134 (+318.75%)
FewCLUE - A few-shot learning evaluation benchmark for Chinese
Stars: ✭ 251 (+684.38%)
uformer-pytorch - Implementation of Uformer, an attention-based U-Net, in Pytorch
Stars: ✭ 54 (+68.75%)
PIE - Fast + Non-Autoregressive Grammatical Error Correction using BERT. Code and pre-trained models for the paper "Parallel Iterative Edit Models for Local Sequence Transduction" (EMNLP-IJCNLP 2019): www.aclweb.org/anthology/D19-1435.pdf
Stars: ✭ 164 (+412.5%)
KoBERT-nsmc - Naver movie review sentiment classification with KoBERT
Stars: ✭ 57 (+78.13%)
kosr - Korean speech recognition based on transformer
Stars: ✭ 25 (-21.87%)
enformer-pytorch - Implementation of Enformer, DeepMind's attention network for predicting gene expression, in Pytorch
Stars: ✭ 146 (+356.25%)
SOLQ"SOLQ: Segmenting Objects by Learning Queries", SOLQ is an end-to-end instance segmentation framework with Transformer.
Stars: ✭ 159 (+396.88%)
TS-CAM - Code for TS-CAM: Token Semantic Coupled Attention Map for Weakly Supervised Object Localization.
Stars: ✭ 96 (+200%)
RSTNet - Captioning with Adaptive Attention on Visual and Non-Visual Words (CVPR 2021)
Stars: ✭ 71 (+121.88%)
VideoBERT - Using VideoBERT to tackle video prediction
Stars: ✭ 56 (+75%)
text2keywords - Trained T5 and T5-large models for creating keywords from text
Stars: ✭ 53 (+65.63%)
DocProduct - Medical Q&A with Deep Language Models
Stars: ✭ 527 (+1546.88%)
OpenDialog - An open-source package for Chinese open-domain conversational chatbots (Chinese chit-chat dialogue system with one-click deployment as a WeChat chatbot)
Stars: ✭ 94 (+193.75%)
deformer - [ACL 2020] DeFormer: Decomposing Pre-trained Transformers for Faster Question Answering
Stars: ✭ 111 (+246.88%)
knowledge-graph-nlp-in-action - Hands-on knowledge graph and NLP, from model training to deployment. Uses Tensorflow, BERT+Bi-LSTM+CRF, Neo4j, etc., covering tasks such as named entity recognition, text classification, information extraction, and relation extraction.
Stars: ✭ 58 (+81.25%)
classy - A simple-to-use library for building high-performance Machine Learning models in NLP.
Stars: ✭ 61 (+90.63%)
CSV2RDF - Streaming, transforming, SPARQL-based CSV to RDF converter. Apache license.
Stars: ✭ 48 (+50%)
TextPair - Text pair relation comparison: semantic similarity, surface similarity, textual entailment, etc.
Stars: ✭ 44 (+37.5%)
MinTL - Minimalist Transfer Learning for Task-Oriented Dialogue Systems
Stars: ✭ 61 (+90.63%)
ParsBigBird - Persian BERT for Long-Range Sequences
Stars: ✭ 58 (+81.25%)
ADL2019 - Applied Deep Learning (2019 Spring) @ NTU
Stars: ✭ 20 (-37.5%)
text2text - Text2Text: Cross-lingual natural language processing and generation toolkit
Stars: ✭ 188 (+487.5%)
HRFormer - Official implementation of the NeurIPS 2021 paper "HRFormer: High-Resolution Transformer for Dense Prediction".
Stars: ✭ 357 (+1015.63%)
ark-nlp - An NLP coding package that quickly implements SOTA solutions.
Stars: ✭ 232 (+625%)
NER-FunTool - This NER project includes several Chinese datasets; models use BiLSTM+CRF, BERT+Softmax, BERT+Cascade, BERT+WOL, etc., with TFServing for model deployment and both online and offline inference.
Stars: ✭ 56 (+75%)
LaTeX-OCR - pix2tex: Using a ViT to convert images of equations into LaTeX code.
Stars: ✭ 1,566 (+4793.75%)
Swin-Transformer-Tensorflow - Unofficial implementation of "Swin Transformer: Hierarchical Vision Transformer using Shifted Windows" (https://arxiv.org/abs/2103.14030)
Stars: ✭ 45 (+40.63%)
iamQA - Chinese Wikipedia QA and reading-comprehension question answering system, using an NER model trained on CCKS2016 data and a reading comprehension model from CMRC2018, plus W2V word-vector search, deployed with torchserve
Stars: ✭ 46 (+43.75%)