ALBERT-Pytorch: PyTorch implementation of ALBERT (A Lite BERT for Self-supervised Learning of Language Representations)
Stars: ✭ 214 (+67.19%)
LMMS: Language Modelling Makes Sense - WSD (and more) with Contextual Embeddings
Stars: ✭ 79 (-38.28%)
transformer-slt: Sign Language Translation with Transformers (COLING'2020, ECCV'20 SLRTP Workshop)
Stars: ✭ 92 (-28.12%)
M3DETR: Code base for M3DeTR: Multi-representation, Multi-scale, Mutual-relation 3D Object Detection with Transformers
Stars: ✭ 47 (-63.28%)
visualization: A collection of visualization functions
Stars: ✭ 189 (+47.66%)
KLUE: 📖 Korean NLU Benchmark
Stars: ✭ 420 (+228.13%)
TadTR: End-to-end Temporal Action Detection with Transformer. [Under review for a journal publication]
Stars: ✭ 55 (-57.03%)
DeepNER: An easy-to-use, modular, and extensible package of deep-learning-based Named Entity Recognition models.
Stars: ✭ 9 (-92.97%)
muse-as-service: REST API for sentence tokenization and embedding using the Multilingual Universal Sentence Encoder.
Stars: ✭ 45 (-64.84%)
Embedding: A summary of embedding model code and study notes
Stars: ✭ 25 (-80.47%)
FNet-pytorch: Unofficial implementation of Google's FNet: Mixing Tokens with Fourier Transforms
Stars: ✭ 204 (+59.38%)
classy: A simple-to-use library for building high-performance machine learning models in NLP.
Stars: ✭ 61 (-52.34%)
max-deeplab: Unofficial implementation of MaX-DeepLab for instance segmentation
Stars: ✭ 84 (-34.37%)
KAREN: Unifying Hatespeech Detection and Benchmarking
Stars: ✭ 18 (-85.94%)
Walk-Transformer: From Random Walks to Transformer for Learning Node Embeddings (ECML-PKDD 2020) (in PyTorch and TensorFlow)
Stars: ✭ 26 (-79.69%)
towhee: A framework dedicated to making neural data processing pipelines simple and fast.
Stars: ✭ 821 (+541.41%)
mcQA: 🔮 Answering multiple-choice questions with language models.
Stars: ✭ 23 (-82.03%)
bern: A neural named entity recognition and multi-type normalization tool for biomedical text mining
Stars: ✭ 151 (+17.97%)
image-classification: A collection of SOTA image classification models in PyTorch
Stars: ✭ 70 (-45.31%)
TorchBlocks: A PyTorch-based toolkit for natural language processing
Stars: ✭ 85 (-33.59%)
WSDM-Cup-2019: [ACM-WSDM] 3rd-place solution at WSDM Cup 2019, Fake News Classification on Kaggle.
Stars: ✭ 62 (-51.56%)
parsbert-ner: 🤗 ParsBERT Persian NER tasks
Stars: ✭ 15 (-88.28%)
ai web RISKOUT BTS: National defense risk management platform (🏅 Minister of National Defense Award)
Stars: ✭ 18 (-85.94%)
transform-graphql: ⚙️ Transformer function to transform GraphQL directives, e.g. to create a model CRUD directive.
Stars: ✭ 23 (-82.03%)
text2text: Cross-lingual natural language processing and generation toolkit
Stars: ✭ 188 (+46.88%)
deep-molecular-optimization: Molecular optimization by capturing chemist's intuition using Seq2Seq with attention and the Transformer
Stars: ✭ 60 (-53.12%)
YOLOv5-Lite: 🍅🍅🍅 Lighter, faster, and easier to deploy. Evolved from YOLOv5; the model is only 930+ KB (int8) or 1.7 MB (fp16), and it can reach 10+ FPS on a Raspberry Pi 4B at a 320×320 input size.
Stars: ✭ 1,230 (+860.94%)
FragmentVC: Any-to-any voice conversion by end-to-end extracting and fusing fine-grained voice fragments with attention
Stars: ✭ 134 (+4.69%)
textwiser: [AAAI 2021] TextWiser: Text Featurization Library
Stars: ✭ 26 (-79.69%)
tfbert: Pretrained model invocation based on TensorFlow 1.x; supports single-machine multi-GPU training, gradient accumulation, XLA acceleration, and mixed precision, with flexible training, validation, and prediction.
Stars: ✭ 54 (-57.81%)
textgo: Text preprocessing, representation, similarity calculation, text search, and classification. Let's go and play with text!
Stars: ✭ 33 (-74.22%)
JD2Skills-BERT-XMLC: Code and dataset for Bhola et al. (2020), "Retrieving Skills from Job Descriptions: A Language Model Based Extreme Multi-label Classification Framework"
Stars: ✭ 33 (-74.22%)
TransPose: PyTorch implementation for "TransPose: Keypoint Localization via Transformer", ICCV 2021.
Stars: ✭ 250 (+95.31%)
transformer: A PyTorch implementation of "Attention Is All You Need"
Stars: ✭ 28 (-78.12%)
kosr: Korean speech recognition based on the Transformer
Stars: ✭ 25 (-80.47%)
NER-FunTool: This NER project covers multiple Chinese datasets, with models such as BiLSTM+CRF, BERT+Softmax, BERT+Cascade, and BERT+WOL; models are deployed with TF Serving for both online and offline inference.
Stars: ✭ 56 (-56.25%)
Transformer-in-Transformer: An implementation of Transformer in Transformer in TensorFlow for image classification, with attention inside local patches
Stars: ✭ 40 (-68.75%)
enformer-pytorch: Implementation of Enformer, DeepMind's attention network for predicting gene expression, in PyTorch
Stars: ✭ 146 (+14.06%)
BossNAS: (ICCV 2021) Exploring Hybrid CNN-Transformers with Block-wisely Self-supervised Neural Architecture Search
Stars: ✭ 125 (-2.34%)
OpenPrompt: An open-source framework for prompt learning.
Stars: ✭ 1,769 (+1282.03%)
TS-CAM: Code for TS-CAM: Token Semantic Coupled Attention Map for Weakly Supervised Object Localization.
Stars: ✭ 96 (-25%)
graph-transformer-pytorch: Implementation of Graph Transformer in PyTorch, for potential use in replicating AlphaFold2
Stars: ✭ 81 (-36.72%)
GTSRB Keras STN: German Traffic Sign Recognition Benchmark, Keras implementation with Spatial Transformer Networks
Stars: ✭ 48 (-62.5%)
segmenter: [ICCV 2021] Official PyTorch implementation of Segmenter: Transformer for Semantic Segmentation
Stars: ✭ 463 (+261.72%)