Hardware Aware Transformers: [ACL 2020] HAT: Hardware-Aware Transformers for Efficient Natural Language Processing
Stars: ✭ 206 (+123.91%)
Nlp Interview Notes: Study notes and materials for natural language processing (NLP) interview preparation, compiled by the authors from personal interview experience; currently a collection of interview questions covering the various subfields of NLP.
Stars: ✭ 207 (+125%)
MusicTransformer-Pytorch: MusicTransformer written for MaestroV2 using the PyTorch framework for music generation
Stars: ✭ 106 (+15.22%)
Hrnet Semantic Segmentation: The OCR approach is rephrased as Segmentation Transformer (https://arxiv.org/abs/1909.11065). This is an official implementation of semantic segmentation for HRNet (https://arxiv.org/abs/1908.07919).
Stars: ✭ 2,369 (+2475%)
Linear Attention Transformer: Transformer based on a variant of attention with linear complexity with respect to sequence length (see the sketch after this list)
Stars: ✭ 205 (+122.83%)
Ghostnet: CV backbones including GhostNet, TinyNet and TNT, developed by Huawei Noah's Ark Lab.
Stars: ✭ 1,744 (+1795.65%)
ClusterTransformer: Topic clustering library built on Transformer embeddings and cosine similarity metrics. Compatible with all BERT-base transformers from Hugging Face (a toy sketch of the approach appears after this list).
Stars: ✭ 36 (-60.87%)
bert in a flask: A dockerized Flask API serving ALBERT and BERT predictions using TensorFlow 2.0.
Stars: ✭ 32 (-65.22%)
Lumen Api Starter: An API starter project built on Lumen 8, with a carefully designed directory structure, a standardized and consistent response data format, and best practices for a Repository-pattern architecture.
Stars: ✭ 197 (+114.13%)
svgs2fonts: npm package svgs2fonts. Converts SVG icons into icon-font libraries (svgs -> svg, ttf, eot, woff, woff2), built with Node.js.
Stars: ✭ 29 (-68.48%)
Swin-Transformer-Tensorflow: Unofficial implementation of "Swin Transformer: Hierarchical Vision Transformer using Shifted Windows" (https://arxiv.org/abs/2103.14030)
Stars: ✭ 45 (-51.09%)
Graphtransformer: Graph Transformer Architecture. Source code for "A Generalization of Transformer Networks to Graphs", DLG-AAAI'21.
Stars: ✭ 187 (+103.26%)
galerkin-transformer: [NeurIPS 2021] Galerkin Transformer: linear attention without softmax
Stars: ✭ 111 (+20.65%)
TextPruner: A PyTorch-based model pruning toolkit for pre-trained language models
Stars: ✭ 94 (+2.17%)
Graph Transformer: Transformer for Graph Classification (PyTorch and TensorFlow)
Stars: ✭ 191 (+107.61%)
charformer-pytorch: Implementation of the GBST block from the Charformer paper, in PyTorch
Stars: ✭ 74 (-19.57%)
En-transformer: Implementation of E(n)-Transformer, which extends the ideas of Welling's E(n)-Equivariant Graph Neural Network to attention
Stars: ✭ 131 (+42.39%)
Filipino-Text-Benchmarks: Open-source benchmark datasets and pretrained transformer models in the Filipino language.
Stars: ✭ 22 (-76.09%)
trapper: State-of-the-art NLP through transformer models, with a modular design and consistent APIs.
Stars: ✭ 28 (-69.57%)
TransPose: PyTorch implementation for "TransPose: Keypoint localization via Transformer", ICCV 2021.
Stars: ✭ 250 (+171.74%)
semantic-document-relations: Implementation, trained models and result data for the paper "Pairwise Multi-Class Document Classification for Semantic Relations between Wikipedia Articles"
Stars: ✭ 21 (-77.17%)
svelte-jest: Jest Svelte component transformer
Stars: ✭ 37 (-59.78%)
kosr: Korean speech recognition based on the Transformer
Stars: ✭ 25 (-72.83%)
Eeg Dl: A deep learning library for EEG signal classification tasks, based on TensorFlow.
Stars: ✭ 165 (+79.35%)
AdaSpeech: Adaptive Text to Speech for Custom Voice
Stars: ✭ 108 (+17.39%)
Medical Transformer: PyTorch code for "Medical Transformer: Gated Axial-Attention for Medical Image Segmentation"
Stars: ✭ 153 (+66.3%)
deep-molecular-optimization: Molecular optimization by capturing chemists' intuition, using Seq2Seq with attention and the Transformer
Stars: ✭ 60 (-34.78%)
transformer: A simple TensorFlow implementation of the Transformer
Stars: ✭ 25 (-72.83%)
segmenter: [ICCV 2021] Official PyTorch implementation of Segmenter: Transformer for Semantic Segmentation
Stars: ✭ 463 (+403.26%)
XPersona: Evaluating Multilingual Personalized Chatbot
Stars: ✭ 54 (-41.3%)
Image-Caption: Using an LSTM or a Transformer for image captioning in PyTorch
Stars: ✭ 36 (-60.87%)
M3DETR: Code base for M3DeTR: Multi-representation, Multi-scale, Mutual-relation 3D Object Detection with Transformers
Stars: ✭ 47 (-48.91%)
imdb-transformer: A simple neural network for sentiment analysis that embeds sentences using a Transformer network.
Stars: ✭ 26 (-71.74%)
The Story Of Heads: Code for the ACL 2019 paper "Analyzing Multi-Head Self-Attention: Specialized Heads Do the Heavy Lifting, the Rest Can Be Pruned" and the paper "Analyzing Source and Target Contributions to NMT Predictions".
Stars: ✭ 146 (+58.7%)
Restormer: [CVPR 2022, Oral] Restormer: Efficient Transformer for High-Resolution Image Restoration. SOTA for motion deblurring, image deraining, denoising (Gaussian/real data), and defocus deblurring.
Stars: ✭ 586 (+536.96%)
Context-Transformer: Tackling Object Confusion for Few-Shot Detection, AAAI 2020
Stars: ✭ 89 (-3.26%)
Onnxt5: Summarization, translation, sentiment analysis, text generation and more at blazing speed using a T5 version implemented in ONNX.
Stars: ✭ 143 (+55.43%)
OverlapPredator: [CVPR 2021, Oral] PREDATOR: Registration of 3D Point Clouds with Low Overlap.
Stars: ✭ 293 (+218.48%)
Data-Rejuvenation: Implementation of our paper "Data Rejuvenation: Exploiting Inactive Training Examples for Neural Machine Translation" (EMNLP 2020).
Stars: ✭ 18 (-80.43%)
FNet-pytorch: Unofficial implementation of Google's FNet: Mixing Tokens with Fourier Transforms (see the Fourier-mixing sketch after this list)
Stars: ✭ 204 (+121.74%)
Transformer-ocr: Handwritten text recognition using transformers.
Stars: ✭ 92 (+0%)
YOLOS: You Only Look at One Sequence (NeurIPS 2021)
Stars: ✭ 612 (+565.22%)
h-transformer-1d: Implementation of H-Transformer-1D, Hierarchical Attention for Sequence Learning
Stars: ✭ 121 (+31.52%)
SegSwap: (CVPRW 2022) Learning Co-segmentation by Segment Swapping for Retrieval and Discovery
Stars: ✭ 46 (-50%)
Wenet: Production First and Production Ready End-to-End Speech Recognition Toolkit
Stars: ✭ 617 (+570.65%)
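
For the Linear Attention Transformer entry above: the O(N) trick is to replace softmax attention with a positive kernel feature map, so the key-value product can be contracted over the sequence before the queries are applied. A minimal PyTorch sketch, assuming the common elu(x)+1 feature map from the linear-attention literature; the function name and tensor shapes are illustrative, not the repo's actual API:

```python
import torch
import torch.nn.functional as F

def linear_attention(q, k, v):
    """O(N) attention: phi(q) @ (phi(k)^T @ v), with phi(x) = elu(x) + 1.

    q, k, v: (batch, heads, seq_len, dim). Associativity of matmul lets us
    sum over seq_len first, so cost is linear in sequence length and no
    (N x N) attention matrix is ever materialized.
    """
    q = F.elu(q) + 1  # positive feature map, stands in for softmax's exp
    k = F.elu(k) + 1
    kv = torch.einsum("bhnd,bhne->bhde", k, v)                 # (b, h, dim, dim)
    z = 1.0 / (torch.einsum("bhnd,bhd->bhn", q, k.sum(dim=2)) + 1e-6)  # normalizer
    return torch.einsum("bhnd,bhde,bhn->bhne", q, kv, z)

q = k = v = torch.randn(2, 8, 1024, 64)
out = linear_attention(q, k, v)  # (2, 8, 1024, 64)
```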
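For the ClusterTransformer entry: topic clustering over transformer embeddings reduces to comparing unit-normalized sentence vectors by cosine similarity. A toy single-pass grouping sketch, assuming embeddings pooled from any BERT-family model; `cosine_clusters` and its threshold scheme are hypothetical, not the library's API:

```python
import torch
import torch.nn.functional as F

def cosine_clusters(embeddings, threshold=0.8):
    """Group sentence embeddings into topics by cosine similarity.

    embeddings: (n, d) tensor, e.g. pooled outputs of a BERT-family model.
    Assigns each item to the first cluster whose representative is similar
    enough, else starts a new cluster. Returns lists of row indices.
    """
    emb = F.normalize(embeddings, dim=1)  # unit norm -> dot product = cosine
    reps, clusters = [], []
    for i, vec in enumerate(emb):
        for rep_idx, members in zip(reps, clusters):
            if torch.dot(vec, emb[rep_idx]) > threshold:
                members.append(i)
                break
        else:  # no cluster matched: this item seeds a new one
            reps.append(i)
            clusters.append([i])
    return clusters

print(cosine_clusters(torch.randn(10, 768)))
```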
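For the FNet-pytorch entry: the FNet paper replaces the self-attention sublayer with an unparameterized 2D Fourier transform over the hidden and sequence dimensions, keeping only the real part. A minimal sketch of that mixing step; the surrounding feed-forward sublayer and layer norms are unchanged and omitted here:

```python
import torch

def fourier_mix(x):
    """FNet-style token mixing: FFT over hidden then sequence dims, real part.

    x: (batch, seq_len, hidden). No learned parameters; this block stands in
    for the self-attention sublayer of a Transformer encoder.
    """
    return torch.fft.fft(torch.fft.fft(x, dim=-1), dim=-2).real

x = torch.randn(2, 128, 256)
mixed = fourier_mix(x)  # same shape, tokens mixed along the sequence axis
```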