Transformer-in-Transformer - An implementation of Transformer in Transformer in TensorFlow for image classification, with attention inside local patches.
Stars: ✭ 40 (-87.22%)
SIGIR2021 Conure - One Person, One Model, One World: Learning Continual User Representation without Forgetting.
Stars: ✭ 23 (-92.65%)
pynmt - A simple and complete PyTorch implementation of a neural machine translation system.
Stars: ✭ 13 (-95.85%)
TadTR - End-to-end Temporal Action Detection with Transformer. [Under review for a journal publication]
Stars: ✭ 55 (-82.43%)
uformer-pytorch - Implementation of Uformer, an attention-based U-Net, in PyTorch.
Stars: ✭ 54 (-82.75%)
amrlib - A Python library that makes AMR parsing, generation, and visualization simple.
Stars: ✭ 107 (-65.81%)
tutel - Tutel MoE: An Optimized Mixture-of-Experts Implementation.
Stars: ✭ 183 (-41.53%)
saint - The official PyTorch implementation of the paper "SAINT: Improved Neural Networks for Tabular Data via Row Attention and Contrastive Pre-Training".
Stars: ✭ 209 (-33.23%)
attention-is-all-you-need-paper - Implementation of Vaswani, Ashish, et al., "Attention Is All You Need", Advances in Neural Information Processing Systems, 2017.
Stars: ✭ 97 (-69.01%)
Learning-Lab-C-Library - This library provides a set of basic functions for different types of deep learning (and other) algorithms in C. It is constantly updated.
Stars: ✭ 20 (-93.61%)
AITQA - Resources for the IBM Airlines Table-Question-Answering benchmark.
Stars: ✭ 12 (-96.17%)
SOLQ - "SOLQ: Segmenting Objects by Learning Queries", an end-to-end instance segmentation framework built on the Transformer.
Stars: ✭ 159 (-49.2%)
SwinIR - SwinIR: Image Restoration Using Swin Transformer (official repository).
Stars: ✭ 1,260 (+302.56%)
max-deeplab - Unofficial implementation of MaX-DeepLab for instance segmentation.
Stars: ✭ 84 (-73.16%)
Dab - Data Augmentation by Backtranslation (DAB) ヽ( •_-)ᕗ
Stars: ✭ 294 (-6.07%)
bert-as-a-service TFX - End-to-end pipeline with TFX to train and deploy a BERT model for sentiment analysis.
Stars: ✭ 32 (-89.78%)
php-hal - HAL+JSON & HAL+XML API transformer outputting valid (PSR-7) API responses.
Stars: ✭ 30 (-90.42%)
Allrank - allRank is a framework for training learning-to-rank neural models based on PyTorch.
Stars: ✭ 269 (-14.06%)
Embedding - Embedding model code and a summary of study notes.
Stars: ✭ 25 (-92.01%)
linformer - Implementation of Linformer for PyTorch.
Stars: ✭ 119 (-61.98%)
Filipino-Text-Benchmarks - Open-source benchmark datasets and pretrained transformer models for the Filipino language.
Stars: ✭ 22 (-92.97%)
Walk-Transformer - From Random Walks to Transformer for Learning Node Embeddings (ECML-PKDD 2020), in PyTorch and TensorFlow.
Stars: ✭ 26 (-91.69%)
bert in a flask - A Dockerized Flask API serving ALBERT and BERT predictions using TensorFlow 2.0.
Stars: ✭ 32 (-89.78%)
trapper - State-of-the-art NLP through transformer models in a modular design with consistent APIs.
Stars: ✭ 28 (-91.05%)
Transformer - Easy Attributed String Creator.
Stars: ✭ 278 (-11.18%)
semantic-document-relations - Implementation, trained models, and result data for the paper "Pairwise Multi-Class Document Classification for Semantic Relations between Wikipedia Articles".
Stars: ✭ 21 (-93.29%)
svgs2fonts - npm-svgs2fonts. A Node.js library for converting SVG icons to icon fonts (svgs -> svg, ttf, eot, woff, woff2).
Stars: ✭ 29 (-90.73%)
Vedastr - A scene text recognition toolbox based on PyTorch.
Stars: ✭ 290 (-7.35%)
kosr - Korean speech recognition based on the Transformer.
Stars: ✭ 25 (-92.01%)
Swin-Transformer-Tensorflow - Unofficial implementation of "Swin Transformer: Hierarchical Vision Transformer using Shifted Windows" (https://arxiv.org/abs/2103.14030).
Stars: ✭ 45 (-85.62%)
Transformer - Implementation of the Transformer model (originally from "Attention Is All You Need") applied to time series.
Stars: ✭ 273 (-12.78%)
galerkin-transformer - [NeurIPS 2021] Galerkin Transformer: linear attention without softmax.
Stars: ✭ 111 (-64.54%)
deep-molecular-optimization - Molecular optimization that captures a chemist's intuition using Seq2Seq with attention and the Transformer.
Stars: ✭ 60 (-80.83%)
Cognitive Speech Tts - Microsoft Text-to-Speech API sample code in several languages, part of Cognitive Services.
Stars: ✭ 312 (-0.32%)
segmenter - [ICCV 2021] Official PyTorch implementation of "Segmenter: Transformer for Semantic Segmentation".
Stars: ✭ 463 (+47.92%)
TextPruner - A PyTorch-based model pruning toolkit for pre-trained language models.
Stars: ✭ 94 (-69.97%)
transformer - Neutron: a PyTorch-based implementation of the Transformer and its variants.
Stars: ✭ 60 (-80.83%)
Remi - "Pop Music Transformer: Beat-based Modeling and Generation of Expressive Pop Piano Compositions", ACM Multimedia 2020.
Stars: ✭ 273 (-12.78%)
Image-Caption - Using an LSTM or Transformer for image captioning in PyTorch.
Stars: ✭ 36 (-88.5%)
charformer-pytorch - Implementation of the GBST block from the Charformer paper, in PyTorch.
Stars: ✭ 74 (-76.36%)
M3DETR - Code base for "M3DeTR: Multi-representation, Multi-scale, Mutual-relation 3D Object Detection with Transformers".
Stars: ✭ 47 (-84.98%)
Restormer - [CVPR 2022, Oral] Restormer: Efficient Transformer for High-Resolution Image Restoration. SOTA for motion deblurring, image deraining, denoising (Gaussian/real data), and defocus deblurring.
Stars: ✭ 586 (+87.22%)
Nlp Interview Notes - Study notes and materials for natural language processing (NLP) interview preparation, compiled by the authors from their own interviews and experience; currently a collection of interview questions from various NLP subfields.
Stars: ✭ 207 (-33.87%)
Viewpagertransition - ViewPager with parallax pages, together with vertical sliding (or click) and activity transitions.
Stars: ✭ 3,017 (+863.9%)
PAML - Personalizing Dialogue Agents via Meta-Learning.
Stars: ✭ 114 (-63.58%)