Bert Chainer - Chainer implementation of "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding"
Stars: ✭ 205 (-55.82%)
imdb-transformer - A simple neural network for sentiment analysis, embedding sentences using a Transformer network.
Stars: ✭ 26 (-94.4%)
vietnamese-roberta - A Robustly Optimized BERT Pretraining Approach for Vietnamese
Stars: ✭ 22 (-95.26%)
RandLA-Net-pytorch - 🍀 PyTorch implementation of RandLA-Net (https://arxiv.org/abs/1911.11236)
Stars: ✭ 69 (-85.13%)
AITQA - Resources for the IBM Airlines Table-Question-Answering Benchmark
Stars: ✭ 12 (-97.41%)
Gpt Scrolls - A collaborative collection of open-source, safe GPT-3 prompts that work well
Stars: ✭ 195 (-57.97%)
semantic-tsdf - Semantic-TSDF for Self-Driving Static Scene Reconstruction
Stars: ✭ 14 (-96.98%)
Asr
Stars: ✭ 54 (-88.36%)
ssgan - Semi-Supervised Semantic Segmentation Using Generative Adversarial Network; PyTorch
Stars: ✭ 25 (-94.61%)
Mojito - Transition effects for WeChat- and Bilibili-style large images, long images, GIFs, videos, and custom views
Stars: ✭ 1,068 (+130.17%)
Kospeech - Open-source toolkit for end-to-end Korean automatic speech recognition.
Stars: ✭ 190 (-59.05%)
SIGIR2021 Conure - One Person, One Model, One World: Learning Continual User Representation without Forgetting
Stars: ✭ 23 (-95.04%)
bert-as-a-service TFX - End-to-end pipeline with TFX to train and deploy a BERT model for sentiment analysis.
Stars: ✭ 32 (-93.1%)
Sentimentanalysis - Sentiment analysis neural network trained by fine-tuning BERT, ALBERT, or DistilBERT on the Stanford Sentiment Treebank.
Stars: ✭ 186 (-59.91%)
saint - The official PyTorch implementation of the paper "SAINT: Improved Neural Networks for Tabular Data via Row Attention and Contrastive Pre-Training"
Stars: ✭ 209 (-54.96%)
ImageNet21K - Official PyTorch implementation of the paper "ImageNet-21K Pretraining for the Masses" (NeurIPS 2021)
Stars: ✭ 565 (+21.77%)
linformer - Implementation of Linformer for PyTorch
Stars: ✭ 119 (-74.35%)
Transformer Clinic - Understanding the Difficulty of Training Transformers
Stars: ✭ 179 (-61.42%)
PAML - Personalizing Dialogue Agents via Meta-Learning
Stars: ✭ 114 (-75.43%)
TokenLabeling - PyTorch implementation of "All Tokens Matter: Token Labeling for Training Better Vision Transformers"
Stars: ✭ 385 (-17.03%)
attention-is-all-you-need-paper - Implementation of Vaswani, Ashish, et al. "Attention Is All You Need." Advances in Neural Information Processing Systems, 2017.
Stars: ✭ 97 (-79.09%)
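The core operation of the paper implemented above is scaled dot-product attention, Attention(Q, K, V) = softmax(QK^T / √d_k)V. A minimal NumPy sketch of that formula (not code from the repository itself):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V  (Vaswani et al., 2017)."""
    d_k = Q.shape[-1]
    scores = Q @ K.swapaxes(-2, -1) / np.sqrt(d_k)   # (..., seq_q, seq_k)
    # numerically stable row-wise softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                               # (..., seq_q, d_v)

# Toy example: 3 query positions, 4 key/value positions, d_k = d_v = 8
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 8))
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 8)
```

Each output row is a convex combination of the value rows, weighted by how strongly the corresponding query matches each key; the 1/√d_k scaling keeps the dot products from saturating the softmax as d_k grows.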
Transformers.jl - Julia implementation of Transformer models
Stars: ✭ 173 (-62.72%)
pynmt - A simple and complete PyTorch implementation of a neural machine translation system
Stars: ✭ 13 (-97.2%)
UNETR - Unofficial code base for "UNETR: Transformers for 3D Medical Image Segmentation"
Stars: ✭ 60 (-87.07%)
Gpt 2 Tensorflow2.0 - OpenAI GPT-2 pre-training and sequence prediction implementation in TensorFlow 2.0
Stars: ✭ 172 (-62.93%)
sister - SImple SenTence EmbeddeR
Stars: ✭ 66 (-85.78%)
SOLQ - "SOLQ: Segmenting Objects by Learning Queries", an end-to-end instance segmentation framework built on Transformers.
Stars: ✭ 159 (-65.73%)
amrlib - A Python library that makes AMR parsing, generation, and visualization simple.
Stars: ✭ 107 (-76.94%)
SyConn - Toolkit for the generation and analysis of volume electron microscopy based synaptic connectomes of brain tissue.
Stars: ✭ 31 (-93.32%)
max-deeplab - Unofficial implementation of MaX-DeepLab for instance segmentation
Stars: ✭ 84 (-81.9%)
Embedding As Service - One-stop solution for encoding sentences into fixed-length vectors using various embedding techniques
Stars: ✭ 151 (-67.46%)
Transformer-in-Transformer - An implementation of Transformer in Transformer in TensorFlow for image classification, with attention inside local patches
Stars: ✭ 40 (-91.38%)
kaggle-champs - Code for the CHAMPS Predicting Molecular Properties Kaggle competition
Stars: ✭ 49 (-89.44%)
Bigbird - Transformers for Longer Sequences
Stars: ✭ 146 (-68.53%)
tutel - Tutel MoE: an optimized Mixture-of-Experts implementation
Stars: ✭ 183 (-60.56%)
SReT - Official PyTorch implementation of the ECCV 2022 paper "Sliced Recursive Transformer"
Stars: ✭ 51 (-89.01%)
php-hal - HAL+JSON & HAL+XML API transformer outputting valid (PSR-7) API responses.
Stars: ✭ 30 (-93.53%)
Tensorflowasr - End-to-end speech recognition models built on TensorFlow 2, with an RTF (real-time factor) of about 0.1; state-of-the-art Mandarin automatic speech recognition in TensorFlow 2
Stars: ✭ 145 (-68.75%)
TadTR - End-to-end Temporal Action Detection with Transformer. [Under review for a journal publication]
Stars: ✭ 55 (-88.15%)
cape - Continuous Augmented Positional Embeddings (CAPE) implementation for PyTorch
Stars: ✭ 29 (-93.75%)
Embedding - A collection of embedding model code and study notes
Stars: ✭ 25 (-94.61%)
Tupe - Transformer with Untied Positional Encoding (TUPE). Code for the paper "Rethinking Positional Encoding in Language Pre-training"; improves existing models such as BERT.
Stars: ✭ 143 (-69.18%)
DocuNet - Code and dataset for the IJCAI 2021 paper "Document-level Relation Extraction as Semantic Segmentation".
Stars: ✭ 84 (-81.9%)
densecap - Dense video captioning in PyTorch
Stars: ✭ 37 (-92.03%)
Gpt2 French - GPT-2 French demo
Stars: ✭ 47 (-89.87%)
Bentools Etl - PHP ETL (Extract / Transform / Load) library with SOLID principles and almost no dependencies.
Stars: ✭ 45 (-90.3%)
tf-semantic-segmentation-FCN-VGG16 - Semantic segmentation for classifying roads; "Fully Convolutional Networks for Semantic Segmentation" (2015) implemented in TensorFlow
Stars: ✭ 30 (-93.53%)
Sockeye - Sequence-to-sequence framework with a focus on neural machine translation, based on Apache MXNet
Stars: ✭ 990 (+113.36%)
Moel - MoEL: Mixture of Empathetic Listeners
Stars: ✭ 38 (-91.81%)