pytorch2keras: PyTorch-to-Keras model converter
Stars: ✭ 788 (+175.52%)
bert-as-a-service TFX: End-to-end pipeline with TFX to train and deploy a BERT model for sentiment analysis.
Stars: ✭ 32 (-88.81%)
pynmt: A simple and complete PyTorch implementation of a neural machine translation system
Stars: ✭ 13 (-95.45%)
galerkin-transformer: [NeurIPS 2021] Galerkin Transformer: linear attention without softmax
Stars: ✭ 111 (-61.19%)
SOLQ: "SOLQ: Segmenting Objects by Learning Queries", an end-to-end instance segmentation framework with Transformer.
Stars: ✭ 159 (-44.41%)
Deep Learning In Production: Notes and references on deploying deep-learning-based models in production.
Stars: ✭ 3,104 (+985.31%)
bodymoji: Draws an emoji on your face! Powered by Nuxt.js, TensorFlow.js, and PoseNet
Stars: ✭ 21 (-92.66%)
attention-is-all-you-need-paper: Implementation of Vaswani, Ashish, et al., "Attention Is All You Need," Advances in Neural Information Processing Systems, 2017.
Stars: ✭ 97 (-66.08%)
php-hal: HAL+JSON & HAL+XML API transformer outputting valid (PSR-7) API responses.
Stars: ✭ 30 (-89.51%)
Swin-Transformer-Tensorflow: Unofficial implementation of "Swin Transformer: Hierarchical Vision Transformer using Shifted Windows" (https://arxiv.org/abs/2103.14030)
Stars: ✭ 45 (-84.27%)
semantic-document-relations: Implementation, trained models, and result data for the paper "Pairwise Multi-Class Document Classification for Semantic Relations between Wikipedia Articles"
Stars: ✭ 21 (-92.66%)
Multiple Relations Extraction Only Look Once: Look at a sentence just once and extract the multiple entity pairs and their corresponding relations. An end-to-end joint multi-relation extraction model, applicable to information extraction tasks such as http://lic2019.ccf.org.cn/kg.
Stars: ✭ 269 (-5.94%)
TextPruner: A PyTorch-based model-pruning toolkit for pre-trained language models
Stars: ✭ 94 (-67.13%)
amrlib: A Python library that makes AMR parsing, generation, and visualization simple.
Stars: ✭ 107 (-62.59%)
Transformer: Implementation of the Transformer model (originally from "Attention Is All You Need") applied to time series.
Stars: ✭ 273 (-4.55%)
saint: The official PyTorch implementation of the paper "SAINT: Improved Neural Networks for Tabular Data via Row Attention and Contrastive Pre-Training"
Stars: ✭ 209 (-26.92%)
Transformer-in-Transformer: An implementation of Transformer in Transformer in TensorFlow for image classification, with attention inside local patches
Stars: ✭ 40 (-86.01%)
bert in a flask: A Dockerized Flask API serving ALBERT and BERT predictions using TensorFlow 2.0.
Stars: ✭ 32 (-88.81%)
tutel: Tutel MoE: an optimized Mixture-of-Experts implementation
Stars: ✭ 183 (-36.01%)
linformer: Implementation of Linformer for PyTorch
Stars: ✭ 119 (-58.39%)
Filipino-Text-Benchmarks: Open-source benchmark datasets and pretrained transformer models in the Filipino language.
Stars: ✭ 22 (-92.31%)
M3DETR: Code base for "M3DeTR: Multi-representation, Multi-scale, Mutual-relation 3D Object Detection with Transformers"
Stars: ✭ 47 (-83.57%)
uformer-pytorch: Implementation of Uformer, an attention-based U-Net, in PyTorch
Stars: ✭ 54 (-81.12%)
trapper: State-of-the-art NLP through transformer models, with a modular design and consistent APIs.
Stars: ✭ 28 (-90.21%)
Allrank: allRank is a PyTorch-based framework for training learning-to-rank neural models.
Stars: ✭ 269 (-5.94%)
SwinIR: "SwinIR: Image Restoration Using Swin Transformer" (official repository)
Stars: ✭ 1,260 (+340.56%)
FineGrainedVisualRecognition: Fine-grained visual recognition TensorFlow baseline on CUB, Stanford Cars, Dogs, Aircrafts, and Flower102.
Stars: ✭ 19 (-93.36%)
SIGIR2021 Conure: "One Person, One Model, One World: Learning Continual User Representation without Forgetting"
Stars: ✭ 23 (-91.96%)
kosr: Transformer-based Korean speech recognition
Stars: ✭ 25 (-91.26%)
Nlp Interview Notes: Study notes and materials for natural language processing (NLP) interview preparation, compiled by the authors from their own interviews and experience; currently a collection of interview questions across the various subfields of NLP.
Stars: ✭ 207 (-27.62%)
sdp: Deep nonparametric estimation of discrete conditional distributions via smoothed dyadic partitioning
Stars: ✭ 15 (-94.76%)
GAN-Project-2018: A GAN in TensorFlow, to be run via the Linux command line
Stars: ✭ 21 (-92.66%)
Transformer: Easy Attributed String Creator
Stars: ✭ 278 (-2.8%)
relation-network: TensorFlow implementation of Relation Networks for the bAbI QA task, detailed in "A Simple Neural Network Module for Relational Reasoning" (https://arxiv.org/abs/1706.01427) by Santoro et al.
Stars: ✭ 45 (-84.27%)
charformer-pytorch: Implementation of the GBST block from the Charformer paper, in PyTorch
Stars: ✭ 74 (-74.13%)
max-deeplab: Unofficial implementation of MaX-DeepLab for instance segmentation
Stars: ✭ 84 (-70.63%)
deep-molecular-optimization: Molecular optimization by capturing chemist's intuition, using Seq2Seq with attention and the Transformer
Stars: ✭ 60 (-79.02%)
segmenter: [ICCV 2021] Official PyTorch implementation of "Segmenter: Transformer for Semantic Segmentation"
Stars: ✭ 463 (+61.89%)
transformer: Neutron: a PyTorch-based implementation of the Transformer and its variants.
Stars: ✭ 60 (-79.02%)
Image-Caption: Using an LSTM or Transformer for image captioning in PyTorch
Stars: ✭ 36 (-87.41%)
AITQA: Resources for the IBM Airlines Table-Question-Answering benchmark
Stars: ✭ 12 (-95.8%)
Viewpagertransition: A ViewPager with parallax pages, together with vertical sliding (or click) and activity transitions
Stars: ✭ 3,017 (+954.9%)
Remi: "Pop Music Transformer: Beat-based Modeling and Generation of Expressive Pop Piano Compositions", ACM Multimedia 2020
Stars: ✭ 273 (-4.55%)
svgs2fonts: The npm package svgs2fonts. Converts SVG icons into icon-font libraries (svgs → svg, ttf, eot, woff, woff2), built on Node.js.
Stars: ✭ 29 (-89.86%)
PAML: Personalizing Dialogue Agents via Meta-Learning
Stars: ✭ 114 (-60.14%)