EQTransformer: A Python package for earthquake signal detection and phase picking using AI.
Stars: ✭ 95 (-33.57%)
dodrio: Exploring attention weights in transformer-based models with linguistic knowledge.
Stars: ✭ 233 (+62.94%)
Conformer: Official code for "Conformer: Local Features Coupling Global Representations for Visual Recognition".
Stars: ✭ 345 (+141.26%)
Mojito: Transition effects for WeChat-style and bilibili-style large images, long images, GIFs, videos, and custom views.
Stars: ✭ 1,068 (+646.85%)
LM-CNLC: Chinese Natural Language Correction via Language Model.
Stars: ✭ 15 (-89.51%)
Tf chatbot seq2seq antilm: Seq2seq chatbot with attention and an anti-language model to suppress generic responses, with an option for further improvement via deep reinforcement learning.
Stars: ✭ 369 (+158.04%)
gdc: Code for the ICLR 2021 paper "A Distributional Approach to Controlled Text Generation".
Stars: ✭ 94 (-34.27%)
Electra: Pretrained Chinese ELECTRA model, pretrained with adversarial learning.
Stars: ✭ 132 (-7.69%)
Transformer-Transducer: PyTorch implementation of "Transformer Transducer: A Streamable Speech Recognition Model with Transformer Encoders and RNN-T Loss" (ICASSP 2020).
Stars: ✭ 61 (-57.34%)
Suggest: Top-k approximate string matching.
Stars: ✭ 50 (-65.03%)
ICON: (TPAMI 2022) Salient Object Detection via Integrity Learning.
Stars: ✭ 125 (-12.59%)
Transformer: A TensorFlow implementation of the Transformer from "Attention Is All You Need".
Stars: ✭ 3,646 (+2449.65%)
NLP-paper: 🎨 Natural language processing (NLP) tutorials 🎨 https://dataxujing.github.io/NLP-paper/
Stars: ✭ 23 (-83.92%)
Vision Transformer: TensorFlow implementation of the Vision Transformer ("An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale").
Stars: ✭ 90 (-37.06%)
CrabNet: Predict materials properties using only composition information.
Stars: ✭ 57 (-60.14%)
PCPM: Presenting Collection of Pretrained Models; links to pretrained models in NLP and voice.
Stars: ✭ 21 (-85.31%)
deformer: [ACL 2020] DeFormer: Decomposing Pre-trained Transformers for Faster Question Answering.
Stars: ✭ 111 (-22.38%)
TDRG: Transformer-based Dual Relation Graph for Multi-label Image Recognition (ICCV 2021).
Stars: ✭ 32 (-77.62%)
Kiss: Code for the paper "KISS: Keeping it Simple for Scene Text Recognition".
Stars: ✭ 108 (-24.48%)
KitanaQA: Adversarial training and data augmentation for neural question-answering models.
Stars: ✭ 58 (-59.44%)
Gpt2client: ✍🏻 Easy-to-use TensorFlow wrapper for the GPT-2 117M, 345M, 774M, and 1.5B transformer models 🤖 📝
Stars: ✭ 322 (+125.17%)
subword-lstm-lm: LSTM language model with subword-unit input representations.
Stars: ✭ 45 (-68.53%)
Nlp Library: Curated collection of papers for the NLP practitioner 📖👩‍🔬
Stars: ✭ 1,025 (+616.78%)
Rezero: Official PyTorch repo for "ReZero is All You Need: Fast Convergence at Large Depth".
Stars: ✭ 317 (+121.68%)
php-json-api: JSON API transformer outputting valid (PSR-7) API responses.
Stars: ✭ 68 (-52.45%)
FasterTransformer: Transformer-related optimizations, including BERT and GPT.
Stars: ✭ 1,571 (+998.6%)
Sockeye: Sequence-to-sequence framework with a focus on neural machine translation, based on Apache MXNet.
Stars: ✭ 990 (+592.31%)
CSV2RDF: Streaming, transforming, SPARQL-based CSV-to-RDF converter (Apache license).
Stars: ✭ 48 (-66.43%)
TRAR-VQA: [ICCV 2021] TRAR: Routing the Attention Spans in Transformers for Visual Question Answering (official implementation).
Stars: ✭ 49 (-65.73%)
Trankit: A lightweight transformer-based Python toolkit for multilingual natural language processing.
Stars: ✭ 311 (+117.48%)
Zero-Shot-TTS: Unofficial implementation of "Zero-Shot Text-to-Speech for Text-Based Insertion in Audio Narration".
Stars: ✭ 33 (-76.92%)
Haystack: 🔍 An open-source NLP framework that leverages transformer models, enabling developers to implement production-ready neural search, question answering, semantic document search, and summarization for a wide range of applications.
Stars: ✭ 3,409 (+2283.92%)
Neural-Machine-Translation: Several basic neural machine translation models implemented in PyTorch and TensorFlow.
Stars: ✭ 29 (-79.72%)
Xlnet Pytorch: An implementation of Google Brain's 2019 XLNet in PyTorch.
Stars: ✭ 304 (+112.59%)
keyword-transformer: Official implementation of the Keyword Transformer (https://arxiv.org/abs/2104.00769).
Stars: ✭ 76 (-46.85%)
Boilerplate Dynet Rnn Lm: Boilerplate code for quickly setting up language modeling experiments.
Stars: ✭ 37 (-74.13%)
dasher-web: Dasher text entry in HTML, CSS, JavaScript, and SVG.
Stars: ✭ 34 (-76.22%)
Vedastr: A scene text recognition toolbox based on PyTorch.
Stars: ✭ 290 (+102.8%)
Smiles Transformer: Original implementation of the paper "SMILES Transformer: Pre-trained Molecular Fingerprint for Low Data Drug Discovery" by Shion Honda et al.
Stars: ✭ 86 (-39.86%)
pyVHDLParser: Streaming-based VHDL parser.
Stars: ✭ 51 (-64.34%)
Onnxt5: Summarization, translation, sentiment analysis, text generation, and more at blazing speed using a T5 version implemented in ONNX.
Stars: ✭ 143 (+0%)
Nlp research: TensorFlow-based NLP deep learning project supporting four major tasks: text classification, sentence matching, sequence labeling, and text generation.
Stars: ✭ 141 (-1.4%)
Keras Gpt 2: Load GPT-2 checkpoints and generate text.
Stars: ✭ 113 (-20.98%)
Bert ocr.pytorch: Unofficial PyTorch implementation of the 2D Attentional Irregular Scene Text Recognizer.
Stars: ✭ 101 (-29.37%)
Nezha chinese pytorch: NEZHA: Neural Contextualized Representation for Chinese Language Understanding.
Stars: ✭ 65 (-54.55%)