Gpt2 French: French-language demo of GPT-2
Stars: ✭ 47 (-85.4%)
amrlib: A Python library that makes AMR parsing, generation, and visualization simple.
Stars: ✭ 107 (-66.77%)
Gpt 2 Tensorflow2.0: OpenAI GPT-2 pre-training and sequence prediction implementation in TensorFlow 2.0
Stars: ✭ 172 (-46.58%)
Gpt2 Chinese: Chinese version of the GPT-2 training code, using a BERT tokenizer.
Stars: ✭ 4,592 (+1326.09%)
Gpt2 Chitchat: GPT-2 model for Chinese chitchat (implements the MMI idea from DialoGPT)
Stars: ✭ 1,230 (+281.99%)
Dialogpt: Large-scale pretraining for dialogue
Stars: ✭ 1,177 (+265.53%)
Onnxt5: Summarization, translation, sentiment analysis, text generation, and more at blazing speed, using a T5 version implemented in ONNX.
Stars: ✭ 143 (-55.59%)
Gpt2 Newstitle: Chinese news-title generation project built on GPT-2, with extremely detailed code comments.
Stars: ✭ 235 (-27.02%)
saint: The official PyTorch implementation of the paper "SAINT: Improved Neural Networks for Tabular Data via Row Attention and Contrastive Pre-Training"
Stars: ✭ 209 (-35.09%)
charformer-pytorch: Implementation of the GBST block from the Charformer paper, in PyTorch
Stars: ✭ 74 (-77.02%)
Nlp Interview Notes: Study notes and materials for natural language processing (NLP) interview preparation, compiled from the authors' own interviews and experience; currently covers accumulated interview questions from across the NLP subfields.
Stars: ✭ 207 (-35.71%)
Kenlg Reading: Reading list for knowledge-enhanced text generation, with a survey
Stars: ✭ 257 (-20.19%)
Filipino-Text-Benchmarks: Open-source benchmark datasets and pretrained transformer models in the Filipino language.
Stars: ✭ 22 (-93.17%)
attention-is-all-you-need-paper: Implementation of Vaswani, Ashish, et al., "Attention Is All You Need," Advances in Neural Information Processing Systems, 2017.
Stars: ✭ 97 (-69.88%)
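The paper this entry implements is built around scaled dot-product attention. A minimal NumPy sketch of that core formula (illustrative only, not code from this repository):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.swapaxes(-2, -1) / np.sqrt(d_k)  # (n_q, n_k)
    return softmax(scores, axis=-1) @ V

rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))   # 4 queries, d_k = 8
K = rng.normal(size=(6, 8))   # 6 keys
V = rng.normal(size=(6, 8))   # 6 values
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (4, 8)
```

Each output row is a convex combination of the value rows, weighted by how strongly its query matches each key.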
pynmt: A simple and complete PyTorch implementation of a neural machine translation system
Stars: ✭ 13 (-95.96%)
Cognitive Speech Tts: Microsoft Text-to-Speech API sample code in several languages, part of Cognitive Services.
Stars: ✭ 312 (-3.11%)
Transformer: Easy Attributed String Creator
Stars: ✭ 278 (-13.66%)
AITQA: Resources for the IBM Airlines Table-Question-Answering Benchmark
Stars: ✭ 12 (-96.27%)
kosr: Korean speech recognition based on the Transformer
Stars: ✭ 25 (-92.24%)
bert-as-a-service TFX: End-to-end pipeline with TFX to train and deploy a BERT model for sentiment analysis.
Stars: ✭ 32 (-90.06%)
Allrank: allRank is a PyTorch-based framework for training neural learning-to-rank models.
Stars: ✭ 269 (-16.46%)
Dab: Data Augmentation by Backtranslation (DAB) ヽ( •_-)ᕗ
Stars: ✭ 294 (-8.7%)
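Backtranslation augments training data by translating a sentence into a pivot language and back, producing a paraphrase. The toy sketch below uses stub dictionary "translators" (hypothetical, not DAB's neural MT models) purely to illustrate the round trip:

```python
# Toy backtranslation: English -> French -> English with stub dictionaries.
EN_TO_FR = {"the": "le", "movie": "film", "was": "était", "great": "génial"}
# The reverse table intentionally maps back to near-synonyms; that wording
# variation is where the augmentation comes from.
FR_TO_EN = {"le": "the", "film": "film", "était": "was", "génial": "excellent"}

def translate(sentence, table):
    # Word-by-word lookup; unknown words pass through unchanged.
    return " ".join(table.get(w, w) for w in sentence.split())

def backtranslate(sentence):
    pivot = translate(sentence, EN_TO_FR)
    return translate(pivot, FR_TO_EN)

print(backtranslate("the movie was great"))  # the film was excellent
```

In a real DAB-style setup the two translators are trained NMT models, and sampling during decoding yields diverse paraphrases of each training sentence.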
linformer: Implementation of Linformer for PyTorch
Stars: ✭ 119 (-63.04%)
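Linformer's key idea is projecting the length-n key/value sequences down to a fixed length k, so the attention score matrix is n × k rather than n × n (linear in sequence length). A shape-level NumPy sketch, with a random stand-in for the projection E that the real library learns during training:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def linformer_attention(Q, K, V, E):
    # E: (k, n) projection applied along the sequence axis
    K_proj = E @ K                                 # (k, d)
    V_proj = E @ V                                 # (k, d)
    scores = Q @ K_proj.T / np.sqrt(Q.shape[-1])   # (n, k), not (n, n)
    return softmax(scores, axis=-1) @ V_proj       # (n, d)

n, d, k = 128, 16, 32
rng = np.random.default_rng(1)
Q, K, V = (rng.normal(size=(n, d)) for _ in range(3))
E = rng.normal(size=(k, n)) / np.sqrt(n)           # stand-in for learned E
out = linformer_attention(Q, K, V, E)
print(out.shape)  # (128, 16)
```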
Accelerated Text: A no-code natural language generation platform that helps you construct document plans defining how your data is converted into textual descriptions varying in wording and structure.
Stars: ✭ 256 (-20.5%)
PAML: Personalizing Dialogue Agents via Meta-Learning
Stars: ✭ 114 (-64.6%)
skip-thought-gan: Generating text through adversarial training (GAN) using Skip-Thought vectors
Stars: ✭ 44 (-86.34%)
Textbox: TextBox is an open-source library for building text generation systems.
Stars: ✭ 257 (-20.19%)
trapper: State-of-the-art NLP through transformer models, in a modular design with consistent APIs.
Stars: ✭ 28 (-91.3%)
Viewpagertransition: ViewPager with parallax pages, vertical sliding (or click), and activity transitions
Stars: ✭ 3,017 (+836.96%)
semantic-document-relations: Implementation, trained models, and result data for the paper "Pairwise Multi-Class Document Classification for Semantic Relations between Wikipedia Articles"
Stars: ✭ 21 (-93.48%)
bert in a flask: A Dockerized Flask API serving ALBERT and BERT predictions using TensorFlow 2.0.
Stars: ✭ 32 (-90.06%)
ebe-dataset: Evidence-based Explanation Dataset (AACL-IJCNLP 2020)
Stars: ✭ 16 (-95.03%)
Rezero: Official PyTorch repository for "ReZero is All You Need: Fast Convergence at Large Depth"
Stars: ✭ 317 (-1.55%)
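ReZero replaces the usual residual connection with x + α·F(x), where α is a learnable scalar initialized to zero, so every layer starts out as the identity. A minimal NumPy sketch (the weights are random stand-ins, not the repository's code):

```python
import numpy as np

def rezero_block(x, W1, W2, alpha):
    # Residual branch F(x): a two-layer MLP with ReLU; the ReZero form is
    # x + alpha * F(x), with alpha a learnable scalar initialized to 0.
    h = np.maximum(x @ W1, 0.0)
    return x + alpha * (h @ W2)

rng = np.random.default_rng(2)
x = rng.normal(size=(2, 8))
W1, W2 = rng.normal(size=(8, 8)), rng.normal(size=(8, 8))  # stand-in weights
out = rezero_block(x, W1, W2, alpha=0.0)  # alpha = 0 at initialization
print(np.allclose(out, x))  # True: the block starts as the identity
```

Because the network begins as a stack of identities, gradients flow freely at initialization, which is the source of the fast convergence the paper reports at large depth.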
svgs2fonts: npm package svgs2fonts. Converts SVG icons into icon-font libraries (svgs → svg, ttf, eot, woff, woff2); Node.js.
Stars: ✭ 29 (-90.99%)
SOLQ: "SOLQ: Segmenting Objects by Learning Queries", an end-to-end instance segmentation framework built on the Transformer.
Stars: ✭ 159 (-50.62%)
uformer-pytorch: Implementation of Uformer, an attention-based U-Net, in PyTorch
Stars: ✭ 54 (-83.23%)
download-tweets-ai-text-gen-plus: Python script to download public tweets from a given Twitter account into a format suitable for AI text generation
Stars: ✭ 26 (-91.93%)
Transformer: Implementation of the Transformer model (originally from "Attention Is All You Need") applied to time series.
Stars: ✭ 273 (-15.22%)
Swin-Transformer-Tensorflow: Unofficial implementation of "Swin Transformer: Hierarchical Vision Transformer using Shifted Windows" (https://arxiv.org/abs/2103.14030)
Stars: ✭ 45 (-86.02%)
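Swin Transformer computes self-attention within non-overlapping windows of the feature map rather than globally. A NumPy sketch of the window-partition step (shapes only, not the repository's code):

```python
import numpy as np

def window_partition(x, ws):
    # Split an (H, W, C) feature map into (H//ws * W//ws) windows,
    # each flattened to (ws*ws, C); attention runs within each window.
    H, W, C = x.shape
    x = x.reshape(H // ws, ws, W // ws, ws, C)
    x = x.transpose(0, 2, 1, 3, 4)       # group each window's rows together
    return x.reshape(-1, ws * ws, C)     # (num_windows, ws*ws, C)

x = np.arange(8 * 8 * 3).reshape(8, 8, 3).astype(float)
windows = window_partition(x, ws=4)
print(windows.shape)  # (4, 16, 3)
```

The "shifted" variant offsets the window grid on alternate layers so information can cross window boundaries.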
max-deeplab: Unofficial implementation of MaX-DeepLab for instance segmentation
Stars: ✭ 84 (-73.91%)
deep-molecular-optimization: Molecular optimization that captures chemists' intuition, using Seq2Seq with attention and the Transformer
Stars: ✭ 60 (-81.37%)
SwinIR: Image Restoration Using Swin Transformer (official repository)
Stars: ✭ 1,260 (+291.3%)
Transformer-in-Transformer: An implementation of Transformer in Transformer in TensorFlow for image classification, with attention inside local patches
Stars: ✭ 40 (-87.58%)
galerkin-transformer: [NeurIPS 2021] Galerkin Transformer: linear attention without softmax
Stars: ✭ 111 (-65.53%)
segmenter: [ICCV 2021] Official PyTorch implementation of "Segmenter: Transformer for Semantic Segmentation"
Stars: ✭ 463 (+43.79%)