Gpt 2 Tensorflow2.0 - OpenAI GPT2 pre-training and sequence prediction implementation in TensorFlow 2.0
Stars: ✭ 172 (-86.02%)
Dialogpt - Large-scale pretraining for dialogue
Stars: ✭ 1,177 (-4.31%)
Gpt2client - ✍🏻 gpt2-client: Easy-to-use TensorFlow Wrapper for GPT-2 117M, 345M, 774M, and 1.5B Transformer Models 🤖 📝
Stars: ✭ 322 (-73.82%)
amrlib - A Python library that makes AMR parsing, generation and visualization simple.
Stars: ✭ 107 (-91.3%)
Onnxt5 - Summarization, translation, sentiment analysis, text generation and more at blazing speed using a T5 version implemented in ONNX.
Stars: ✭ 143 (-88.37%)
Gpt2 French - GPT-2 French demo
Stars: ✭ 47 (-96.18%)
Gpt2 Newstitle - Chinese news-title generation project using GPT2, with extremely detailed code comments.
Stars: ✭ 235 (-80.89%)
Gpt2 Chinese - Chinese version of GPT2 training code, using the BERT tokenizer.
Stars: ✭ 4,592 (+273.33%)
Transgan - [Preprint] "TransGAN: Two Transformers Can Make One Strong GAN", Yifan Jiang, Shiyu Chang, Zhangyang Wang
Stars: ✭ 864 (-29.76%)
Keras Textclassification - Chinese text classification with Keras: long-text classification, short-sentence classification, multi-label classification, and sentence-pair similarity; base classes for building word/character/sentence embedding layers and network graphs. Models include FastText, TextCNN, CharCNN, TextRNN, RCNN, DCNN, DPCNN, VDCNN, CRNN, BERT, XLNet, ALBERT, Attention, DeepMoji, HAN, CapsuleNet, Transformer-encoder, Seq2seq, SWEM, LEAM, and TextGCN.
Stars: ✭ 914 (-25.69%)
Transformer Dynet - An implementation of the Transformer (Attention Is All You Need) in DyNet
Stars: ✭ 57 (-95.37%)
Openasr - A PyTorch-based end-to-end speech recognition system.
Stars: ✭ 69 (-94.39%)
Cell Detr - Official and maintained implementation of the paper "Attention-Based Transformers for Instance Segmentation of Cells in Microstructures" [BIBM 2020].
Stars: ✭ 26 (-97.89%)
Odsc 2020 nlp - Repository for an ODSC talk on deep learning for NLP
Stars: ✭ 20 (-98.37%)
Bert Keras - Keras implementation of BERT with pre-trained weights
Stars: ✭ 820 (-33.33%)
Market Reporter - Automatic generation of brief summaries of time-series data
Stars: ✭ 54 (-95.61%)
Getting Things Done With Pytorch - Jupyter Notebook tutorials on solving real-world problems with machine learning and deep learning using PyTorch. Topics: face detection with Detectron 2, time-series anomaly detection with LSTM autoencoders, object detection with YOLO v5, building your first neural network, time-series forecasting of daily coronavirus cases, and sentiment analysis with BERT.
Stars: ✭ 738 (-40%)
Trax - Deep learning with clear code and speed
Stars: ✭ 6,666 (+441.95%)
Nlp Tutorial - Natural Language Processing tutorial for deep learning researchers
Stars: ✭ 9,895 (+704.47%)
Gpt2 Ml - GPT2 for multiple languages, including pretrained models; multilingual GPT2 support with a 1.5-billion-parameter Chinese pretrained model
Stars: ✭ 1,066 (-13.33%)
Bert For Tf2 - A Keras TensorFlow 2.0 implementation of BERT, ALBERT and adapter-BERT.
Stars: ✭ 683 (-44.47%)
Texar Pytorch - Integrating the best of TF into PyTorch, for machine learning, natural language processing, and text generation. Part of the CASL project: http://casl-project.ai/
Stars: ✭ 636 (-48.29%)
Meta Emb - Multilingual Meta-Embeddings for Named Entity Recognition (RepL4NLP & EMNLP 2019)
Stars: ✭ 28 (-97.72%)
Markov - A generic Markov chain implementation in Rust.
Stars: ✭ 59 (-95.2%)
Witwicky - An implementation of the Transformer in PyTorch.
Stars: ✭ 21 (-98.29%)
Presento - Transformer & Presenter Package for PHP
Stars: ✭ 71 (-94.23%)
Figma Transformer - A tiny utility library that makes the Figma API more human-friendly.
Stars: ✭ 27 (-97.8%)
Laravel Graphql - GraphQL implementation powered by Laravel
Stars: ✭ 56 (-95.45%)
Concise Ipython Notebooks For Deep Learning - IPython notebooks for solving classification, segmentation, and generation problems with recent deep learning algorithms on publicly available text and image datasets.
Stars: ✭ 23 (-98.13%)
Distre - [ACL 19] Fine-tuning Pre-Trained Transformer Language Models to Distantly Supervised Relation Extraction
Stars: ✭ 75 (-93.9%)
Turbotransformers - A fast and user-friendly runtime for transformer inference (BERT, ALBERT, GPT2, decoders, etc.) on CPU and GPU.
Stars: ✭ 826 (-32.85%)
Asr
Stars: ✭ 54 (-95.61%)
Grover - Code for Defending Against Neural Fake News, https://rowanzellers.com/grover/
Stars: ✭ 774 (-37.07%)
Mixture Of Experts - A PyTorch implementation of Sparsely-Gated Mixture of Experts, for massively increasing the parameter count of language models
Stars: ✭ 68 (-94.47%)
Rasa chatbot cn - Building a Chinese dialogue system based on the newest version of Rasa
Stars: ✭ 723 (-41.22%)
Mojito - Transition effects in the style of WeChat and bilibili for large images, long images, GIFs, videos, and custom views
Stars: ✭ 1,068 (-13.17%)
Easyflipviewpager - 📖 A library for creating book and card flip animations in ViewPager on Android
Stars: ✭ 698 (-43.25%)
Laravel Responder - A Laravel Fractal package for building API responses, giving you the power of Fractal with Laravel's elegance.
Stars: ✭ 673 (-45.28%)
Deep Ctr Prediction - CTR prediction models for ad recommendation based on deep learning
Stars: ✭ 628 (-48.94%)
Wenet - Production-first and production-ready end-to-end speech recognition toolkit
Stars: ✭ 617 (-49.84%)
Bentools Etl - PHP ETL (Extract / Transform / Load) library with SOLID principles and almost no dependencies.
Stars: ✭ 45 (-96.34%)
Cdial Gpt - A large-scale Chinese short-text conversation dataset and Chinese pretrained dialog models
Stars: ✭ 596 (-51.54%)
Gpt2 - PyTorch implementation of OpenAI GPT-2
Stars: ✭ 64 (-94.8%)
React Native Svg Transformer - Import SVG files in your React Native project the same way that you would in a web application.
Stars: ✭ 568 (-53.82%)
Awesome Bert Nlp - A curated list of NLP resources focused on BERT, attention mechanisms, Transformer networks, and transfer learning.
Stars: ✭ 567 (-53.9%)
Sockeye - Sequence-to-sequence framework with a focus on neural machine translation, based on Apache MXNet
Stars: ✭ 990 (-19.51%)
Speech Transformer - A PyTorch implementation of Speech Transformer, an end-to-end ASR system with a Transformer network, for Mandarin Chinese.
Stars: ✭ 565 (-54.07%)
Bert paper chinese translation - Chinese translation of the paper "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding"
Stars: ✭ 564 (-54.15%)
Deeplearning Nlp Models - A small, interpretable codebase containing re-implementations of a few "deep" NLP models in PyTorch, with Colab notebooks to run on GPUs. Models: word2vec, CNNs, transformer, GPT.
Stars: ✭ 64 (-94.8%)
Moel - MoEL: Mixture of Empathetic Listeners
Stars: ✭ 38 (-96.91%)