Meta Emb: Multilingual Meta-Embeddings for Named Entity Recognition (RepL4NLP & EMNLP 2019)
Stars: ✭ 28 (-48.15%)
Omninet: Official PyTorch implementation of "OmniNet: A unified architecture for multi-modal multi-task learning" | Authors: Subhojeet Pramanik, Priyanka Agrawal, Aman Hussain
Stars: ✭ 448 (+729.63%)
Rasa chatbot cn: A Chinese dialogue system built on the latest version of Rasa
Stars: ✭ 723 (+1238.89%)
Easyflipviewpager: 📖 A library for creating book and card flip animations in ViewPager on Android
Stars: ✭ 698 (+1192.59%)
Pytorch Original Transformer: My implementation of the original Transformer model (Vaswani et al.). Additionally includes a playground.py file for visualizing otherwise hard-to-grasp concepts. Currently includes IWSLT pretrained models.
Stars: ✭ 411 (+661.11%)
Stock Prediction Models: Gathers machine learning and deep learning models for stock forecasting, including trading bots and simulations
Stars: ✭ 4,660 (+8529.63%)
Keras Sincnet: Keras (TensorFlow) implementation of SincNet (Mirco Ravanelli, Yoshua Bengio - https://github.com/mravanelli/SincNet)
Stars: ✭ 47 (-12.96%)
Moel: MoEL (Mixture of Empathetic Listeners)
Stars: ✭ 38 (-29.63%)
Cell Detr: Official and maintained implementation of the paper "Attention-Based Transformers for Instance Segmentation of Cells in Microstructures" (BIBM 2020).
Stars: ✭ 26 (-51.85%)
Open stt: Open STT
Stars: ✭ 584 (+981.48%)
Seq2seq Pytorch: Sequence-to-sequence models with PyTorch
Stars: ✭ 678 (+1155.56%)
Tf Seq2seq: Sequence-to-sequence learning using TensorFlow.
Stars: ✭ 387 (+616.67%)
Transgan: [Preprint] "TransGAN: Two Transformers Can Make One Strong GAN", Yifan Jiang, Shiyu Chang, Zhangyang Wang
Stars: ✭ 864 (+1500%)
Cheetah: On-device streaming speech-to-text engine powered by deep learning
Stars: ✭ 383 (+609.26%)
Libreasr: 💬 An on-premises, streaming speech recognition system
Stars: ✭ 633 (+1072.22%)
Figma Transformer: A tiny utility library that makes the Figma API more human-friendly.
Stars: ✭ 27 (-50%)
Time Series Prediction: A collection of time-series prediction methods: RNN, seq2seq, CNN, WaveNet, Transformer, U-Net, N-BEATS, GAN, Kalman filter
Stars: ✭ 351 (+550%)
React Native Svg Transformer: Import SVG files in your React Native project the same way you would in a web application.
Stars: ✭ 568 (+951.85%)
Im2latex: Image to LaTeX (seq2seq + attention with beam search) - TensorFlow
Stars: ✭ 342 (+533.33%)
Transformer: A TensorFlow implementation of the Transformer ("Attention Is All You Need")
Stars: ✭ 3,646 (+6651.85%)
Odsc 2020 nlp: Repository for an ODSC talk on deep learning for NLP
Stars: ✭ 20 (-62.96%)
Awesome Bert Nlp: A curated list of NLP resources focused on BERT, attention mechanisms, Transformer networks, and transfer learning.
Stars: ✭ 567 (+950%)
Neural Combinatorial Rl Pytorch: PyTorch implementation of Neural Combinatorial Optimization with Reinforcement Learning https://arxiv.org/abs/1611.09940
Stars: ✭ 329 (+509.26%)
Jazz transformer: Transformer-XL for jazz music composition. Paper: "The Jazz Transformer on the Front Line: Exploring the Shortcomings of AI-Composed Music through Quantitative Measures", ISMIR 2020
Stars: ✭ 36 (-33.33%)
Seq2seq Signal Prediction: Signal forecasting with a sequence-to-sequence (seq2seq) recurrent neural network (RNN) model in TensorFlow - Guillaume Chevalier
Stars: ✭ 890 (+1548.15%)
Bert paper chinese translation: Chinese translation of the paper "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding"
Stars: ✭ 564 (+944.44%)
Contextualized Topic Models: A Python package for contextualized topic modeling. CTMs combine BERT with topic models to produce coherent topics, and also support multilingual tasks. The cross-lingual zero-shot model was published at EACL 2021.
Stars: ✭ 318 (+488.89%)
Nl2bash: Generating bash commands from natural language https://arxiv.org/abs/1802.08979
Stars: ✭ 325 (+501.85%)
Practical seq2seq: A simple, minimal wrapper for TensorFlow's seq2seq module, for experimenting with datasets rapidly
Stars: ✭ 563 (+942.59%)
Gpt2client: ✍🏻 gpt2-client, an easy-to-use TensorFlow wrapper for the GPT-2 117M, 345M, 774M, and 1.5B Transformer models 🤖 📝
Stars: ✭ 322 (+496.3%)
Seq2seq Chatbot For Keras: This repository contains a generative chatbot model based on seq2seq modeling.
Stars: ✭ 322 (+496.3%)
Turbotransformers: A fast and user-friendly runtime for Transformer inference (BERT, ALBERT, GPT-2, decoders, etc.) on CPU and GPU.
Stars: ✭ 826 (+1429.63%)
Seq2seq Couplet: Play couplets with a seq2seq model (composing Chinese couplets with deep learning).
Stars: ✭ 5,149 (+9435.19%)
Transformer Tensorflow: TensorFlow implementation of "Attention Is All You Need" (2017. 6)
Stars: ✭ 319 (+490.74%)
Rezero: Official PyTorch repo for "ReZero is All You Need: Fast Convergence at Large Depth"
Stars: ✭ 317 (+487.04%)
Seq2seq: Minimal seq2seq model with attention for neural machine translation in PyTorch
Stars: ✭ 552 (+922.22%)
Pyhgt: Code for "Heterogeneous Graph Transformer" (WWW '20), based on pytorch_geometric
Stars: ✭ 313 (+479.63%)
Gpt2 French: GPT-2 French demo
Stars: ✭ 47 (-12.96%)
Nlp Experiments In Pytorch: PyTorch repository for text categorization and NER experiments in Turkish and English.
Stars: ✭ 35 (-35.19%)
Bert Keras: Keras implementation of BERT with pre-trained weights
Stars: ✭ 820 (+1418.52%)
Cognitive Speech Tts: Microsoft Text-to-Speech API sample code in several languages, part of Cognitive Services.
Stars: ✭ 312 (+477.78%)
Seq2seq chatbot: A TensorFlow implementation of a simple dialogue system based on a seq2seq model, with embedding, attention, and beam_search features; the dataset is Cornell Movie Dialogs.
Stars: ✭ 308 (+470.37%)