Witwicky: An implementation of the Transformer in PyTorch.
Stars: ✭ 21 (-63.16%)
Tf Seq2seq: Sequence-to-sequence learning using TensorFlow.
Stars: ✭ 387 (+578.95%)
Bentools Etl: PHP ETL (Extract / Transform / Load) library with SOLID principles and almost no dependencies.
Stars: ✭ 45 (-21.05%)
Flow Forecast: Deep learning PyTorch library for time series forecasting, classification, and anomaly detection (originally for flood forecasting).
Stars: ✭ 368 (+545.61%)
Transgan: [Preprint] "TransGAN: Two Transformers Can Make One Strong GAN", by Yifan Jiang, Shiyu Chang, and Zhangyang Wang.
Stars: ✭ 864 (+1415.79%)
Transformer: A TensorFlow implementation of the Transformer from "Attention Is All You Need".
Stars: ✭ 3,646 (+6296.49%)
Awesome Bert Nlp: A curated list of NLP resources focused on BERT, the attention mechanism, Transformer networks, and transfer learning.
Stars: ✭ 567 (+894.74%)
Mojito: Transition effects for WeChat-style and Bilibili-style large images, long images, GIFs, videos, and custom views on Android.
Stars: ✭ 1,068 (+1773.68%)
Contextualized Topic Models: A Python package for contextualized topic modeling. CTMs combine BERT with topic models to produce coherent topics, and also support multilingual tasks. Cross-lingual zero-shot model published at EACL 2021.
Stars: ✭ 318 (+457.89%)
Bert paper chinese translation: Chinese translation of the paper "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding".
Stars: ✭ 564 (+889.47%)
Transformer Tensorflow: TensorFlow implementation of "Attention Is All You Need" (June 2017).
Stars: ✭ 319 (+459.65%)
Cell Detr: Official, maintained implementation of the paper "Attention-Based Transformers for Instance Segmentation of Cells in Microstructures" [BIBM 2020].
Stars: ✭ 26 (-54.39%)
Pyhgt: Code for "Heterogeneous Graph Transformer" (WWW '20), built on pytorch_geometric.
Stars: ✭ 313 (+449.12%)
Sentencepiece: Unsupervised text tokenizer for neural-network-based text generation.
Stars: ✭ 5,540 (+9619.3%)
Cognitive Speech Tts: Microsoft Text-to-Speech API sample code in several languages; part of Cognitive Services.
Stars: ✭ 312 (+447.37%)
linformer: Implementation of Linformer in PyTorch.
Stars: ✭ 119 (+108.77%)
Vedastr: A scene text recognition toolbox based on PyTorch.
Stars: ✭ 290 (+408.77%)
Turbotransformers: A fast, user-friendly runtime for transformer inference (BERT, ALBERT, GPT-2, decoders, etc.) on CPU and GPU.
Stars: ✭ 826 (+1349.12%)
Transformer: Easy attributed string creator.
Stars: ✭ 278 (+387.72%)
Seq2seq.pytorch: Sequence-to-sequence learning using PyTorch.
Stars: ✭ 514 (+801.75%)
Transformer: Implementation of the Transformer model (originally from "Attention Is All You Need") applied to time series.
Stars: ✭ 273 (+378.95%)
Remi: "Pop Music Transformer: Beat-based Modeling and Generation of Expressive Pop Piano Compositions", ACM Multimedia 2020.
Stars: ✭ 273 (+378.95%)
Former: Simple transformer implementation from scratch in PyTorch.
Stars: ✭ 500 (+777.19%)
Nlp Interview Notes: Study notes and materials for natural language processing (NLP) interview preparation, compiled from the authors' own interviews and experience; currently covers accumulated interview questions across NLP subfields.
Stars: ✭ 207 (+263.16%)
Marian: Fast neural machine translation in C++.
Stars: ✭ 777 (+1263.16%)
bert in a flask: A dockerized Flask API serving ALBERT and BERT predictions using TensorFlow 2.0.
Stars: ✭ 32 (-43.86%)
Indian ParallelCorpus: Curated list of publicly available parallel corpora for Indian languages.
Stars: ✭ 23 (-59.65%)
Jazz transformer: Transformer-XL for jazz music composition. Paper: "The Jazz Transformer on the Front Line: Exploring the Shortcomings of AI-Composed Music through Quantitative Measures", ISMIR 2020.
Stars: ✭ 36 (-36.84%)
uformer-pytorch: Implementation of Uformer, an attention-based U-Net, in PyTorch.
Stars: ✭ 54 (-5.26%)
banglanmt: Code and data for the paper "Not Low-Resource Anymore: Aligner Ensembling, Batch Filtering, and New Datasets for Bengali-English Machine Translation", published in Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP 2020).
Stars: ✭ 91 (+59.65%)
Getting Things Done With Pytorch: Jupyter notebook tutorials on solving real-world problems with machine learning and deep learning using PyTorch. Topics: face detection with Detectron2, time series anomaly detection with LSTM autoencoders, object detection with YOLOv5, building your first neural network, time series forecasting for daily coronavirus cases, and sentiment analysis with BERT.
Stars: ✭ 738 (+1194.74%)
galerkin-transformer: [NeurIPS 2021] Galerkin Transformer: linear attention without softmax.
Stars: ✭ 111 (+94.74%)
Omninet: Official PyTorch implementation of "OmniNet: A unified architecture for multi-modal multi-task learning", by Subhojeet Pramanik, Priyanka Agrawal, and Aman Hussain.
Stars: ✭ 448 (+685.96%)
TextPruner: A PyTorch-based model pruning toolkit for pre-trained language models.
Stars: ✭ 94 (+64.91%)
Gpt2 French: GPT-2 French demo.
Stars: ✭ 47 (-17.54%)
charformer-pytorch: Implementation of the GBST block from the Charformer paper, in PyTorch.
Stars: ✭ 74 (+29.82%)
Bert Pytorch: PyTorch implementation of Google AI's 2018 BERT.
Stars: ✭ 4,642 (+8043.86%)
Rasa chatbot cn: A Chinese dialogue system built on the newest version of Rasa.
Stars: ✭ 723 (+1168.42%)
pytorch basic nmt: A simple yet strong implementation of neural machine translation in PyTorch.
Stars: ✭ 66 (+15.79%)
Transformer Tts: A PyTorch implementation of "Neural Speech Synthesis with Transformer Network".
Stars: ✭ 418 (+633.33%)
Meta Emb: Multilingual meta-embeddings for named entity recognition (RepL4NLP & EMNLP 2019).
Stars: ✭ 28 (-50.88%)
PAML: Personalizing Dialogue Agents via Meta-Learning.
Stars: ✭ 114 (+100%)
Trax: Deep learning with clear code and speed.
Stars: ✭ 6,666 (+11594.74%)
Pytorch Original Transformer: An implementation of the original Transformer model (Vaswani et al.), with a playground.py file for visualizing otherwise hard-to-grasp concepts. Currently includes IWSLT pretrained models.
Stars: ✭ 411 (+621.05%)
Filipino-Text-Benchmarks: Open-source benchmark datasets and pretrained transformer models for the Filipino language.
Stars: ✭ 22 (-61.4%)
attention-is-all-you-need-paper: Implementation of Vaswani et al., "Attention is all you need", Advances in Neural Information Processing Systems, 2017.
Stars: ✭ 97 (+70.18%)
Easyflipviewpager: 📖 A library for creating book and card flip animations in ViewPager on Android.
Stars: ✭ 698 (+1124.56%)
Neural sp: End-to-end ASR/LM implementation with PyTorch.
Stars: ✭ 408 (+615.79%)
Deepsvg: [NeurIPS 2020] Official code for the paper "DeepSVG: A Hierarchical Generative Network for Vector Graphics Animation"; includes a PyTorch library for deep learning with SVG data.
Stars: ✭ 403 (+607.02%)