Transformer-MM-Explainability - [ICCV 2021 Oral] Official PyTorch implementation of "Generic Attention-model Explainability for Interpreting Bi-Modal and Encoder-Decoder Transformers", a novel method to visualize any Transformer-based network. Includes examples for DETR and VQA.
Stars: ✭ 484 (+2204.76%)
Presento - Transformer & Presenter Package for PHP
Stars: ✭ 71 (+238.1%)
wiki-tui - A simple and easy-to-use Wikipedia Text User Interface
Stars: ✭ 74 (+252.38%)
Mixture Of Experts - A PyTorch implementation of Sparsely-Gated Mixture of Experts, for massively increasing the parameter count of language models
Stars: ✭ 68 (+223.81%)
prototype - 📖 Prototype Document
Stars: ✭ 45 (+114.29%)
Deeplearning Nlp Models - A small, interpretable codebase containing re-implementations of a few "deep" NLP models in PyTorch, with Colab notebooks to run on GPUs. Models: word2vec, CNNs, transformer, GPT.
Stars: ✭ 64 (+204.76%)
alpr utils - ALPR model for Chinese license plates in unconstrained scenarios
Stars: ✭ 158 (+652.38%)
Transformer Dynet - An implementation of the Transformer ("Attention Is All You Need") in DyNet
Stars: ✭ 57 (+171.43%)
YaEtl - Yet Another ETL in PHP
Stars: ✭ 60 (+185.71%)
Bert-text-classification - Shows how to fine-tune the BERT language model and use PyTorch-Transformers for text classification
Stars: ✭ 54 (+157.14%)
verssion - RSS feeds of stable release versions, as found on Wikipedia.
Stars: ✭ 15 (-28.57%)
Gpt2 French - GPT-2 French demo
Stars: ✭ 47 (+123.81%)
Sockeye - Sequence-to-sequence framework with a focus on Neural Machine Translation, based on Apache MXNet
Stars: ✭ 990 (+4614.29%)
Jazz transformer - Transformer-XL for jazz music composition. Paper: "The Jazz Transformer on the Front Line: Exploring the Shortcomings of AI-Composed Music through Quantitative Measures", ISMIR 2020
Stars: ✭ 36 (+71.43%)
enformer-pytorch - Implementation of Enformer, DeepMind's attention network for predicting gene expression, in PyTorch
Stars: ✭ 146 (+595.24%)
robo-vln - PyTorch code for the ICRA '21 paper "Hierarchical Cross-Modal Agent for Robotics Vision-and-Language Navigation"
Stars: ✭ 34 (+61.9%)
php-json-api - JSON API transformer outputting valid (PSR-7) API responses.
Stars: ✭ 68 (+223.81%)
WikimediaUI-Style-Guide - Wikimedia Design Style Guide with a user-interface focus, authored by the Wikimedia Foundation Design team.
Stars: ✭ 93 (+342.86%)
Witwicky - An implementation of the Transformer in PyTorch.
Stars: ✭ 21 (+0%)
ExpBERT - Code for the ACL '20 paper "Representation Engineering with Natural Language Explanations"
Stars: ✭ 28 (+33.33%)
Figma Transformer - A tiny utility library that makes the Figma API more human-friendly.
Stars: ✭ 27 (+28.57%)
GTSRB Keras STN - German Traffic Sign Recognition Benchmark: a Keras implementation with Spatial Transformer Networks
Stars: ✭ 48 (+128.57%)
Odsc 2020 nlp - Repository for an ODSC talk on deep learning for NLP
Stars: ✭ 20 (-4.76%)
wikibot - A 🤖 that provides features from Wikipedia, such as summaries, title searches, and a location API.
Stars: ✭ 25 (+19.05%)
Bert Keras - Keras implementation of BERT with pre-trained weights
Stars: ✭ 820 (+3804.76%)
FinBERT - A pretrained BERT model for financial communications. https://arxiv.org/abs/2006.08097
Stars: ✭ 193 (+819.05%)
Rasa chatbot cn - Building a Chinese dialogue system based on the newest version of Rasa
Stars: ✭ 723 (+3342.86%)
Easyflipviewpager - 📖 A library for creating book and card flip animations in ViewPager on Android
Stars: ✭ 698 (+3223.81%)
Laravel Responder - A Laravel Fractal package for building API responses, giving you the power of Fractal with Laravel's elegance.
Stars: ✭ 673 (+3104.76%)
RSTNet - Captioning with Adaptive Attention on Visual and Non-Visual Words (CVPR 2021)
Stars: ✭ 71 (+238.1%)
Awesome Bert Nlp - A curated list of NLP resources focused on BERT, attention mechanisms, Transformer networks, and transfer learning.
Stars: ✭ 567 (+2600%)
pytorch-gpt-x - Implementation of an autoregressive language model using an improved Transformer and DeepSpeed pipeline parallelism.
Stars: ✭ 21 (+0%)
Bert paper chinese translation - Chinese translation of the paper "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding"
Stars: ✭ 564 (+2585.71%)
text simplification - Text simplification model based on an encoder-decoder architecture (includes Transformer and Seq2Seq variants).
Stars: ✭ 66 (+214.29%)
Athena - An open-source implementation of a sequence-to-sequence-based speech processing engine
Stars: ✭ 542 (+2480.95%)
textgo - Text preprocessing, representation, similarity calculation, text search, and classification. Let's go and play with text!
Stars: ✭ 33 (+57.14%)
Former - Simple transformer implementation from scratch in PyTorch.
Stars: ✭ 500 (+2280.95%)
tech-seo-crawler - Build a small, three-domain internet using GitHub Pages and Wikipedia, and construct a crawler to crawl, render, and index it.
Stars: ✭ 57 (+171.43%)
Nmt Keras - Neural Machine Translation with Keras
Stars: ✭ 501 (+2285.71%)
xtools - A suite of tools to analyze page, user, and project data of MediaWiki websites
Stars: ✭ 78 (+271.43%)
context-cards - Wikipedia page previews for any site
Stars: ✭ 29 (+38.1%)
DolboNet - A Russian-language chat bot for Discord built on the Transformer architecture
Stars: ✭ 53 (+152.38%)
graphtrans - Representing Long-Range Context for Graph Neural Networks with Global Attention
Stars: ✭ 45 (+114.29%)
wikicrush - Processor scripts that crush Wikipedia dumps into a dense binary format that is easy to pathfind with.
Stars: ✭ 46 (+119.05%)
pywikibot-scripts - Custom Pywikibot scripts (for Wikimedia projects)
Stars: ✭ 16 (-23.81%)
Pg similarity - A set of functions and operators for executing similarity queries
Stars: ✭ 250 (+1090.48%)