
6,491 open-source projects that are alternatives to or similar to Jddc_solution_4th

Bertqa Attention On Steroids
BertQA - Attention on Steroids
Stars: ✭ 112 (-52.34%)
Mutual labels:  jupyter-notebook, attention, qa, transformer
Deeplearning Nlp Models
A small, interpretable codebase containing the re-implementation of a few "deep" NLP models in PyTorch. Includes Colab notebooks to run with GPUs. Models: word2vec, CNNs, transformer, GPT.
Stars: ✭ 64 (-72.77%)
Nlp Tutorial
Natural Language Processing Tutorial for Deep Learning Researchers
Stars: ✭ 9,895 (+4110.64%)
Pytorch Original Transformer
My implementation of the original transformer model (Vaswani et al.). I've additionally included the playground.py file for visualizing otherwise seemingly hard concepts. Currently includes IWSLT pretrained models.
Stars: ✭ 411 (+74.89%)
Pytorch Seq2seq
Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Stars: ✭ 3,418 (+1354.47%)
Dab
Data Augmentation by Backtranslation (DAB) ヽ( •_-)ᕗ
Stars: ✭ 294 (+25.11%)
Mutual labels:  jupyter-notebook, transformer
Text Classification Models Pytorch
Implementation of State-of-the-art Text Classification Models in Pytorch
Stars: ✭ 379 (+61.28%)
Mutual labels:  attention, transformer
Nlp Tutorials
Simple implementations of NLP models. Tutorials are written in Chinese on my website https://mofanpy.com
Stars: ✭ 394 (+67.66%)
Mutual labels:  attention, transformer
Nn
🧑‍🏫 50! Implementations/tutorials of deep learning papers with side-by-side notes 📝; including transformers (original, xl, switch, feedback, vit, ...), optimizers (adam, adabelief, ...), gans(cyclegan, stylegan2, ...), 🎮 reinforcement learning (ppo, dqn), capsnet, distillation, ... 🧠
Stars: ✭ 5,720 (+2334.04%)
Mutual labels:  jupyter-notebook, transformer
Getting Things Done With Pytorch
Jupyter Notebook tutorials on solving real-world problems with Machine Learning & Deep Learning using PyTorch. Topics: Face detection with Detectron 2, Time Series anomaly detection with LSTM Autoencoders, Object Detection with YOLO v5, Build your first Neural Network, Time Series forecasting for Coronavirus daily cases, Sentiment Analysis with BERT.
Stars: ✭ 738 (+214.04%)
Mutual labels:  jupyter-notebook, transformer
Attentive Neural Processes
Implementation of "Recurrent Attentive Neural Processes" for forecasting power usage (with an LSTM baseline and MC Dropout)
Stars: ✭ 33 (-85.96%)
Mutual labels:  jupyter-notebook, attention
Vietnamese Electra
Electra pre-trained model using Vietnamese corpus
Stars: ✭ 55 (-76.6%)
Mutual labels:  jupyter-notebook, transformer
Keras Transformer
Transformer implemented in Keras
Stars: ✭ 273 (+16.17%)
Mutual labels:  attention, transformer
Transformer
Implementation of Transformer model (originally from Attention is All You Need) applied to Time Series.
Stars: ✭ 273 (+16.17%)
Mutual labels:  jupyter-notebook, transformer
Question generation
Neural question generation using transformers
Stars: ✭ 356 (+51.49%)
Mutual labels:  jupyter-notebook, transformer
Transformer Tensorflow
TensorFlow implementation of 'Attention Is All You Need (2017. 6)'
Stars: ✭ 319 (+35.74%)
Mutual labels:  attention, transformer
Tsai
State-of-the-art deep learning with time series and sequences in PyTorch / fastai
Stars: ✭ 407 (+73.19%)
Mutual labels:  jupyter-notebook, transformer
Speech Transformer
A PyTorch implementation of Speech Transformer, an End-to-End ASR with Transformer network on Mandarin Chinese.
Stars: ✭ 565 (+140.43%)
Mutual labels:  attention, transformer
Nlp tensorflow project
Uses TensorFlow to implement several NLP projects, e.g. classification, chatbot, NER, attention, and QA.
Stars: ✭ 27 (-88.51%)
Mutual labels:  attention, qa
Deepsvg
[NeurIPS 2020] Official code for the paper "DeepSVG: A Hierarchical Generative Network for Vector Graphics Animation". Includes a PyTorch library for deep learning with SVG data.
Stars: ✭ 403 (+71.49%)
Mutual labels:  jupyter-notebook, transformer
Attention Transfer
Improving Convolutional Networks via Attention Transfer (ICLR 2017)
Stars: ✭ 1,231 (+423.83%)
Mutual labels:  jupyter-notebook, attention
Smiles Transformer
Original implementation of the paper "SMILES Transformer: Pre-trained Molecular Fingerprint for Low Data Drug Discovery" by Shion Honda et al.
Stars: ✭ 86 (-63.4%)
Mutual labels:  jupyter-notebook, transformer
Scientificsummarizationdatasets
Datasets I have created for scientific summarization, and a trained BertSum model
Stars: ✭ 100 (-57.45%)
Mutual labels:  jupyter-notebook, transformer
Nlp Models Tensorflow
Gathers machine learning and Tensorflow deep learning models for NLP problems, 1.13 < Tensorflow < 2.0
Stars: ✭ 1,603 (+582.13%)
Mutual labels:  jupyter-notebook, attention
Multiturndialogzoo
Multi-turn dialogue baselines written in PyTorch
Stars: ✭ 106 (-54.89%)
Mutual labels:  attention, transformer
Sightseq
Computer vision tools for fairseq, containing PyTorch implementation of text recognition and object detection
Stars: ✭ 116 (-50.64%)
Mutual labels:  attention, transformer
Multihead Siamese Nets
Implementation of Siamese Neural Networks built upon a multi-head attention mechanism for the text semantic similarity task (see the sketch after this entry).
Stars: ✭ 144 (-38.72%)
Mutual labels:  jupyter-notebook, attention
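
As a rough illustration of the Siamese-plus-multi-head-attention idea above, here is a hedged PyTorch sketch in which one shared nn.MultiheadAttention encoder scores two sentences by cosine similarity. The class name, vocabulary size, and pooling choice are assumptions made for the example, not the repository's actual architecture.

```python
# A hedged sketch (not the repo's actual code) of a Siamese encoder that
# shares one nn.MultiheadAttention layer between two sentences and scores
# their similarity by cosine distance; all names and sizes are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SiameseAttentionEncoder(nn.Module):
    def __init__(self, vocab_size=10000, d_model=128, num_heads=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.attn = nn.MultiheadAttention(d_model, num_heads, batch_first=True)

    def encode(self, token_ids):
        x = self.embed(token_ids)            # (batch, seq_len, d_model)
        ctx, _ = self.attn(x, x, x)          # self-attention over the sentence
        return ctx.mean(dim=1)               # mean-pool into a sentence vector

    def forward(self, sent_a, sent_b):
        # The same weights encode both sentences (the "Siamese" part).
        return F.cosine_similarity(self.encode(sent_a), self.encode(sent_b))

# Toy usage with random token ids for two 12-token sentences.
model = SiameseAttentionEncoder()
a = torch.randint(0, 10000, (2, 12))
b = torch.randint(0, 10000, (2, 12))
print(model(a, b))  # tensor of 2 similarity scores in [-1, 1]
```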
ai challenger 2018 sentiment analysis
Fine-grained Sentiment Analysis of User Reviews --- AI CHALLENGER 2018
Stars: ✭ 16 (-93.19%)
Mutual labels:  transformer, attention
Visual-Transformer-Paper-Summary
Summary of Transformer applications for computer vision tasks.
Stars: ✭ 51 (-78.3%)
Mutual labels:  transformer, attention
visualization
A collection of visualization functions
Stars: ✭ 189 (-19.57%)
Mutual labels:  transformer, attention
Ner Bert
BERT-NER (nert-bert) with Google BERT (https://github.com/google-research).
Stars: ✭ 339 (+44.26%)
Mutual labels:  jupyter-notebook, attention
Beeva Best Practices
Best Practices and Style Guides in BEEVA
Stars: ✭ 335 (+42.55%)
Mutual labels:  jupyter-notebook, qa
Bert Multitask Learning
BERT for Multitask Learning
Stars: ✭ 380 (+61.7%)
Mutual labels:  jupyter-notebook, transformer
Relation-Extraction-Transformer
NLP: Relation extraction with position-aware self-attention transformer
Stars: ✭ 63 (-73.19%)
Mutual labels:  transformer, attention
Neural sp
End-to-end ASR/LM implementation with PyTorch
Stars: ✭ 408 (+73.62%)
Mutual labels:  attention, transformer
Awesome Fast Attention
A list of efficient attention modules
Stars: ✭ 627 (+166.81%)
Mutual labels:  attention, transformer
Deep learning nlp
Keras, PyTorch, and NumPy Implementations of Deep Learning Architectures for NLP
Stars: ✭ 407 (+73.19%)
Mutual labels:  jupyter-notebook, attention
Cell Detr
Official and maintained implementation of the paper Attention-Based Transformers for Instance Segmentation of Cells in Microstructures [BIBM 2020].
Stars: ✭ 26 (-88.94%)
Mutual labels:  attention, transformer
Pytorch Gat
My implementation of the original GAT paper (Veličković et al.). I've additionally included the playground.py file for visualizing the Cora dataset, GAT embeddings, an attention mechanism, and entropy histograms. I've supported both Cora (transductive) and PPI (inductive) examples!
Stars: ✭ 908 (+286.38%)
Mutual labels:  jupyter-notebook, attention
Gpt2 French
GPT-2 French demo
Stars: ✭ 47 (-80%)
Mutual labels:  jupyter-notebook, transformer
transformer
A PyTorch implementation of "Attention Is All You Need" (a minimal sketch of its core attention operation follows this entry)
Stars: ✭ 28 (-88.09%)
Mutual labels:  transformer, attention
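
Many of the projects in this list re-implement the Transformer from "Attention Is All You Need"; the snippet below is a minimal, self-contained sketch of its scaled dot-product attention in PyTorch. It is illustrative only: the tensor shapes, the optional `mask` argument, and the toy inputs are assumptions, not code taken from any listed repository.

```python
# A minimal sketch of scaled dot-product attention as described in
# "Attention Is All You Need"; shapes and the `mask` argument are
# illustrative assumptions, not taken from any one repo above.
import math
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v, mask=None):
    # q, k, v: (batch, heads, seq_len, d_k)
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / math.sqrt(d_k)   # (batch, heads, q_len, k_len)
    if mask is not None:
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = F.softmax(scores, dim=-1)                  # attention distribution
    return weights @ v, weights

# Toy usage: one batch, 4 heads, 10-token sequence, 16-dim heads.
q = k = v = torch.randn(1, 4, 10, 16)
out, attn = scaled_dot_product_attention(q, k, v)
print(out.shape, attn.shape)  # (1, 4, 10, 16) and (1, 4, 10, 10)
```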
Machine Learning
My Attempt(s) In The World Of ML/DL....
Stars: ✭ 78 (-66.81%)
Mutual labels:  jupyter-notebook, attention
Rnn For Joint Nlu
Pytorch implementation of "Attention-Based Recurrent Neural Network Models for Joint Intent Detection and Slot Filling" (https://arxiv.org/abs/1609.01454)
Stars: ✭ 176 (-25.11%)
Mutual labels:  jupyter-notebook, attention
Njunmt Tf
An open-source neural machine translation system developed by Natural Language Processing Group, Nanjing University.
Stars: ✭ 97 (-58.72%)
Mutual labels:  attention, transformer
Indonesian Language Models
Indonesian language models and their usage
Stars: ✭ 64 (-72.77%)
Mutual labels:  jupyter-notebook, transformer
Tensorflow Ml Nlp
Natural Language Processing with TensorFlow and Machine Learning (from logistic regression to a Transformer chatbot)
Stars: ✭ 176 (-25.11%)
Mutual labels:  jupyter-notebook, transformer
Graphtransformer
Graph Transformer Architecture. Source code for "A Generalization of Transformer Networks to Graphs", DLG-AAAI'21.
Stars: ✭ 187 (-20.43%)
Mutual labels:  attention, transformer
Getting Started With Google Bert
Build and train state-of-the-art natural language processing models using BERT
Stars: ✭ 107 (-54.47%)
Mutual labels:  jupyter-notebook, transformer
Chinese Chatbot
A Chinese chatbot trained on 100,000 dialogue pairs using an attention mechanism; it generates a meaningful reply to most ordinary questions. The trained model has been uploaded and can be run directly (if it won't run, the author promises to livestream eating a keyboard).
Stars: ✭ 124 (-47.23%)
Mutual labels:  jupyter-notebook, attention
Attentionn
All about attention in neural networks. Soft attention, attention maps, local and global attention and multi-head attention.
Stars: ✭ 175 (-25.53%)
Mutual labels:  jupyter-notebook, attention
Transformers.jl
Julia Implementation of Transformer models
Stars: ✭ 173 (-26.38%)
Mutual labels:  attention, transformer
Graph attention pool
Attention over nodes in Graph Neural Networks using PyTorch (NeurIPS 2019)
Stars: ✭ 186 (-20.85%)
Mutual labels:  jupyter-notebook, attention
Hey Jetson
Deep learning-based Automatic Speech Recognition with attention for the Nvidia Jetson.
Stars: ✭ 161 (-31.49%)
Mutual labels:  jupyter-notebook, attention
Sttn
[ECCV'2020] STTN: Learning Joint Spatial-Temporal Transformations for Video Inpainting
Stars: ✭ 211 (-10.21%)
Mutual labels:  jupyter-notebook, transformer
learningspoons
NLP lecture notes and source code
Stars: ✭ 29 (-87.66%)
Mutual labels:  transformer, attention
CrabNet
Predict materials properties using only the composition information!
Stars: ✭ 57 (-75.74%)
Mutual labels:  transformer, attention
Attention Over Attention Tf Qa
Implementation of the AoA model from the paper "Attention-over-Attention Neural Networks for Reading Comprehension" (a minimal sketch of the AoA computation follows this entry)
Stars: ✭ 58 (-75.32%)
Mutual labels:  attention, qa
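
For readers unfamiliar with the AoA model referenced above, the following is a small PyTorch sketch of the attention-over-attention computation (pairwise match matrix, column-wise and row-wise softmax, averaged query attention). The random "encoder" outputs are placeholders; this is an assumed illustration, not the repository's TensorFlow code.

```python
# A hedged sketch of the attention-over-attention (AoA) computation from
# "Attention-over-Attention Neural Networks for Reading Comprehension";
# the random encoder outputs below are placeholders, not the repo's model.
import torch
import torch.nn.functional as F

def attention_over_attention(doc, query):
    # doc:   (doc_len, hidden) contextual embeddings of the document
    # query: (qry_len, hidden) contextual embeddings of the query
    M = doc @ query.t()          # pairwise match scores (doc_len, qry_len)
    alpha = F.softmax(M, dim=0)  # query-to-document attention, per query word
    beta = F.softmax(M, dim=1)   # document-to-query attention, per document word
    beta_avg = beta.mean(dim=0)  # averaged query-level attention (qry_len,)
    return alpha @ beta_avg      # attended attention over document words (doc_len,)

# Toy usage with random "encoder" outputs: 50-word document, 8-word query.
s = attention_over_attention(torch.randn(50, 128), torch.randn(8, 128))
print(s.shape, float(s.sum()))  # torch.Size([50]), sums to ~1
```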
Medical Transformer
PyTorch code for "Medical Transformer: Gated Axial-Attention for Medical Image Segmentation"
Stars: ✭ 153 (-34.89%)
Mutual labels:  attention, transformer
Doc Han Att
Hierarchical Attention Networks for Chinese Sentiment Classification
Stars: ✭ 206 (-12.34%)
Mutual labels:  jupyter-notebook, attention
1-60 of 6491 similar projects