TextPruner: A PyTorch-based model pruning toolkit for pre-trained language models
Stars: ✭ 94 (+42.42%)
Graphormer: A deep learning package that allows researchers and developers to train custom models for molecule modeling tasks. It aims to accelerate research and application of AI in molecular science, such as material design and drug discovery.
Stars: ✭ 1,194 (+1709.09%)
Cell Detr: Official and maintained implementation of the paper "Attention-Based Transformers for Instance Segmentation of Cells in Microstructures" [BIBM 2020].
Stars: ✭ 26 (-60.61%)
fer: Facial Expression Recognition
Stars: ✭ 32 (-51.52%)
charformer-pytorch: Implementation of the GBST block from the Charformer paper, in PyTorch
Stars: ✭ 74 (+12.12%)
kaggledatasets: Collection of Kaggle datasets ready to use for everyone (looking for contributors)
Stars: ✭ 44 (-33.33%)
h-transformer-1d: Implementation of H-Transformer-1D, Hierarchical Attention for Sequence Learning
Stars: ✭ 121 (+83.33%)
Viewpagertransformer: ViewPager animations with many eye-catching effects, including fade, rotation, scale, 3D, and cube transitions, implemented via a custom ViewpagerTransformer; you can also define your own animations.
Stars: ✭ 62 (-6.06%)
linformer: Implementation of Linformer for PyTorch
Stars: ✭ 119 (+80.3%)
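Linformer's core idea is to project the sequence-length dimension of the keys and values down to a fixed size k, making self-attention linear rather than quadratic in sequence length. A minimal NumPy sketch of that idea (the projection matrices E and F follow the paper's naming; the random projections here stand in for learned ones):

```python
import numpy as np

def linformer_attention(Q, K, V, E, F):
    """Linformer-style attention sketch: compress the n key/value rows
    to k rows with projections E, F, so the softmax is over (n, k)
    scores instead of (n, n). Shapes and names are illustrative."""
    d = Q.shape[-1]
    K_proj = E @ K                              # (k, d): compressed keys
    V_proj = F @ V                              # (k, d): compressed values
    scores = Q @ K_proj.T / np.sqrt(d)          # (n, k) similarity scores
    w = np.exp(scores - scores.max(-1, keepdims=True))
    w /= w.sum(-1, keepdims=True)               # row-wise softmax
    return w @ V_proj                           # (n, d) output

rng = np.random.default_rng(1)
n, k, d = 16, 4, 8
Q, K, V = (rng.standard_normal((n, d)) for _ in range(3))
E = rng.standard_normal((k, n)) / np.sqrt(n)
F = rng.standard_normal((k, n)) / np.sqrt(n)
out = linformer_attention(Q, K, V, E, F)
```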
DolboNet: Russian-language chatbot for Discord built on the Transformer architecture
Stars: ✭ 53 (-19.7%)
Kaggle Homedepot: 3rd place solution for the HomeDepot Product Search Results Relevance competition on Kaggle.
Stars: ✭ 452 (+584.85%)
PAML: Personalizing Dialogue Agents via Meta-Learning
Stars: ✭ 114 (+72.73%)
kdsb17: Gaussian Mixture Convolutional AutoEncoder applied to CT lung scans from the Kaggle Data Science Bowl 2017
Stars: ✭ 18 (-72.73%)
Deepfake Detection: Detect whether a video is fake using InceptionResNetV2.
Stars: ✭ 23 (-65.15%)
TRAR-VQA: Official implementation of "TRAR: Routing the Attention Spans in Transformers for Visual Question Answering" [ICCV 2021].
Stars: ✭ 49 (-25.76%)
attention-is-all-you-need-paper: Implementation of Vaswani, Ashish, et al. "Attention is all you need." Advances in Neural Information Processing Systems, 2017.
Stars: ✭ 97 (+46.97%)
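The scaled dot-product attention at the heart of the paper above is compact enough to sketch directly: Attention(Q, K, V) = softmax(QKᵀ/√d_k)V. A minimal NumPy version (shapes and variable names are illustrative, not taken from any of the listed repos):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention from Vaswani et al. (2017):
    softmax(Q K^T / sqrt(d_k)) V, computed row-wise over queries."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                       # (n_q, n_k) scores
    weights = np.exp(scores - scores.max(-1, keepdims=True))
    weights /= weights.sum(-1, keepdims=True)             # row-wise softmax
    return weights @ V, weights                           # (n_q, d_v) output

rng = np.random.default_rng(0)
Q = rng.standard_normal((4, 8))   # 4 queries of dimension 8
K = rng.standard_normal((6, 8))   # 6 keys
V = rng.standard_normal((6, 8))   # 6 values
out, w = scaled_dot_product_attention(Q, K, V)
```

Each output row is a convex combination of the value rows, weighted by how strongly the corresponding query matches each key.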
Data Science Ipython Notebooks: Data science Python notebooks covering deep learning (TensorFlow, Theano, Caffe, Keras), scikit-learn, Kaggle, big data (Spark, Hadoop MapReduce, HDFS), matplotlib, pandas, NumPy, SciPy, Python essentials, AWS, and various command lines.
Stars: ✭ 22,048 (+33306.06%)
proc-that: proc(ess)-that, an easily extendable ETL tool for Node.js, written in TypeScript.
Stars: ✭ 25 (-62.12%)
pynmt: A simple and complete PyTorch implementation of a neural machine translation system
Stars: ✭ 13 (-80.3%)
kaggle-airbnb: 🌍 Where will a new guest book their first travel experience?
Stars: ✭ 53 (-19.7%)
argus-tgs-salt: Kaggle | 14th place solution for the TGS Salt Identification Challenge
Stars: ✭ 73 (+10.61%)
kospeech: Open-Source Toolkit for End-to-End Korean Automatic Speech Recognition leveraging PyTorch and Hydra.
Stars: ✭ 456 (+590.91%)
Jukebox: Code for the paper "Jukebox: A Generative Model for Music"
Stars: ✭ 4,863 (+7268.18%)
digit recognizer: CNN digit recognizer implemented in a Keras notebook, Kaggle/MNIST (0.995).
Stars: ✭ 27 (-59.09%)
En-transformer: Implementation of E(n)-Transformer, which extends the ideas of Welling's E(n)-Equivariant Graph Neural Network to attention
Stars: ✭ 131 (+98.48%)
Transformer-MM-Explainability: [ICCV 2021, Oral] Official PyTorch implementation of "Generic Attention-model Explainability for Interpreting Bi-Modal and Encoder-Decoder Transformers", a novel method to visualize any Transformer-based network, including examples for DETR and VQA.
Stars: ✭ 484 (+633.33%)
alpr utils: ALPR model for Chinese license plates in unconstrained scenarios
Stars: ✭ 158 (+139.39%)
Joeynmt: Minimalist NMT for educational purposes
Stars: ✭ 420 (+536.36%)
YaEtl: Yet Another ETL in PHP
Stars: ✭ 60 (-9.09%)
kosr: Korean speech recognition based on the Transformer architecture
Stars: ✭ 25 (-62.12%)
imdb-transformer: A simple neural network for sentiment analysis, embedding sentences using a Transformer network.
Stars: ✭ 26 (-60.61%)
Data Science Bowl 2018: End-to-end one-class instance segmentation based on the U-Net architecture for Data Science Bowl 2018 on Kaggle
Stars: ✭ 56 (-15.15%)
catr: Image Captioning Using Transformer
Stars: ✭ 206 (+212.12%)
Pytorch Original Transformer: My implementation of the original Transformer model (Vaswani et al.). Additionally includes the playground.py file for visualizing otherwise seemingly hard concepts, and currently includes IWSLT pretrained models.
Stars: ✭ 411 (+522.73%)
text simplification: Text simplification model based on an encoder-decoder architecture (includes Transformer and Seq2Seq variants).
Stars: ✭ 66 (+0%)
amrlib: A Python library that makes AMR parsing, generation, and visualization simple.
Stars: ✭ 107 (+62.12%)
Neural-Scam-Artist: Web scraping, document deduplication & GPT-2 fine-tuning with a newly created scam dataset.
Stars: ✭ 18 (-72.73%)
Bert Keras: Keras implementation of BERT with pre-trained weights
Stars: ✭ 820 (+1142.42%)
image-classification: A collection of SOTA Image Classification Models in PyTorch
Stars: ✭ 70 (+6.06%)
Deeplearning Nlp Models: A small, interpretable codebase containing re-implementations of a few "deep" NLP models in PyTorch, with Colab notebooks to run on GPUs. Models: word2vec, CNNs, Transformer, GPT.
Stars: ✭ 64 (-3.03%)
Fraud Detection: Credit card fraud detection using ML: IEEE-style paper + Jupyter notebook
Stars: ✭ 58 (-12.12%)
Pytorch Toolbelt: PyTorch extensions for fast R&D prototyping and Kaggle farming
Stars: ✭ 942 (+1327.27%)
Data Science Competitions: The goal of this repo is to provide solutions to data science competitions (Kaggle, Data Hack, Machine Hack, Driven Data, etc.).
Stars: ✭ 572 (+766.67%)
Viewpagertransition: ViewPager with parallax pages, together with vertical sliding (or click) and activity transitions
Stars: ✭ 3,017 (+4471.21%)