Walk-Transformer: From Random Walks to Transformer for Learning Node Embeddings (ECML-PKDD 2020) (in PyTorch and TensorFlow)
Stars: ✭ 26 (-63.89%)
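As background for the random-walk-based entry above: such methods typically start by sampling fixed-length random walks over the graph, which are then fed to a sequence model. A minimal, generic sketch (not code from the Walk-Transformer repo; the `adj` adjacency-dict format and function name are illustrative assumptions):

```python
import random

def random_walks(adj, walk_length, walks_per_node, seed=0):
    """Sample fixed-length random walks from every node.

    adj: dict mapping each node to a list of its neighbors.
    Returns a list of walks, each a list of nodes.
    """
    rng = random.Random(seed)
    walks = []
    for node in adj:
        for _ in range(walks_per_node):
            walk = [node]
            for _ in range(walk_length - 1):
                neighbors = adj[walk[-1]]
                if not neighbors:  # dead end: stop this walk early
                    break
                walk.append(rng.choice(neighbors))
            walks.append(walk)
    return walks
```

The resulting walks play the role that sentences play in word-embedding training: each walk is treated as a token sequence for the downstream encoder.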
AdaSpeech: Adaptive Text to Speech for Custom Voice
Stars: ✭ 108 (+50%)
VT-UNet: [MICCAI 2022] Official PyTorch implementation of A Robust Volumetric Transformer for Accurate 3D Tumor Segmentation
Stars: ✭ 151 (+109.72%)
Onnxt5: Summarization, translation, sentiment analysis, text generation and more at blazing speed using a T5 version implemented in ONNX.
Stars: ✭ 143 (+98.61%)
verseagility: Ramp up your custom natural language processing (NLP) task: bring your own data, use your preferred frameworks, and bring models into production.
Stars: ✭ 23 (-68.06%)
ClusterTransformer: Topic clustering library built on Transformer embeddings and cosine similarity metrics. Compatible with all BERT-base transformers from Hugging Face.
Stars: ✭ 36 (-50%)
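Several embedding-based entries in this list (ClusterTransformer above, cosine-ood-detector below) rely on cosine similarity between embedding vectors. A self-contained sketch of that metric (generic illustration, not code from either repo):

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between vectors a and b: dot(a, b) / (|a| * |b|)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)
```

Identical directions score 1.0, orthogonal vectors score 0.0, which makes the metric convenient for clustering embeddings regardless of their magnitude.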
Pytorch Seq2seq: Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Stars: ✭ 3,418 (+4647.22%)
StyleGAN demo: A re-implementation of the style-based generator idea
Stars: ✭ 22 (-69.44%)
gans-2.0: Generative Adversarial Networks in TensorFlow 2.0
Stars: ✭ 76 (+5.56%)
alpr utils: ALPR model for Chinese license plates in unconstrained scenarios
Stars: ✭ 158 (+119.44%)
style swap tensorflow: TensorFlow code for Fast Patch-based Style Transfer of Arbitrary Style
Stars: ✭ 42 (-41.67%)
MobileHumanPose: Official PyTorch implementation of MobileHumanPose: Toward real-time 3D human pose estimation in mobile devices (CVPRW 2021).
Stars: ✭ 206 (+186.11%)
sparql-transformer: A handier way to use SPARQL data in your web app
Stars: ✭ 38 (-47.22%)
TransBTS: Official code for: 1) TransBTS: Multimodal Brain Tumor Segmentation Using Transformer (https://arxiv.org/abs/2103.04430), accepted at MICCAI 2021; 2) TransBTSV2: Towards Better and More Efficient Volumetric Segmentation of Medical Images (https://arxiv.org/abs/2201.12785).
Stars: ✭ 254 (+252.78%)
YaEtl: Yet Another ETL in PHP
Stars: ✭ 60 (-16.67%)
kaggle-champs: Code for the CHAMPS Predicting Molecular Properties Kaggle competition
Stars: ✭ 49 (-31.94%)
onn: Online Deep Learning: Learning Deep Neural Networks on the Fly / Non-linear Contextual Bandit Algorithm (ONN_THS)
Stars: ✭ 139 (+93.06%)
cosine-ood-detector: Hyperparameter-Free Out-of-Distribution Detection Using Softmax of Scaled Cosine Similarity
Stars: ✭ 30 (-58.33%)
catr: Image Captioning Using Transformer
Stars: ✭ 206 (+186.11%)
Deep-Learning-Pytorch: A repo containing code covering various aspects of deep learning in PyTorch. Great for beginners and intermediates in the field.
Stars: ✭ 59 (-18.06%)
RandLA-Net-pytorch: 🍀 PyTorch implementation of RandLA-Net (https://arxiv.org/abs/1911.11236)
Stars: ✭ 69 (-4.17%)
DeepChannel: PyTorch implementation of the paper "DeepChannel: Salience Estimation by Contrastive Learning for Extractive Document Summarization"
Stars: ✭ 24 (-66.67%)
StyleCLIPDraw: Styled text-to-drawing synthesis method. Featured at IJCAI 2022 and the 2021 NeurIPS Workshop on Machine Learning for Creativity and Design
Stars: ✭ 247 (+243.06%)
Transformer-MM-Explainability: [ICCV 2021 Oral] Official PyTorch implementation of Generic Attention-model Explainability for Interpreting Bi-Modal and Encoder-Decoder Transformers, a novel method to visualize any Transformer-based network. Includes examples for DETR and VQA.
Stars: ✭ 484 (+572.22%)
text simplification: Text simplification model based on an encoder-decoder architecture (includes Transformer and Seq2Seq variants).
Stars: ✭ 66 (-8.33%)
TokenLabeling: PyTorch implementation of "All Tokens Matter: Token Labeling for Training Better Vision Transformers"
Stars: ✭ 385 (+434.72%)
transformer: A simple TensorFlow implementation of the Transformer
Stars: ✭ 25 (-65.28%)
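The core operation shared by the many Transformer implementations in this list is scaled dot-product attention: each query attends to all keys, scores are scaled by the square root of the key dimension, softmaxed, and used to mix the values. A dependency-free sketch of that formula (a generic illustration, not code from any repo listed here):

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V.

    Q, K, V are lists of row vectors; returns one output row per query.
    """
    d_k = len(K[0])
    out = []
    for q in Q:
        # Similarity of this query to every key, scaled by sqrt(d_k).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k) for k in K]
        weights = softmax(scores)
        # Weighted sum of the value rows.
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out
```

Real implementations batch this as matrix multiplications and add masking and multiple heads, but the scoring-softmax-mixing structure is the same.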
sister: SImple SenTence EmbeddeR
Stars: ✭ 66 (-8.33%)
Text-Classification-LSTMs-PyTorch: A baseline model for text classification, implemented as an LSTM-based model in PyTorch. To provide a better understanding of the model, a Tweets dataset provided by Kaggle is used.
Stars: ✭ 45 (-37.5%)
cape: Continuous Augmented Positional Embeddings (CAPE) implementation for PyTorch
Stars: ✭ 29 (-59.72%)
Generative MLZSL: [TPAMI, under submission] Generative Multi-Label Zero-Shot Learning
Stars: ✭ 37 (-48.61%)
svae cf: [WSDM '19] Sequential Variational Autoencoders for Collaborative Filtering
Stars: ✭ 38 (-47.22%)
factsumm: FactSumm: Factual Consistency Scorer for Abstractive Summarization
Stars: ✭ 83 (+15.28%)
libai: LiBai (李白): A Toolbox for Large-Scale Distributed Parallel Training
Stars: ✭ 284 (+294.44%)
style-transfer-video-processor: Extends the neural style transfer image processing technique to video by generating smooth transitions between several reference style images
Stars: ✭ 113 (+56.94%)
Neural-Scam-Artist: Web scraping, document deduplication & GPT-2 fine-tuning with a newly created scam dataset.
Stars: ✭ 18 (-75%)
Cross-lingual-Summarization: Zero-Shot Cross-Lingual Abstractive Sentence Summarization through Teaching Generation and Attention
Stars: ✭ 28 (-61.11%)
Face-Sketch: Face Sketch Synthesis with Style Transfer using Pyramid Column Feature, WACV 2018
Stars: ✭ 52 (-27.78%)
ActiveSparseShifts-PyTorch: Implementation of Sparse Shift Layer and Active Shift Layer (3D, 4D, 5D tensors) for PyTorch (CPU, GPU)
Stars: ✭ 27 (-62.5%)
imdb-transformer: A simple neural network for sentiment analysis, embedding sentences using a Transformer network.
Stars: ✭ 26 (-63.89%)
VideoTransformer-pytorch: PyTorch implementation of a collection of scalable Video Transformer benchmarks.
Stars: ✭ 159 (+120.83%)
php-serializer: Serialize PHP variables, including objects, in any format. Supports unserializing them too.
Stars: ✭ 47 (-34.72%)
transformer-ls: Official PyTorch implementation of Long-Short Transformer (NeurIPS 2021).
Stars: ✭ 201 (+179.17%)
ru-dalle: Generate images from texts in Russian
Stars: ✭ 1,606 (+2130.56%)
fastT5: ⚡ Boost inference speed of T5 models by 5x & reduce model size by 3x.
Stars: ✭ 421 (+484.72%)