clip-guided-diffusion: A CLI tool/Python module for generating images from text using guided diffusion and CLIP from OpenAI.
Stars: ✭ 260 (-83.81%)
aot: Russian morphology for Java
Stars: ✭ 41 (-97.45%)
udar: UDAR Does Accented Russian, a finite-state morphological analyzer of Russian that handles stressed wordforms.
Stars: ✭ 15 (-99.07%)
ds: 👨🔬 In Russian: a regularly updated, structured collection of free Data Science resources by topic: courses, books, open data, blogs, and ready-made solutions.
Stars: ✭ 102 (-93.65%)
universum-contracts: text-to-image generation gems/libraries, incl. moonbirds, cyberpunks, coolcats, shiba inu doge, nouns & more
Stars: ✭ 17 (-98.94%)
RussianNounsJS: Declension of Russian nouns by case. Usually only the nominative form, animacy, and gender are required.
Stars: ✭ 29 (-98.19%)
KoDALLE: 🇰🇷 Text to Image in Korean
Stars: ✭ 55 (-96.58%)
DolboNet: A Russian-language chat bot for Discord built on the Transformer architecture
Stars: ✭ 53 (-96.7%)
pytorch-lr-scheduler: PyTorch implementations of several learning rate schedulers for deep learning researchers.
Stars: ✭ 65 (-95.95%)
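To illustrate what such a scheduler computes, here is a minimal pure-Python sketch of a linear-warmup/inverse-square-root-decay schedule (a common pattern popularized by the original Transformer paper). This is an illustrative example, not code from the repo; the function name and hyperparameters are hypothetical:

```python
import math

def warmup_inv_sqrt_lr(step, base_lr=1e-3, warmup_steps=4000):
    """Linear warmup for `warmup_steps` steps, then inverse-square-root decay.

    Illustrative sketch of a typical schedule, not the repo's implementation.
    """
    if step < 1:
        raise ValueError("step must be >= 1")
    if step < warmup_steps:
        return base_lr * step / warmup_steps          # linear ramp up to base_lr
    return base_lr * math.sqrt(warmup_steps / step)   # decay proportional to 1/sqrt(step)

# The learning rate rises during warmup, peaks at base_lr, then decays.
lrs = [warmup_inv_sqrt_lr(s, base_lr=1e-3, warmup_steps=10) for s in (1, 5, 10, 40)]
```

Real schedulers in PyTorch wrap logic like this behind `torch.optim.lr_scheduler` classes so that calling `scheduler.step()` updates the optimizer's learning rate each iteration.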
query-selector: Long-term series forecasting with Query Selector, an efficient sparse-attention model
Stars: ✭ 63 (-96.08%)
Insight: Repository for Project Insight: NLP as a Service
Stars: ✭ 246 (-84.68%)
SSE-PT: Code and datasets for the RecSys'20 paper "SSE-PT: Sequential Recommendation Via Personalized Transformer" and the NeurIPS'19 paper "Stochastic Shared Embeddings: Data-driven Regularization of Embedding Layers"
Stars: ✭ 103 (-93.59%)
soft-intro-vae-pytorch: [CVPR 2021 Oral] Official PyTorch implementation of Soft-IntroVAE from the paper "Soft-IntroVAE: Analyzing and Improving Introspective Variational Autoencoders"
Stars: ✭ 170 (-89.41%)
Ner Bert Pytorch: PyTorch solution to the named entity recognition task using Google AI's pre-trained BERT model.
Stars: ✭ 249 (-84.5%)
Relational Rnn Pytorch: An implementation of DeepMind's Relational Recurrent Neural Networks in PyTorch.
Stars: ✭ 236 (-85.31%)
densecap: Dense video captioning in PyTorch
Stars: ✭ 37 (-97.7%)
Posthtml: PostHTML is a tool to transform HTML/XML with JS plugins
Stars: ✭ 2,737 (+70.42%)
R-MeN: Transformer-based Memory Networks for Knowledge Graph Embeddings (ACL 2020) (PyTorch and TensorFlow)
Stars: ✭ 74 (-95.39%)
CoCosNet-v2: CoCosNet v2: Full-Resolution Correspondence Learning for Image Translation
Stars: ✭ 312 (-80.57%)
Multigraph transformer: Official code of the paper "Multi-Graph Transformer for Free-Hand Sketch Recognition" (multi-graph transformer, graph classification, sketch recognition)
Stars: ✭ 231 (-85.62%)
Paddlenlp: NLP core library and model zoo based on PaddlePaddle 2.0
Stars: ✭ 212 (-86.8%)
BMT: Source code for "Bi-modal Transformer for Dense Video Captioning" (BMVC 2020)
Stars: ✭ 192 (-88.04%)
Sttn: [ECCV 2020] STTN: Learning Joint Spatial-Temporal Transformations for Video Inpainting
Stars: ✭ 211 (-86.86%)
FAQ: Unofficial Fedora FAQ in Russian
Stars: ✭ 86 (-94.65%)
VT-UNet: [MICCAI 2022] Official PyTorch implementation of "A Robust Volumetric Transformer for Accurate 3D Tumor Segmentation"
Stars: ✭ 151 (-90.6%)
graphsignal: Graphsignal Python agent
Stars: ✭ 158 (-90.16%)
Strata: A keyboard layout for those who love Markdown and write in Russian
Stars: ✭ 70 (-95.64%)
Pytorch Seq2seq: Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Stars: ✭ 3,418 (+112.83%)
ViTs-vs-CNNs: [NeurIPS 2021] Are Transformers More Robust Than CNNs? (PyTorch implementation & checkpoints)
Stars: ✭ 145 (-90.97%)
Bertviz: Tool for visualizing attention in Transformer models (BERT, GPT-2, ALBERT, XLNet, RoBERTa, CTRL, etc.)
Stars: ✭ 3,443 (+114.38%)
vietnamese-roberta: A Robustly Optimized BERT Pretraining Approach for Vietnamese
Stars: ✭ 22 (-98.63%)
Gpt2 Newstitle: Chinese news title generation project using GPT-2, with extremely detailed code comments.
Stars: ✭ 235 (-85.37%)
Torchnlp: Easy-to-use NLP library built on PyTorch and TorchText
Stars: ✭ 233 (-85.49%)
SegSwap: (CVPRW 2022) Learning Co-segmentation by Segment Swapping for Retrieval and Discovery
Stars: ✭ 46 (-97.14%)
learnrxjs: Russian-language RxJS documentation
Stars: ✭ 20 (-98.75%)
Self Attention Cv: Implementations of various self-attention mechanisms for computer vision. Ongoing repository.
Stars: ✭ 209 (-86.99%)
MoCoGAN-HD: [ICLR 2021 Spotlight] A Good Image Generator Is What You Need for High-Resolution Video Synthesis
Stars: ✭ 224 (-86.05%)
Yin: The efficient and elegant JSON:API 1.1 server library for PHP
Stars: ✭ 214 (-86.67%)
bytekit: A Java library for byte manipulation (not a bytecode manipulation library)
Stars: ✭ 40 (-97.51%)
sb-nmt: Code for Synchronous Bidirectional Neural Machine Translation (SB-NMT)
Stars: ✭ 66 (-95.89%)
Hardware Aware Transformers: [ACL 2020] HAT: Hardware-Aware Transformers for Efficient Natural Language Processing
Stars: ✭ 206 (-87.17%)
Bert Chainer: Chainer implementation of "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding"
Stars: ✭ 205 (-87.24%)
Linear Attention Transformer: A Transformer based on a variant of attention whose complexity is linear with respect to sequence length
Stars: ✭ 205 (-87.24%)
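The core idea behind linear-complexity attention is to replace the softmax with a positive feature map and reassociate the matrix product, so the sequence-length-squared term never materializes. A minimal NumPy sketch of this general technique (illustrative only, not the repo's implementation; the feature map here is the elu(x)+1 choice from the kernelized-attention literature):

```python
import numpy as np

def linear_attention(Q, K, V, eps=1e-6):
    """Kernelized attention with cost linear in sequence length n.

    Instead of softmax(Q K^T) V, which is O(n^2), apply a positive feature
    map phi and reassociate:  phi(Q) @ (phi(K)^T @ V), normalized row-wise.
    Illustrative sketch, not the repo's implementation.
    """
    phi = lambda x: np.where(x > 0, x + 1.0, np.exp(x))  # elu(x) + 1, strictly positive
    Qp, Kp = phi(Q), phi(K)
    kv = Kp.T @ V                   # (d, d_v): summed key-value outer products, O(n*d*d_v)
    z = Qp @ Kp.sum(axis=0)         # (n,): per-row normalizer
    return (Qp @ kv) / (z[:, None] + eps)

rng = np.random.default_rng(0)
n, d = 8, 4
out = linear_attention(rng.normal(size=(n, d)),
                       rng.normal(size=(n, d)),
                       rng.normal(size=(n, d)))
```

Because the attention weights are nonnegative and normalized, each output row is a convex combination of the value rows, just as in softmax attention, but the memory and compute scale as O(n) rather than O(n^2).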
russiannames: Russian name parsers, gender identification, and processing tools
Stars: ✭ 102 (-93.65%)
t5-japanese: Code to pre-train Japanese T5 models
Stars: ✭ 39 (-97.57%)
Transformers-RL: An easy PyTorch implementation of "Stabilizing Transformers for Reinforcement Learning"
Stars: ✭ 107 (-93.34%)
Lumen Api Starter: An API starter project built on Lumen 8, with a carefully designed directory structure, a standardized and unified response data format, and best practices for the Repository-pattern architecture.
Stars: ✭ 197 (-87.73%)
seq2seq-pytorch: Sequence to Sequence Models in PyTorch
Stars: ✭ 41 (-97.45%)
Gpt Scrolls: A collaborative collection of open-source, safe GPT-3 prompts that work well
Stars: ✭ 195 (-87.86%)
Graphtransformer: Graph Transformer Architecture. Source code for "A Generalization of Transformer Networks to Graphs", DLG-AAAI'21.
Stars: ✭ 187 (-88.36%)
VQGAN-CLIP: Just playing with getting VQGAN+CLIP running locally, rather than having to use Colab.
Stars: ✭ 2,369 (+47.51%)
awesome-transformer-search: A curated list of awesome resources combining Transformers with Neural Architecture Search
Stars: ✭ 194 (-87.92%)