Embedding: A summary of embedding model code and study notes
Stars: ✭ 25 (-88.04%)
image-classification: A collection of SOTA image classification models in PyTorch
Stars: ✭ 70 (-66.51%)
pynmt: A simple and complete PyTorch implementation of a neural machine translation system
Stars: ✭ 13 (-93.78%)
transform-graphql: ⚙️ Transformer function to transform GraphQL directives; for example, creating a model CRUD directive
Stars: ✭ 23 (-89%)
Learning-Lab-C-Library: A set of basic functions for different types of deep learning (and other) algorithms in C. Constantly updated.
Stars: ✭ 20 (-90.43%)
DataProfiler: What's in your data? Extract schema, statistics, and entities from datasets
Stars: ✭ 843 (+303.35%)
node-sheets: Read rows from a Google spreadsheet with Google's Sheets API
Stars: ✭ 16 (-92.34%)
transformer: A PyTorch implementation of "Attention Is All You Need"
Stars: ✭ 28 (-86.6%)
react-keyview: React components that display lists, tables, and grids without scrolling, using keyboard keys to navigate through the data
Stars: ✭ 16 (-92.34%)
BossNAS: (ICCV 2021) Exploring Hybrid CNN-transformers with Block-wisely Self-supervised Neural Architecture Search
Stars: ✭ 125 (-40.19%)
PAML: Personalizing Dialogue Agents via Meta-Learning
Stars: ✭ 114 (-45.45%)
OpenPrompt: An open-source framework for prompt-learning
Stars: ✭ 1,769 (+746.41%)
golgotha: Contextualised embeddings and language modelling with BERT and friends, in R
Stars: ✭ 39 (-81.34%)
GTSRB Keras STN: German Traffic Sign Recognition Benchmark, a Keras implementation with Spatial Transformer Networks
Stars: ✭ 48 (-77.03%)
max-deeplab: Unofficial implementation of MaX-DeepLab for instance segmentation
Stars: ✭ 84 (-59.81%)
pytorch-gpt-x: Implementation of an autoregressive language model using an improved Transformer and DeepSpeed pipeline parallelism
Stars: ✭ 21 (-89.95%)
TS-CAM: Code for TS-CAM: Token Semantic Coupled Attention Map for Weakly Supervised Object Localization
Stars: ✭ 96 (-54.07%)
YOLOS: You Only Look at One Sequence (NeurIPS 2021)
Stars: ✭ 612 (+192.82%)
verseagility: Ramp up your custom natural language processing (NLP) task: bring your own data, use your preferred frameworks, and bring models into production
Stars: ✭ 23 (-89%)
text2keywords: Trained T5 and T5-large models for generating keywords from text
Stars: ✭ 53 (-74.64%)
tv: 📺 Tidy Viewer is a cross-platform CLI CSV pretty printer that uses column styling to maximize viewer enjoyment
Stars: ✭ 1,763 (+743.54%)
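The idea behind CSV pretty printers like Tidy Viewer can be sketched in a few lines: measure each column's width, then pad cells so columns line up. This is a toy illustration in Python's standard library, not tv's actual code (tv is written in Rust); the function name is my own.

```python
import csv
import io

def pretty_print_csv(text: str) -> str:
    """Align CSV columns by padding each cell to its column's max width.
    A minimal sketch of column-styled pretty printing; no colors or
    type-aware formatting like real tools provide."""
    rows = list(csv.reader(io.StringIO(text)))
    # Width of each column = longest cell in that column.
    widths = [max(len(row[i]) for row in rows) for i in range(len(rows[0]))]
    return "\n".join(
        "  ".join(cell.ljust(w) for cell, w in zip(row, widths)).rstrip()
        for row in rows
    )

print(pretty_print_csv("name,stars\ntv,1763\nfastapi-csv,46"))
```

Real tools add per-type alignment (numbers right-aligned), truncation, and ANSI color on top of this same width-measuring core.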
CSV2RDF: Streaming, transforming, SPARQL-based CSV-to-RDF converter. Apache license.
Stars: ✭ 48 (-77.03%)
linformer: Implementation of Linformer for PyTorch
Stars: ✭ 119 (-43.06%)
dodrio: Exploring attention weights in transformer-based models with linguistic knowledge
Stars: ✭ 233 (+11.48%)
Conformer: Official code for Conformer: Local Features Coupling Global Representations for Visual Recognition
Stars: ✭ 345 (+65.07%)
segmenter: [ICCV 2021] Official PyTorch implementation of Segmenter: Transformer for Semantic Segmentation
Stars: ✭ 463 (+121.53%)
fastapi-csv: 🏗️ Create APIs from CSV files within seconds, using FastAPI
Stars: ✭ 46 (-77.99%)
HRFormer: Official implementation of the NeurIPS 2021 paper "HRFormer: High-Resolution Transformer for Dense Prediction"
Stars: ✭ 357 (+70.81%)
Context-Transformer: Tackling Object Confusion for Few-Shot Detection (AAAI 2020)
Stars: ✭ 89 (-57.42%)
tableschema-go: A Go library for working with Table Schema
Stars: ✭ 41 (-80.38%)
set-transformer: A neural network architecture for prediction on sets
Stars: ✭ 18 (-91.39%)
Tabula: 🈸 Pretty printer for collections of maps/structs (Elixir)
Stars: ✭ 85 (-59.33%)
Kevinpro-NLP-demo: All the NLP you need here. Personal implementations of some fun NLP demos, currently including PyTorch implementations of 13 NLP applications
Stars: ✭ 117 (-44.02%)
transformer: Neutron, a PyTorch-based implementation of the Transformer and its variants
Stars: ✭ 60 (-71.29%)
NLP-paper: 🎨 A natural language processing (NLP) tutorial 🎨 https://dataxujing.github.io/NLP-paper/
Stars: ✭ 23 (-89%)
Vision-Language-Transformer: Vision-Language Transformer and Query Generation for Referring Segmentation (ICCV 2021)
Stars: ✭ 127 (-39.23%)
CrabNet: Predict materials properties using only composition information!
Stars: ✭ 57 (-72.73%)
attention-is-all-you-need-paper: Implementation of Vaswani et al., "Attention Is All You Need," Advances in Neural Information Processing Systems, 2017
Stars: ✭ 97 (-53.59%)
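The core operation shared by the Transformer implementations in this list is scaled dot-product attention, Attention(Q, K, V) = softmax(QKᵀ/√d_k)V, from the paper above. A minimal NumPy sketch for illustration (the shapes and function name are my own; real implementations batch this and add multi-head projections and masking):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Q: (n_q, d_k), K: (n_k, d_k), V: (n_k, d_v) -> (n_q, d_v).
    Each output row is a convex combination of V's rows, weighted by
    the softmax of scaled query-key similarities."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # (n_q, n_k) similarities
    scores -= scores.max(axis=-1, keepdims=True)  # for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ V
```

The 1/√d_k scaling keeps dot products from growing with dimension, which would otherwise push the softmax into regions with vanishing gradients.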
TDRG: Transformer-based Dual Relation Graph for Multi-label Image Recognition (ICCV 2021)
Stars: ✭ 32 (-84.69%)
OverlapPredator: [CVPR 2021, Oral] PREDATOR: Registration of 3D Point Clouds with Low Overlap
Stars: ✭ 293 (+40.19%)
Image-Caption: Using an LSTM or Transformer for image captioning in PyTorch
Stars: ✭ 36 (-82.78%)
SDGym: Benchmarking synthetic data generation methods
Stars: ✭ 177 (-15.31%)
KitanaQA: Adversarial training and data augmentation for neural question-answering models
Stars: ✭ 58 (-72.25%)
kosr: Transformer-based Korean speech recognition
Stars: ✭ 25 (-88.04%)
FNet-pytorch: Unofficial implementation of Google's FNet: Mixing Tokens with Fourier Transforms
Stars: ✭ 204 (-2.39%)
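FNet's central idea is replacing the self-attention sublayer with an unparameterized 2D discrete Fourier transform over the sequence and hidden dimensions, keeping only the real part. That token-mixing step fits in one line of NumPy (an illustrative sketch, not this repo's code; the function name is my own):

```python
import numpy as np

def fnet_mixing(x):
    """FNet token-mixing sublayer for x of shape (seq_len, d_model):
    FFT along the hidden dim, then along the sequence dim, real part only.
    No learned parameters; mixing information across tokens is all it does."""
    return np.fft.fft(np.fft.fft(x, axis=-1), axis=-2).real
```

Because the FFT has no weights, this sublayer trades some accuracy for substantially faster training and inference than attention at long sequence lengths.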
DeepPhonemizer: Grapheme-to-phoneme conversion with deep learning
Stars: ✭ 152 (-27.27%)
trapper: State-of-the-art NLP through transformer models, in a modular design with consistent APIs
Stars: ✭ 28 (-86.6%)
SOLQ: "SOLQ: Segmenting Objects by Learning Queries", an end-to-end instance segmentation framework with Transformers
Stars: ✭ 159 (-23.92%)