
123 open-source projects that are alternatives to or similar to HVT

geometry-free-view-synthesis
Is a geometric model required to synthesize novel views from a single image?
Stars: ✭ 265 (+919.23%)
Mutual labels:  transformers
bert-squeeze
🛠️ Tools for Transformers compression using PyTorch Lightning ⚡
Stars: ✭ 56 (+115.38%)
Mutual labels:  transformers
robustness-vit
Contains code for the paper "Vision Transformers are Robust Learners" (AAAI 2022).
Stars: ✭ 78 (+200%)
Mutual labels:  transformers
language-planner
Official Code for "Language Models as Zero-Shot Planners: Extracting Actionable Knowledge for Embodied Agents"
Stars: ✭ 84 (+223.08%)
Mutual labels:  transformers
wechsel
Code for WECHSEL: Effective initialization of subword embeddings for cross-lingual transfer of monolingual language models.
Stars: ✭ 39 (+50%)
Mutual labels:  transformers
text2keywords
Trained T5 and T5-large models for generating keywords from text
Stars: ✭ 53 (+103.85%)
Mutual labels:  transformers
PyTorch-Model-Compare
Compare neural networks by their feature similarity
Stars: ✭ 119 (+357.69%)
Mutual labels:  transformers
TorchBlocks
A PyTorch-based toolkit for natural language processing
Stars: ✭ 85 (+226.92%)
Mutual labels:  transformers
deepconsensus
DeepConsensus uses gap-aware sequence transformers to correct errors in Pacific Biosciences (PacBio) Circular Consensus Sequencing (CCS) data.
Stars: ✭ 124 (+376.92%)
Mutual labels:  transformers
remixer-pytorch
Implementation of the Remixer Block from the Remixer paper, in Pytorch
Stars: ✭ 37 (+42.31%)
Mutual labels:  transformers
course-content-dl
NMA deep learning course
Stars: ✭ 537 (+1965.38%)
Mutual labels:  transformers
transformer generalization
The official repository for our paper "The Devil is in the Detail: Simple Tricks Improve Systematic Generalization of Transformers". We significantly improve the systematic generalization of transformer models on a variety of datasets using simple tricks and careful considerations.
Stars: ✭ 58 (+123.08%)
Mutual labels:  transformers
ttt
A package for fine-tuning Transformers with TPUs, written in TensorFlow 2.0+
Stars: ✭ 35 (+34.62%)
Mutual labels:  transformers
molecule-attention-transformer
Pytorch reimplementation of Molecule Attention Transformer, which uses a transformer to tackle the graph-like structure of molecules
Stars: ✭ 46 (+76.92%)
Mutual labels:  transformers
Text and Audio classification with Bert
Text Classification in Turkish Texts with Bert
Stars: ✭ 34 (+30.77%)
Mutual labels:  transformers
eve-bot
EVE bot, a customer service chatbot to enhance virtual engagement for Twitter Apple Support
Stars: ✭ 31 (+19.23%)
Mutual labels:  transformers
xpandas
Universal 1d/2d data containers with Transformers functionality for data analysis.
Stars: ✭ 25 (-3.85%)
Mutual labels:  transformers
text2text
Text2Text: Cross-lingual natural language processing and generation toolkit
Stars: ✭ 188 (+623.08%)
Mutual labels:  transformers
long-short-transformer
Implementation of Long-Short Transformer, combining local and global inductive biases for attention over long sequences, in Pytorch
Stars: ✭ 103 (+296.15%)
Mutual labels:  transformers
BERT-NER
Using pre-trained BERT models for Chinese and English NER with 🤗Transformers
Stars: ✭ 114 (+338.46%)
Mutual labels:  transformers
anonymisation
Anonymization of legal cases (Fr) based on Flair embeddings
Stars: ✭ 85 (+226.92%)
Mutual labels:  transformers
robo-vln
Pytorch code for ICRA'21 paper: "Hierarchical Cross-Modal Agent for Robotics Vision-and-Language Navigation"
Stars: ✭ 34 (+30.77%)
Mutual labels:  transformers
elastic transformers
Making BERT stretchy. Semantic Elasticsearch with Sentence Transformers
Stars: ✭ 153 (+488.46%)
Mutual labels:  transformers
iPerceive
Applying Common-Sense Reasoning to Multi-Modal Dense Video Captioning and Video Question Answering | Python3 | PyTorch | CNNs | Causality | Reasoning | LSTMs | Transformers | Multi-Head Self Attention | Published in IEEE Winter Conference on Applications of Computer Vision (WACV) 2021
Stars: ✭ 52 (+100%)
Mutual labels:  transformers
TransCenter
This is the official implementation of TransCenter. The code and pretrained models are now available here: https://gitlab.inria.fr/yixu/TransCenter_official.
Stars: ✭ 82 (+215.38%)
Mutual labels:  transformers
text-classification-transformers
Easy text classification for everyone: BERT-based models via Huggingface transformers (KR/EN)
Stars: ✭ 32 (+23.08%)
Mutual labels:  transformers
Product-Categorization-NLP
Multi-Class Text Classification for products based on their description with Machine Learning algorithms and Neural Networks (MLP, CNN, Distilbert).
Stars: ✭ 30 (+15.38%)
Mutual labels:  transformers
deepfrog
An NLP-suite powered by deep learning
Stars: ✭ 16 (-38.46%)
Mutual labels:  transformers
text2class
Multi-class text categorization using state-of-the-art pre-trained contextualized language models, e.g. BERT
Stars: ✭ 15 (-42.31%)
Mutual labels:  transformers
pytorch-vit
An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale
Stars: ✭ 250 (+861.54%)
Mutual labels:  transformers
golgotha
Contextualised Embeddings and Language Modelling using BERT and Friends, in R
Stars: ✭ 39 (+50%)
Mutual labels:  transformers
converse
Conversational text Analysis using various NLP techniques
Stars: ✭ 147 (+465.38%)
Mutual labels:  transformers
n-grammer-pytorch
Implementation of N-Grammer, augmenting Transformers with latent n-grams, in Pytorch
Stars: ✭ 50 (+92.31%)
Mutual labels:  transformers
modules
The official repository for our paper "Are Neural Nets Modular? Inspecting Functional Modularity Through Differentiable Weight Masks". We develop a method for analyzing emerging functional modularity in neural networks based on differentiable weight masks and use it to point out important issues in current-day neural networks.
Stars: ✭ 25 (-3.85%)
Mutual labels:  transformers
small-text
Active Learning for Text Classification in Python
Stars: ✭ 241 (+826.92%)
Mutual labels:  transformers
gnn-lspe
Source code for GNN-LSPE (Graph Neural Networks with Learnable Structural and Positional Representations), ICLR 2022
Stars: ✭ 165 (+534.62%)
Mutual labels:  transformers
minicons
Utility for analyzing Transformer based representations of language.
Stars: ✭ 28 (+7.69%)
Mutual labels:  transformers
DocSum
A tool to automatically summarize documents abstractively using the BART or PreSumm Machine Learning Model.
Stars: ✭ 58 (+123.08%)
Mutual labels:  transformers
ParsBigBird
Persian Bert For Long-Range Sequences
Stars: ✭ 58 (+123.08%)
Mutual labels:  transformers
WellcomeML
Repository for Machine Learning utils at the Wellcome Trust
Stars: ✭ 31 (+19.23%)
Mutual labels:  transformers
classy
classy is a simple-to-use library for building high-performance Machine Learning models in NLP.
Stars: ✭ 61 (+134.62%)
Mutual labels:  transformers
lightning-transformers
Flexible components pairing 🤗 Transformers with Pytorch Lightning
Stars: ✭ 551 (+2019.23%)
Mutual labels:  transformers
GoEmotions-pytorch
Pytorch Implementation of GoEmotions 😍😢😱
Stars: ✭ 95 (+265.38%)
Mutual labels:  transformers
label-studio-transformers
Label data using HuggingFace's transformers and automatically get a prediction service
Stars: ✭ 117 (+350%)
Mutual labels:  transformers
nuwa-pytorch
Implementation of NÜWA, state of the art attention network for text to video synthesis, in Pytorch
Stars: ✭ 347 (+1234.62%)
Mutual labels:  transformers
Text-Summarization
Abstractive and Extractive Text summarization using Transformers.
Stars: ✭ 38 (+46.15%)
Mutual labels:  transformers
MISE
Multimodal Image Synthesis and Editing: A Survey
Stars: ✭ 214 (+723.08%)
Mutual labels:  transformers
Transformers-Tutorials
This repository contains demos I made with the Transformers library by HuggingFace.
Stars: ✭ 2,828 (+10776.92%)
Mutual labels:  transformers
smaller-transformers
Load What You Need: Smaller Multilingual Transformers for Pytorch and TensorFlow 2.0.
Stars: ✭ 66 (+153.85%)
Mutual labels:  transformers
transformers-interpret
Model explainability that works seamlessly with 🤗 transformers. Explain your transformers model in just 2 lines of code.
Stars: ✭ 861 (+3211.54%)
Mutual labels:  transformers
optimum
🏎️ Accelerate training and inference of 🤗 Transformers with easy to use hardware optimization tools
Stars: ✭ 567 (+2080.77%)
Mutual labels:  transformers
pysentimiento
A Python multilingual toolkit for Sentiment Analysis and Social NLP tasks
Stars: ✭ 274 (+953.85%)
Mutual labels:  transformers
spark-transformers
Spark-Transformers: Library for exporting Apache Spark MLLIB models to use them in any Java application with no other dependencies.
Stars: ✭ 39 (+50%)
Mutual labels:  transformers
chef-transformer
Chef Transformer 🍲
Stars: ✭ 29 (+11.54%)
Mutual labels:  transformers
OpenDialog
An open-source package for a Chinese open-domain conversational chatbot (a Chinese chit-chat dialogue system, with one-click deployment of a WeChat chit-chat bot)
Stars: ✭ 94 (+261.54%)
Mutual labels:  transformers
C-Tran
General Multi-label Image Classification with Transformers
Stars: ✭ 106 (+307.69%)
Mutual labels:  transformers
serverless-transformers-on-aws-lambda
Deploy transformers serverless on AWS Lambda
Stars: ✭ 100 (+284.62%)
Mutual labels:  transformers
KoELECTRA-Pipeline
Transformers Pipeline with KoELECTRA
Stars: ✭ 37 (+42.31%)
Mutual labels:  transformers
Pytorch-NLU
Pytorch-NLU: a Chinese text classification and sequence labeling toolkit. It supports multi-class and multi-label classification of Chinese long and short texts, and sequence labeling tasks such as Chinese named entity recognition, part-of-speech tagging, and word segmentation.
Stars: ✭ 151 (+480.77%)
Mutual labels:  transformers
Showing 1–60 of 123 similar projects