factedit: 🧐 Code & Data for Fact-based Text Editing (Iso et al., ACL 2020)
Stars: ✭ 16 (-95.7%)
3d Pointcloud: Papers and Datasets about Point Cloud.
Stars: ✭ 179 (-51.88%)
Attention Mechanisms: Implementations for a family of attention mechanisms, suitable for all kinds of natural language processing tasks and compatible with TensorFlow 2.0 and Keras.
Stars: ✭ 203 (-45.43%)
Accelerated Text: Accelerated Text is a no-code natural language generation platform. It helps you construct document plans that define how your data is converted to textual descriptions varying in wording and structure.
Stars: ✭ 256 (-31.18%)
Entity2Topic: [NAACL 2018] Entity Commonsense Representation for Neural Abstractive Summarization
Stars: ✭ 20 (-94.62%)
Kenlg Reading: Reading list for knowledge-enhanced text generation, with a survey
Stars: ✭ 257 (-30.91%)
PlanSum: [AAAI 2021] Unsupervised Opinion Summarization with Content Planning
Stars: ✭ 25 (-93.28%)
keras-deep-learning: Various implementations and projects on CNNs, RNNs, LSTMs, GANs, etc.
Stars: ✭ 22 (-94.09%)
Hub: Dataset format for AI. Build, manage, and visualize datasets for deep learning. Stream data in real time to PyTorch/TensorFlow and version-control it. https://activeloop.ai
Stars: ✭ 4,003 (+976.08%)
Transformer: A PyTorch implementation of "Attention Is All You Need" and "Weighted Transformer Network for Machine Translation"
Stars: ✭ 271 (-27.15%)
Roapi: Create full-fledged APIs for static datasets without writing a single line of code.
Stars: ✭ 253 (-31.99%)
Textbox: TextBox is an open-source library for building text generation systems.
Stars: ✭ 257 (-30.91%)
Mirnet: Official repository for "Learning Enriched Features for Real Image Restoration and Enhancement" (ECCV 2020). SOTA results for image denoising, super-resolution, and image enhancement.
Stars: ✭ 247 (-33.6%)
Cleora: Cleora AI is a general-purpose model for efficient, scalable learning of stable and inductive entity embeddings for heterogeneous relational data.
Stars: ✭ 303 (-18.55%)
SpeechTransProgress: Tracking the progress in end-to-end speech translation
Stars: ✭ 139 (-62.63%)
newsletter-archive: Markdown archive & RSS/Atom feeds for Data Is Plural.
Stars: ✭ 65 (-82.53%)
Nlp Projects: word2vec, sentence2vec, machine reading comprehension, dialog systems, text classification, pretrained language models (i.e., XLNet, BERT, ELMo, GPT), sequence labeling, information retrieval, information extraction (i.e., entity, relation, and event extraction), knowledge graphs, text generation, network embedding
Stars: ✭ 360 (-3.23%)
Keras Gat: Keras implementation of the graph attention networks (GAT) by Veličković et al. (2017; https://arxiv.org/abs/1710.10903)
Stars: ✭ 334 (-10.22%)
Alphafold2: To eventually become an unofficial PyTorch implementation / replication of AlphaFold2, as details of the architecture get released
Stars: ✭ 298 (-19.89%)
datasets: TFDS data loaders for sign language datasets.
Stars: ✭ 17 (-95.43%)
galerkin-transformer: [NeurIPS 2021] Galerkin Transformer: linear attention without softmax
Stars: ✭ 111 (-70.16%)
Meglass: An eyeglass face dataset collected and cleaned for face recognition evaluation (CCBR 2018).
Stars: ✭ 281 (-24.46%)
Yolo Multi Backbones Attention: Model compression for YOLOv3 with multiple lightweight backbones (ShuffleNetV2, Huawei GhostNet), attention, pruning, and quantization
Stars: ✭ 317 (-14.78%)
Seq2seq chatbot: A TensorFlow implementation of a simple seq2seq-based dialogue system, with embedding, attention, beam search, and other features; the dataset is Cornell Movie Dialogs.
Stars: ✭ 308 (-17.2%)
Da Rnn: 📃 **Unofficial** PyTorch implementation of DA-RNN (arXiv:1704.02971)
Stars: ✭ 256 (-31.18%)
Medical Datasets: Tracking medical datasets, with a focus on medical imaging
Stars: ✭ 296 (-20.43%)
dbcollection: A collection of popular datasets for deep learning.
Stars: ✭ 26 (-93.01%)
Transformer: A TensorFlow implementation of the Transformer: Attention Is All You Need
Stars: ✭ 3,646 (+880.11%)
modelina: Library for generating data models based on inputs such as AsyncAPI, OpenAPI, or JSON Schema documents.
Stars: ✭ 55 (-85.22%)
Seq2seq Summarizer: Pointer-generator reinforced seq2seq summarization in PyTorch
Stars: ✭ 306 (-17.74%)
Animal Matting: GitHub repository for the paper "End-to-end Animal Image Matting"
Stars: ✭ 363 (-2.42%)
NetEmb-Datasets: A collection of real-world networks/graphs for network embedding
Stars: ✭ 18 (-95.16%)
OTMapGen: Uses random noise to generate realistic OTBM terrain with auto-bordering.
Stars: ✭ 29 (-92.2%)
Adaptiveattention: Implementation of "Knowing When to Look: Adaptive Attention via a Visual Sentinel for Image Captioning"
Stars: ✭ 303 (-18.55%)
Nndial: NNDial is an open-source toolkit for building end-to-end trainable task-oriented dialogue models. It is released by Tsung-Hsien (Shawn) Wen of the Cambridge Dialogue Systems Group under the Apache License 2.0.
Stars: ✭ 332 (-10.75%)
covid-19-data-cleanup: Scripts to clean up data from https://github.com/CSSEGISandData/COVID-19
Stars: ✭ 25 (-93.28%)
Open3d Ml: An extension of Open3D to address 3D machine learning tasks
Stars: ✭ 284 (-23.66%)
Tg Reading List: A text generation reading list maintained by the Tsinghua Natural Language Processing Group.
Stars: ✭ 352 (-5.38%)
Chakin: Simple downloader for pre-trained word vectors
Stars: ✭ 323 (-13.17%)
Tdc: Therapeutics Data Commons: Machine Learning Datasets and Tasks for Therapeutics
Stars: ✭ 291 (-21.77%)
recurrent-defocus-deblurring-synth-dual-pixel: Reference GitHub repository for the paper "Learning to Reduce Defocus Blur by Realistically Modeling Dual-Pixel Data". We propose a procedure to generate realistic DP data synthetically. Our synthesis approach mimics the optical image formation found on DP sensors and can be applied to virtual scenes rendered with standard computer software. Lev…
Stars: ✭ 30 (-91.94%)
Vit Pytorch: Implementation of the Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in PyTorch
Stars: ✭ 7,199 (+1835.22%)
linformer: Implementation of Linformer for PyTorch
Stars: ✭ 119 (-68.01%)
podium: Podium, a framework-agnostic Python NLP library for data loading and preprocessing
Stars: ✭ 55 (-85.22%)
Gpt2client: ✍🏻 gpt2-client, an easy-to-use TensorFlow wrapper for the GPT-2 117M, 345M, 774M, and 1.5B transformer models 🤖 📝
Stars: ✭ 322 (-13.44%)
Cluecorpus2020: Large-scale pre-training corpus for Chinese (100 GB; Chinese pre-training corpus)
Stars: ✭ 278 (-25.27%)
disent: 🧶 Modular VAE disentanglement framework for Python built with PyTorch Lightning ▸ Including metrics and datasets ▸ With strongly supervised, weakly supervised, and unsupervised methods ▸ Easily configured and run with Hydra config ▸ Inspired by disentanglement_lib
Stars: ✭ 41 (-88.98%)