Neuralcoref: ✨ Fast Coreference Resolution in spaCy with Neural Networks
Stars: ✭ 2,453 (+2192.52%)
hmrb: Python Rule Processing Engine 🏺
Stars: ✭ 65 (-39.25%)
Gpt2 Chinese: Chinese version of GPT2 training code, using BERT tokenizer.
Stars: ✭ 4,592 (+4191.59%)
spacy hunspell: ✏️ Hunspell extension for spaCy 2.0.
Stars: ✭ 94 (-12.15%)
Gpt 2 Tensorflow2.0: OpenAI GPT2 pre-training and sequence prediction implementation in TensorFlow 2.0
Stars: ✭ 172 (+60.75%)
spacy-langdetect: A fully customisable language detection pipeline for spaCy
Stars: ✭ 86 (-19.63%)
spacymoji: 💙 Emoji handling and metadata for spaCy with custom extension attributes
Stars: ✭ 174 (+62.62%)
Gpt2 Chitchat: GPT2 for Chinese chitchat (a GPT2 model for Chinese casual conversation, implementing DialoGPT's MMI idea)
Stars: ✭ 1,230 (+1049.53%)
extractacy: spaCy pipeline object for extracting values that correspond to a named entity (e.g., birth dates, account numbers, laboratory results)
Stars: ✭ 47 (-56.07%)
spacy conll: Pipeline component for spaCy (and other spaCy-wrapped parsers such as spacy-stanza and spacy-udpipe) that adds CoNLL-U properties to a Doc and its sentences and tokens. Can also be used as a command-line tool.
Stars: ✭ 60 (-43.93%)
amr: Cornell AMR Semantic Parser (Artzi et al., EMNLP 2015)
Stars: ✭ 23 (-78.5%)
spring: SPRING is a seq2seq model for Text-to-AMR and AMR-to-Text (AAAI 2021).
Stars: ✭ 103 (-3.74%)
Gpt2client: ✍🏻 gpt2-client, an easy-to-use TensorFlow wrapper for the GPT-2 117M, 345M, 774M, and 1.5B Transformer models 🤖 📝
Stars: ✭ 322 (+200.93%)
Onnxt5: Summarization, translation, sentiment analysis, text generation and more at blazing speed using a T5 version implemented in ONNX.
Stars: ✭ 143 (+33.64%)
spaczz: Fuzzy matching and more functionality for spaCy.
Stars: ✭ 215 (+100.93%)
Gpt2 French: GPT-2 French demo
Stars: ✭ 47 (-56.07%)
Dialogpt: Large-scale pretraining for dialogue
Stars: ✭ 1,177 (+1000%)
augmenty: Augmenty is an augmentation library based on spaCy for augmenting texts.
Stars: ✭ 101 (-5.61%)
Pytextrank: Python implementation of TextRank for phrase extraction and summarization of text documents
Stars: ✭ 1,675 (+1465.42%)
Gpt2 Newstitle: Chinese news-title generation project using GPT2 (a Chinese GPT2 news-title generator with extremely detailed code comments).
Stars: ✭ 235 (+119.63%)
spacy-iwnlp: German lemmatization with IWNLP as an extension for spaCy
Stars: ✭ 22 (-79.44%)
CSV2RDF: Streaming, transforming, SPARQL-based CSV to RDF converter. Apache license.
Stars: ✭ 48 (-55.14%)
TadTR: End-to-end Temporal Action Detection with Transformer. [Under review for a journal publication]
Stars: ✭ 55 (-48.6%)
deformer: [ACL 2020] DeFormer: Decomposing Pre-trained Transformers for Faster Question Answering
Stars: ✭ 111 (+3.74%)
EpiTator: EpiTator annotates epidemiological information in text documents. It is the natural language processing framework that powers GRITS and EIDR Connect.
Stars: ✭ 38 (-64.49%)
RL-based-Graph2Seq-for-NQG: Code & data accompanying the ICLR 2020 paper "Reinforcement Learning Based Graph-to-Sequence Model for Natural Question Generation"
Stars: ✭ 104 (-2.8%)
Transformer-in-Transformer: An implementation of Transformer in Transformer in TensorFlow for image classification, with attention inside local patches
Stars: ✭ 40 (-62.62%)
MinTL: Minimalist Transfer Learning for Task-Oriented Dialogue Systems
Stars: ✭ 61 (-42.99%)
Spectrum: Spectrum is an AI that uses machine learning to generate rap song lyrics
Stars: ✭ 37 (-65.42%)
linorobot2: Autonomous mobile robots (2WD, 4WD, Mecanum Drive)
Stars: ✭ 97 (-9.35%)
segmenter: [ICCV 2021] Official PyTorch implementation of Segmenter: Transformer for Semantic Segmentation
Stars: ✭ 463 (+332.71%)
Embedding: Embedding model code and a summary of study notes
Stars: ✭ 25 (-76.64%)
RgxGen: Generate matching and non-matching strings based on a regex pattern.
Stars: ✭ 45 (-57.94%)
HRFormer: Official implementation of the NeurIPS 2021 paper "HRFormer: High-Resolution Transformer for Dense Prediction".
Stars: ✭ 357 (+233.64%)
rita: Website, documentation and examples for RiTa
Stars: ✭ 42 (-60.75%)
AMPE: Adaptive Mesh Phase-field Evolution
Stars: ✭ 18 (-83.18%)
Restormer: [CVPR 2022, Oral] Efficient Transformer for High-Resolution Image Restoration. SOTA for motion deblurring, image deraining, denoising (Gaussian/real data), and defocus deblurring.
Stars: ✭ 586 (+447.66%)
LaTeX-OCR: pix2tex, using a ViT to convert images of equations into LaTeX code.
Stars: ✭ 1,566 (+1363.55%)
keras-deep-learning: Various implementations and projects on CNN, RNN, LSTM, GAN, etc.
Stars: ✭ 22 (-79.44%)
Learning-Lab-C-Library: This library provides a set of basic functions for different types of deep learning (and other) algorithms in C. The library is constantly updated.
Stars: ✭ 20 (-81.31%)
Vision-Language-Transformer: Vision-Language Transformer and Query Generation for Referring Segmentation (ICCV 2021)
Stars: ✭ 127 (+18.69%)
transformer-slt: Sign Language Translation with Transformers (COLING 2020, ECCV'20 SLRTP Workshop)
Stars: ✭ 92 (-14.02%)
spaCyTextBlob: A TextBlob sentiment analysis pipeline component for spaCy.
Stars: ✭ 30 (-71.96%)
transformer: Neutron, a PyTorch-based implementation of Transformer and its variants.
Stars: ✭ 60 (-43.93%)
Walk-Transformer: From Random Walks to Transformer for Learning Node Embeddings (ECML-PKDD 2020) (in PyTorch and TensorFlow)
Stars: ✭ 26 (-75.7%)
fiction generator: Fiction generator with TensorFlow, imitating the style of Wang Xiaobo.
Stars: ✭ 27 (-74.77%)
MAESTRO: A low Mach number stellar hydrodynamics code
Stars: ✭ 29 (-72.9%)
presidio-research: This package features data-science related tasks for developing new recognizers for Presidio. It is used for the evaluation of the entire system, as well as for evaluating specific PII recognizers or PII detection models.
Stars: ✭ 62 (-42.06%)
OverlapPredator: [CVPR 2021, Oral] PREDATOR: Registration of 3D Point Clouds with Low Overlap.
Stars: ✭ 293 (+173.83%)
visualization: A collection of visualization functions
Stars: ✭ 189 (+76.64%)
tutel: Tutel MoE, an optimized Mixture-of-Experts implementation
Stars: ✭ 183 (+71.03%)
FragmentVC: Any-to-any voice conversion by end-to-end extracting and fusing fine-grained voice fragments with attention
Stars: ✭ 134 (+25.23%)
Entity2Topic: [NAACL 2018] Entity Commonsense Representation for Neural Abstractive Summarization
Stars: ✭ 20 (-81.31%)