Paddleslim: PaddleSlim is an open-source library for deep model compression and architecture search.
Stars: ✭ 677 (+357.43%)
Pytorch gbw lm: PyTorch language model for the 1-Billion Word (LM1B / GBW) dataset.
Stars: ✭ 101 (-31.76%)
Bert Ner: Use Google's BERT for named entity recognition (with CoNLL-2003 as the dataset).
Stars: ✭ 1,012 (+583.78%)
Transfer Nlp: NLP library designed for reproducible experimentation management.
Stars: ✭ 287 (+93.92%)
Word-Prediction-Ngram: Next-word prediction using an n-gram probabilistic model with various smoothing techniques.
Stars: ✭ 25 (-83.11%)
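The n-gram approach named above is simple enough to sketch in plain Python. The following is an illustrative bigram model with add-one (Laplace) smoothing, not code from the repository; all function names are hypothetical.

```python
from collections import Counter

def train_bigram(corpus):
    """Count unigrams and bigrams over a list of tokenized sentences."""
    unigrams, bigrams = Counter(), Counter()
    for sent in corpus:
        tokens = ["<s>"] + sent          # sentence-start marker
        unigrams.update(tokens)
        bigrams.update(zip(tokens, tokens[1:]))
    return unigrams, bigrams

def next_word(prev, unigrams, bigrams, vocab):
    """Return the most probable next word under add-one (Laplace) smoothing."""
    def prob(w):
        # Laplace smoothing: add 1 to every bigram count, add |V| to the denominator
        return (bigrams[(prev, w)] + 1) / (unigrams[prev] + len(vocab))
    return max(vocab, key=prob)

corpus = [["the", "cat", "sat"], ["the", "cat", "ran"], ["the", "dog", "sat"]]
unigrams, bigrams = train_bigram(corpus)
vocab = sorted(set(w for s in corpus for w in s))
print(next_word("the", unigrams, bigrams, vocab))  # "cat" follows "the" most often
```

Smoothing matters here because an unsmoothed model assigns zero probability to any bigram never seen in training; add-one is the simplest of the techniques the entry alludes to.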
Textpipe: Clean and extract metadata from text.
Stars: ✭ 284 (+91.89%)
Micronet: micronet, a model compression and deployment library. Compression: (1) quantization: quantization-aware training (QAT), both high-bit (>2b: DoReFa, "Quantization and Training of Neural Networks for Efficient Integer-Arithmetic-Only Inference") and low-bit (≤2b)/ternary and binary (TWN/BNN/XNOR-Net), plus post-training quantization (PTQ), 8-bit (TensorRT); (2) pruning: normal, regular, and group convolutional channel pruning; (3) group convolution structure; (4) batch-normalization fusion for quantization. Deployment: TensorRT, fp32/fp16/int8 (PTQ calibration), op adaptation (upsample), dynamic shape.
Stars: ✭ 1,232 (+732.43%)
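The post-training quantization mentioned above rests on one core idea: map floats to small integers with a scale and zero point. The sketch below shows affine uniform 8-bit quantization in plain Python; it is an illustration of the general technique, not micronet's actual code, and `quantize`/`dequantize` are hypothetical helpers.

```python
def quantize(xs, num_bits=8):
    """Affine (asymmetric) uniform quantization of floats to integers."""
    qmin, qmax = 0, 2 ** num_bits - 1
    lo, hi = min(xs), max(xs)
    scale = (hi - lo) / (qmax - qmin) or 1.0   # guard against a constant input
    zero_point = round(qmin - lo / scale)       # integer that represents 0.0
    q = [max(qmin, min(qmax, round(x / scale) + zero_point)) for x in xs]
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    """Map quantized integers back to approximate floats."""
    return [(qi - zero_point) * scale for qi in q]

weights = [-1.0, -0.5, 0.0, 0.5, 1.0]
q, scale, zp = quantize(weights)
approx = dequantize(q, scale, zp)   # each value recovered to within one scale step
```

PTQ calibration, as in the TensorRT path the entry mentions, amounts to choosing `lo`/`hi` from observed activation statistics instead of the raw min/max.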
Dl Nlp Readings: My reading lists for deep learning and natural language processing.
Stars: ✭ 656 (+343.24%)
Neuroner: Named-entity recognition using neural networks. Easy to use, with state-of-the-art results.
Stars: ✭ 1,579 (+966.89%)
Etagger: Reference TensorFlow code for named entity tagging.
Stars: ✭ 100 (-32.43%)
Model Optimization: A toolkit to optimize ML models for deployment with Keras and TensorFlow, including quantization and pruning.
Stars: ✭ 992 (+570.27%)
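Magnitude-based pruning, one of the techniques such toolkits provide, boils down to zeroing the smallest-magnitude weights. A minimal plain-Python sketch of that idea (illustrative only, not the toolkit's API; `magnitude_prune` is a hypothetical name):

```python
def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude fraction `sparsity` of the weights."""
    k = int(len(weights) * sparsity)            # number of weights to drop
    if k == 0:
        return list(weights)
    # k-th smallest absolute value becomes the pruning threshold
    threshold = sorted(abs(w) for w in weights)[k - 1]
    # note: ties at the threshold may zero slightly more than k weights
    return [0.0 if abs(w) <= threshold else w for w in weights]

weights = [0.9, -0.05, 0.4, 0.01, -0.7, 0.2]
pruned = magnitude_prune(weights, 0.5)          # keeps the 3 largest magnitudes
```

Real toolkits typically apply this schedule gradually during training and per layer, but the selection criterion is the same.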
trinity-ie: Information extraction pipeline containing coreference resolution, named entity linking, and relationship extraction.
Stars: ✭ 59 (-60.14%)
Stanza: Official Stanford NLP Python library for many human languages.
Stars: ✭ 5,887 (+3877.7%)
2020CCF-NER: 7th-place solution for the privacy-information recognition task on unstructured business text in the 2020 CCF Big Data & Computing Intelligence Contest.
Stars: ✭ 66 (-55.41%)
neji: Flexible and powerful platform for biomedical information extraction from text.
Stars: ✭ 37 (-75%)
Kobert: Korean BERT, pre-trained, cased (KoBERT).
Stars: ✭ 591 (+299.32%)
Keras Gpt 2: Load GPT-2 checkpoints and generate text.
Stars: ✭ 113 (-23.65%)
slotminer: Tool for slot extraction from text.
Stars: ✭ 15 (-89.86%)
Black-Box-Tuning: ICML 2022 paper "Black-Box Tuning for Language-Model-as-a-Service".
Stars: ✭ 99 (-33.11%)
Full stack transformer: PyTorch library for end-to-end transformer model training, inference, and serving.
Stars: ✭ 71 (-52.03%)
InformationExtractionSystem: An information extraction system that performs NLP tasks such as named entity recognition, sentence simplification, and relation extraction.
Stars: ✭ 27 (-81.76%)
Hanlp: Chinese word segmentation, part-of-speech tagging, named entity recognition, dependency parsing, constituency parsing, semantic dependency parsing, semantic role labeling, coreference resolution, style transfer, semantic similarity, new word discovery, keyphrase extraction, automatic summarization, text classification and clustering, pinyin and simplified/traditional Chinese conversion; natural language processing.
Stars: ✭ 24,626 (+16539.19%)
open clip: An open-source implementation of CLIP.
Stars: ✭ 1,534 (+936.49%)
Ner Annotator: Named entity recognition (NER) annotation tool for spaCy. Generates training data as JSON that can be used directly.
Stars: ✭ 127 (-14.19%)
Deberta: The implementation of DeBERTa.
Stars: ✭ 541 (+265.54%)
cscg: Code generation as a dual task of code summarization.
Stars: ✭ 28 (-81.08%)
Seq2annotation: A general sequence labeling library based on TensorFlow & PaddlePaddle (currently including BiLSTM+CRF, Stacked-BiLSTM+CRF, and IDCNN+CRF, with more algorithms being added) that implements sequence labeling tasks such as Chinese word segmentation (tokenization), part-of-speech (POS) tagging, and named entity recognition (NER).
Stars: ✭ 70 (-52.7%)
verseagility: Ramp up your custom natural language processing (NLP) task: bring your own data, use your preferred frameworks, and bring models into production.
Stars: ✭ 23 (-84.46%)
Ner Lstm: Named entity recognition using a multilayered bidirectional LSTM.
Stars: ✭ 532 (+259.46%)
RerankNER: Neural reranking for named entity recognition; accepted as a regular paper at RANLP 2017.
Stars: ✭ 22 (-85.14%)
Nlp Papers: Papers and books to read when starting out in NLP 📚
Stars: ✭ 111 (-25%)
Shukongdashi: An expert system for fault diagnosis in the CNC (computer numerical control) domain, built in Python using knowledge graphs, natural language processing, convolutional neural networks, and related techniques.
Stars: ✭ 109 (-26.35%)
ESNAC: Learnable Embedding Space for Efficient Neural Architecture Compression.
Stars: ✭ 27 (-81.76%)
Cross Domain ner: Cross-domain NER using cross-domain language modeling; code for an ACL 2019 paper.
Stars: ✭ 67 (-54.73%)
ATMC: [NeurIPS 2019] Shupeng Gui, Haotao Wang, Haichuan Yang, Chen Yu, Zhangyang Wang, Ji Liu, "Model Compression with Adversarial Robustness: A Unified Optimization Framework".
Stars: ✭ 41 (-72.3%)
Rnn For Joint Nlu: TensorFlow implementation of "Attention-Based Recurrent Neural Network Models for Joint Intent Detection and Slot Filling" (https://arxiv.org/abs/1609.01454).
Stars: ✭ 281 (+89.86%)
memex-gate: General Architecture for Text Engineering.
Stars: ✭ 47 (-68.24%)
Tokenizers: 💥 Fast, state-of-the-art tokenizers optimized for research and production.
Stars: ✭ 5,077 (+3330.41%)
Awesome Persian Nlp Ir: Curated list of Persian natural language processing and information retrieval tools and resources.
Stars: ✭ 460 (+210.81%)
Rnnsharp: RNNSharp is a deep recurrent neural network toolkit used for many kinds of tasks, such as sequence labeling and sequence-to-sequence learning. It is written in C# and based on .NET Framework 4.6 or above. RNNSharp supports many network types, such as forward and bidirectional networks and sequence-to-sequence networks, and different layer types, such as LSTM, softmax, sampled softmax, and others.
Stars: ✭ 277 (+87.16%)
Nni: An open-source AutoML toolkit for automating the machine learning lifecycle, including feature engineering, neural architecture search, model compression, and hyperparameter tuning.
Stars: ✭ 10,698 (+7128.38%)
Knowledge Distillation Pytorch: A PyTorch implementation for flexibly exploring deep and shallow knowledge distillation (KD) experiments.
Stars: ✭ 986 (+566.22%)
Bluebert: BlueBERT, pre-trained on PubMed abstracts and clinical notes (MIMIC-III).
Stars: ✭ 273 (+84.46%)
Boilerplate Dynet Rnn Lm: Boilerplate code for quickly getting set up to run language modeling experiments.
Stars: ✭ 37 (-75%)
Sequence tagging: Sequence tagging using BiLSTM-CRF, BERT, and other methods.
Stars: ✭ 263 (+77.7%)
Nlp Interview Notes: Study notes and materials for natural language processing (NLP) interview preparation, compiled by the authors from personal interviews and experience; currently a collection of interview questions from across the subfields of NLP.
Stars: ✭ 207 (+39.86%)
Mt Dnn: Multi-Task Deep Neural Networks for Natural Language Understanding.
Stars: ✭ 1,871 (+1164.19%)
Microexpnet: MicroExpNet, an extremely small and fast model for expression recognition from frontal face images.
Stars: ✭ 121 (-18.24%)
Universal Data Tool: Collaborate and label any type of data (images, text, or documents) in an easy web interface or desktop app.
Stars: ✭ 1,356 (+816.22%)