Neural sp - End-to-end ASR/LM implementation with PyTorch
Stars: ✭ 408 (+206.77%)
NeuronBlocks - NLP DNN Toolkit: Building Your NLP DNN Models Like Playing Lego
Stars: ✭ 1,356 (+919.55%)
clfzoo - A library of deep text classifiers.
Stars: ✭ 37 (-72.18%)
dodrio - Exploring attention weights in transformer-based models with linguistic knowledge.
Stars: ✭ 233 (+75.19%)
Fasttext.py - A Python interface for Facebook fastText
Stars: ✭ 1,091 (+720.3%)
Multi-task-Conditional-Attention-Networks - A prototype version of our submitted paper: Conversion Prediction Using Multi-task Conditional Attention Networks to Support the Creation of Effective Ad Creatives.
Stars: ✭ 21 (-84.21%)
classifier multi label - multi-label, classifier, text classification, multi-label text classification, BERT, ALBERT, multi-label-classification
Stars: ✭ 127 (-4.51%)
Image Caption Generator - A neural network to generate captions for an image using CNN and RNN with beam search.
Stars: ✭ 126 (-5.26%)
alter-nlu - Natural language understanding library for chatbots with intent recognition and entity extraction.
Stars: ✭ 45 (-66.17%)
PaperRobot - Code for PaperRobot: Incremental Draft Generation of Scientific Ideas
Stars: ✭ 372 (+179.7%)
TextFeatureSelection - Python library for feature selection on text features. It provides filter methods, a genetic algorithm, and TextFeatureSelectionEnsemble for improving text classification models.
Stars: ✭ 42 (-68.42%)
Nlp Projects - word2vec, sentence2vec, machine reading comprehension, dialog system, text classification, pretrained language model (i.e., XLNet, BERT, ELMo, GPT), sequence labeling, information retrieval, information extraction (i.e., entity, relation and event extraction), knowledge graph, text generation, network embedding
Stars: ✭ 360 (+170.68%)
resolutions-2019 - A list of data mining and machine learning papers that I implemented in 2019.
Stars: ✭ 19 (-85.71%)
Dhf1k - Revisiting Video Saliency: A Large-scale Benchmark and a New Model (CVPR18, PAMI19)
Stars: ✭ 96 (-27.82%)
Simgnn - A PyTorch implementation of "SimGNN: A Neural Network Approach to Fast Graph Similarity Computation" (WSDM 2019).
Stars: ✭ 351 (+163.91%)
Artificial Adversary - 🗣️ Tool to generate adversarial text examples and test machine learning models against them
Stars: ✭ 348 (+161.65%)
Bdci2017 Minglue - BDCI2017 "Let AI Be the Judge" competition, 4th place in the finals (4/415). https://www.datafountain.cn/competitions/277/details
Stars: ✭ 118 (-11.28%)
comparable-text-miner - Comparable documents miner: Arabic-English morphological analysis, text processing, n-gram feature extraction, POS tagging, dictionary translation, document alignment, corpus information, text classification, tf-idf computation, text similarity computation, HTML document cleaning
Stars: ✭ 31 (-76.69%)
Transformer - A TensorFlow implementation of the Transformer: Attention Is All You Need
Stars: ✭ 3,646 (+2641.35%)
Textclassification - All kinds of neural text classifiers implemented with Keras
Stars: ✭ 51 (-61.65%)
deepnlp - NLP practice projects from my early days
Stars: ✭ 11 (-91.73%)
Keras Gat - Keras implementation of the graph attention networks (GAT) by Veličković et al. (2017; https://arxiv.org/abs/1710.10903)
Stars: ✭ 334 (+151.13%)
Compact-Global-Descriptor - PyTorch implementation of "Compact Global Descriptor for Neural Networks" (CGD).
Stars: ✭ 22 (-83.46%)
Eqtransformer - EQTransformer, a Python package for earthquake signal detection and phase picking using AI.
Stars: ✭ 95 (-28.57%)
Brain-Tumor-Segmentation - Attention-Guided Version of 2D UNet for Automatic Brain Tumor Segmentation
Stars: ✭ 125 (-6.02%)
Yolo Multi Backbones Attention - Model compression: YOLOv3 with multiple lightweight backbones (ShuffleNetV2, Huawei GhostNet), attention, pruning, and quantization
Stars: ✭ 317 (+138.35%)
seededlda - Semi-supervised LDA for theory-driven text analysis
Stars: ✭ 46 (-65.41%)
h-transformer-1d - Implementation of H-Transformer-1D, Hierarchical Attention for Sequence Learning
Stars: ✭ 121 (-9.02%)
Absa keras - Keras implementation of aspect-based sentiment analysis
Stars: ✭ 126 (-5.26%)
backprop - Backprop makes it simple to use, finetune, and deploy state-of-the-art ML models.
Stars: ✭ 229 (+72.18%)
Alphafold2 - To eventually become an unofficial PyTorch implementation / replication of AlphaFold2, as details of the architecture are released
Stars: ✭ 298 (+124.06%)
SequenceToSequence - A seq2seq dialogue/MT model with attention, implemented in TensorFlow.
Stars: ✭ 11 (-91.73%)
SiGAT - Source code for signed graph attention networks (ICANN 2019) & SDGNN (AAAI 2021)
Stars: ✭ 37 (-72.18%)
Attention is all you need - Transformer from "Attention Is All You Need" (Vaswani et al., 2017), implemented in Chainer.
Stars: ✭ 303 (+127.82%)
axial-attention - Implementation of axial attention: attending to multi-dimensional data efficiently
Stars: ✭ 245 (+84.21%)
Nlp Recipes - Natural Language Processing Best Practices & Examples
Stars: ✭ 5,783 (+4248.12%)
Video-Cap - 🎬 Video Captioning: ICCV '15 paper implementation
Stars: ✭ 44 (-66.92%)
Text Cnn - Chinese text classification with a CNN using Word2vec word embeddings
Stars: ✭ 298 (+124.06%)
Nlp estimator tutorial - Educational material on using the TensorFlow Estimator framework for text classification
Stars: ✭ 131 (-1.5%)
Abstractive Summarization - Implementation of abstractive summarization using LSTMs in an encoder-decoder architecture with local attention.
Stars: ✭ 128 (-3.76%)
Ml Projects - ML-based projects such as spam classification, time series analysis, and text classification using random forests, deep learning, Bayesian methods, and XGBoost in Python
Stars: ✭ 127 (-4.51%)
Drln - Densely Residual Laplacian Super-Resolution, IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI), 2020
Stars: ✭ 120 (-9.77%)
Kadot - Kadot, the unsupervised natural language processing library.
Stars: ✭ 108 (-18.8%)