
linkedin / DeText

License: BSD-2-Clause
DeText: A Deep Neural Text Understanding Framework for Ranking and Classification Tasks

Programming Languages

Python

Projects that are alternatives to or similar to DeText

X Ray Classification
X-ray Images (Chest images) analysis and anomaly detection using Transfer learning with inception v2
Stars: ✭ 83 (-92.01%)
Mutual labels:  classification, deep-neural-networks
Paddlex
PaddlePaddle End-to-End Development Toolkit (a full-workflow deep learning development tool)
Stars: ✭ 3,399 (+227.14%)
Mutual labels:  classification, deep-neural-networks
Selfdrivingcar
A collection of all projects pertaining to different layers in the SDC software stack
Stars: ✭ 107 (-89.7%)
Mutual labels:  classification, deep-neural-networks
Invoicenet
Deep neural network to extract intelligent information from invoice documents.
Stars: ✭ 1,886 (+81.52%)
Mutual labels:  classification, deep-neural-networks
Rmdl
RMDL: Random Multimodel Deep Learning for Classification
Stars: ✭ 375 (-63.91%)
Mutual labels:  classification, deep-neural-networks
Deep Atrous Cnn Sentiment
Deep-Atrous-CNN-Text-Network: End-to-end word level model for sentiment analysis and other text classifications
Stars: ✭ 64 (-93.84%)
Mutual labels:  classification, deep-neural-networks
Glasses
High-quality Neural Networks for Computer Vision 😎
Stars: ✭ 138 (-86.72%)
Mutual labels:  classification, deep-neural-networks
Pointcnn
PointCNN: Convolution On X-Transformed Points (NeurIPS 2018)
Stars: ✭ 1,120 (+7.8%)
Mutual labels:  classification, deep-neural-networks
Caffe2 Ios
Caffe2 on iOS Real-time Demo. Test with Your Own Model and Photos.
Stars: ✭ 221 (-78.73%)
Mutual labels:  classification, deep-neural-networks
Sparse Evolutionary Artificial Neural Networks
Always sparse. Never dense. But never say never. A repository for the Adaptive Sparse Connectivity concept and its algorithmic instantiation, i.e. Sparse Evolutionary Training, to boost Deep Learning scalability on various aspects (e.g. memory and computational time efficiency, representation and generalization power).
Stars: ✭ 182 (-82.48%)
Mutual labels:  classification, deep-neural-networks
Guesslang
Detect the programming language of a source code
Stars: ✭ 159 (-84.7%)
Mutual labels:  classification, deep-neural-networks
Randwire tensorflow
tensorflow implementation of Exploring Randomly Wired Neural Networks for Image Recognition
Stars: ✭ 29 (-97.21%)
Mutual labels:  classification, deep-neural-networks
Servenet
Service Classification based on Service Description
Stars: ✭ 21 (-97.98%)
Mutual labels:  classification, deep-neural-networks
Constrained attention filter
(ECCV 2020) Tensorflow implementation of A Generic Visualization Approach for Convolutional Neural Networks
Stars: ✭ 36 (-96.54%)
Mutual labels:  classification, deep-neural-networks
Urban Sound Classification
Urban sound source tagging from an aggregation of four second noisy audio clips via 1D and 2D CNN (Xception)
Stars: ✭ 39 (-96.25%)
Mutual labels:  classification
Predicting Myers Briggs Type Indicator With Recurrent Neural Networks
Stars: ✭ 43 (-95.86%)
Mutual labels:  classification
Fullstackmachinelearning
Mostly free resources for end-to-end machine learning engineering, including open courses from CalTech, Columbia, Berkeley, MIT, and Stanford (in alphabetical order).
Stars: ✭ 39 (-96.25%)
Mutual labels:  deep-neural-networks
Timbl
TiMBL implements several memory-based learning algorithms.
Stars: ✭ 38 (-96.34%)
Mutual labels:  classification
Gradient Centralization Tensorflow
Instantly improve your training performance of TensorFlow models with just 2 lines of code!
Stars: ✭ 45 (-95.67%)
Mutual labels:  deep-neural-networks
Ludwig
Data-centric declarative deep learning framework
Stars: ✭ 8,018 (+671.7%)
Mutual labels:  deep-neural-networks

Supports Python 3.6 and 3.7; built on TensorFlow; BSD-2-Clause licensed.

DeText: A Deep Neural Text Understanding Framework

Relax like a sloth, let DeText do the understanding for you

What is it

DeText is a deep text understanding framework for NLP-related ranking, classification, and language generation tasks. It leverages semantic matching with deep neural networks to understand member intents in search and recommender systems. As a general NLP framework, DeText can currently be applied to many tasks, including search and recommendation ranking, multi-class classification, and query understanding. More details can be found in this blog post.

Highlight

Design principles for DeText framework:

  • Natural language understanding powered by state-of-the-art deep neural networks

    • Automatic feature extraction with deep models
    • End-to-end training
    • Interaction modeling between ranking sources and targets
  • A general framework with great flexibility to meet the requirements of different production applications.

    • Flexible deep model types
    • Multiple loss function choices
    • User defined source/target fields
    • Configurable network structure (layer sizes and #layers)
    • Tunable hyperparameters ...
  • Reaching a good balance between effectiveness and efficiency to meet industry requirements.

The framework

The DeText framework contains multiple components:

  • Word embedding layer: converts a sequence of words into a d × n matrix, where d is the embedding dimension and n is the text length.

  • Text encoding layer (CNN/BERT/LSTM): takes the word embedding matrix as input and maps the text into a fixed-length embedding. It is worth noting that we adopt representation-based methods over interaction-based methods, mainly because of computational complexity: for a source text of length m and a target text of length n, interaction-based methods cost at least O(mnd), one order higher than the max(O(md), O(nd)) of representation-based methods. For example, with m = n = 100 and d = 128, that is roughly 1.3M operations per pair versus about 13K.

  • Interaction layer: generates deep features based on the text embeddings. Many options are provided, such as concatenation, cosine similarity, etc.

  • Wide & Deep feature processing: combines the traditional (wide) features with the interaction (deep) features in a wide & deep fashion.

  • MLP layer: combines the wide features and deep features to produce the final score.

It is an end-to-end model where all the parameters are jointly updated to optimize the click probability.
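To make the flow concrete, below is a minimal Keras sketch of this representation-based architecture, assuming a shared CNN encoder and a cosine-similarity interaction. It is an illustration of the idea only, not DeText's implementation; all names, shapes, and sizes are made up for the example.

import tensorflow as tf
from tensorflow.keras import layers

# Illustrative sketch of the flow described above -- NOT DeText's actual code.
VOCAB_SIZE, EMB_DIM, MAX_LEN, NUM_WIDE = 10000, 64, 16, 10

# Inputs: tokenized source text, tokenized target text, traditional (wide) features.
query_ids = layers.Input(shape=(MAX_LEN,), dtype="int32", name="query_ids")
doc_ids = layers.Input(shape=(MAX_LEN,), dtype="int32", name="doc_ids")
wide_ftrs = layers.Input(shape=(NUM_WIDE,), name="wide_ftrs")

# 1) Word embedding layer: token ids -> a sequence of d-dimensional word vectors.
embed = layers.Embedding(VOCAB_SIZE, EMB_DIM)

# 2) Text encoding layer (a CNN here; DeText also supports BERT and LSTM):
#    maps each word embedding matrix to one fixed-length text embedding.
#    Using a single shared encoder is a simplification for this sketch.
encoder = tf.keras.Sequential([
    layers.Conv1D(filters=128, kernel_size=3, activation="relu"),
    layers.GlobalMaxPooling1D(),
])
query_emb = encoder(embed(query_ids))
doc_emb = encoder(embed(doc_ids))

# 3) Interaction layer: cosine similarity between the two text embeddings.
cosine = layers.Dot(axes=1, normalize=True)([query_emb, doc_emb])

# 4) Wide & deep processing: concatenate traditional features with deep features.
combined = layers.Concatenate()([wide_ftrs, cosine, query_emb, doc_emb])

# 5) MLP layer producing the final score (e.g., a click probability).
hidden = layers.Dense(64, activation="tanh")(combined)
score = layers.Dense(1, activation="sigmoid")(hidden)

model = tf.keras.Model([query_ids, doc_ids, wide_ftrs], score)
model.compile(optimizer="adam", loss="binary_crossentropy")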

Model Flexibility

DeText is a general ranking framework that offers great flexibility for clients to build customized networks for their own use cases:

  • LTR/classification layer: in-house LTR loss implementations, TF-Ranking LTR losses, and multi-class classification support.

  • MLP layer: customizable number of layers and number of dimensions.

  • Interaction layer: supports cosine similarity, outer product, Hadamard product, and concatenation (see the sketch after this list).

  • Text embedding layer: supports CNN, BERT, and LSTM language models, with customizable filters, layers, dimensions, etc.

  • Continuous feature normalization: element-wise scaling, value normalization.

  • Categorical feature processing: modeled as entity embeddings.
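For a rough illustration of those four interaction options, here is a sketch (not DeText's implementation; the (batch, dim) embedding shapes are an assumption):

import tensorflow as tf

def interaction_features(q, d):
    # q, d: source/target text embeddings of shape (batch, dim). Sketch only.
    # Cosine similarity: L2-normalize, then inner product -> (batch, 1)
    cosine = tf.reduce_sum(
        tf.nn.l2_normalize(q, axis=-1) * tf.nn.l2_normalize(d, axis=-1),
        axis=-1, keepdims=True)
    outer = tf.einsum("bi,bj->bij", q, d)  # outer product -> (batch, dim, dim)
    hadamard = q * d                       # element-wise product -> (batch, dim)
    concat = tf.concat([q, d], axis=-1)    # concatenation -> (batch, 2 * dim)
    return cosine, outer, hadamard, concat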

All of these can be customized via hyperparameters in the DeText template. Note that TF-Ranking is supported in the DeText framework, i.e., users can choose among the LTR losses and metrics defined in TF-Ranking.
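For a flavor of that hyperparameter-driven customization, here is a hypothetical configuration snippet. The parameter names below are illustrative assumptions, not DeText's actual flags; TRAINING.md documents the real ones.

# Hypothetical hyperparameter sketch -- names are illustrative only.
hparams = {
    "ftr_ext": "cnn",          # text encoder: "cnn", "bert", or "lstm"
    "num_hidden": [128, 64],   # MLP hidden layer sizes
    "emb_sim_func": "inner",   # interaction: inner / hadamard / concat / outer
    "ltr_loss_fn": "softmax",  # in-house or TF-Ranking LTR loss
    "num_wide": 10,            # number of traditional (wide) features
}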

How to use it

Setup dev environment

  1. Create & source your virtualenv, e.g.:
python3 -m venv venv
source venv/bin/activate
  2. Run setup for DeText:
python setup.py develop

Run tests

Run all tests:

pytest 

Check out the demo notebooks

notebooks/text_classification_demo.ipynb shows how to use DeText to train a production-ready multi-class text classification model, using a public query intent classification dataset. The notebook includes detailed steps on data preparation, model training, and model inference.

[TODO] Add a ranking demo notebook

DeText training manual

Users have full control when custom-designing DeText models. In the training manual (TRAINING.md), users can find information about the following:

  • Training data format and preparation
  • Key parameters to customize and train DeText models
  • Detailed information about all DeText training parameters for full customization

References

Please cite DeText in your publications if it helps your research:

@manual{guo-liu20-blog,
  author    = {Weiwei Guo and
               Xiaowei Liu and
               Sida Wang and 
               Huiji Gao and
               Bo Long},
  title     = {DeText: A Deep NLP Framework for Intelligent Text Understanding},
  url       = {https://engineering.linkedin.com/blog/2020/open-sourcing-detext},
  year      = {2020}
}

@inproceedings{guo-gao19-sigir,
  author    = {Weiwei Guo and
               Huiji Gao and
               Jun Shi and 
               Bo Long},
  title     = {Deep Natural Language Processing for Search Systems},
  booktitle = {ACM SIGIR 2019},
  year      = {2019}
}

@inproceedings{guo-gao19-kdd,
  author    = {Weiwei Guo and
               Huiji Gao and
               Jun Shi and 
               Bo Long and 
               Liang Zhang and
               Bee-Chung Chen and
               Deepak Agarwal},
  title     = {Deep Natural Language Processing for Search and Recommender Systems},
  booktitle = {ACM SIGKDD 2019},
  year      = {2019}
}

@inproceedings{guo-liu20-cikm,
  author    = {Weiwei Guo and
               Xiaowei Liu and
               Sida Wang and 
               Huiji Gao and
               Ananth Sankar and 
               Zimeng Yang and 
               Qi Guo and 
               Liang Zhang and
               Bo Long and 
               Bee-Chung Chen and 
               Deepak Agarwal},
  title     = {DeText: A Deep Text Ranking Framework with BERT},
  booktitle = {ACM CIKM 2020},
  year      = {2020}
}

@inproceedings{jia-long20,
  author    = {Jun Jia and
               Bo Long and
               Huiji Gao and 
               Weiwei Guo and 
               Jun Shi and
               Xiaowei Liu and
               Mingzhou Zhou and
               Zhoutong Fu and
               Sida Wang and
               Sandeep Kumar Jha},
  title     = {Deep Learning for Search and Recommender Systems in Practice},
  booktitle = {ACM SIGKDD 2020},
  year      = {2020}
}

@inproceedings{wang-guo20,
  author    = {Sida Wang and
               Weiwei Guo and
               Huiji Gao and
               Bo Long},
  title     = {Efficient Neural Query Auto Completion},
  booktitle = {ACM CIKM 2020},
  year      = {2020}
}

@article{liu-guo20,
  author    = {Xiaowei Liu and
               Weiwei Guo and
               Huiji Gao and
               Bo Long},
  title     = {Deep Search Query Intent Understanding},
  journal   = {arXiv preprint arXiv:2008.06759},
  year      = {2020}
}

Contributing

Please read CONTRIBUTING.md for details on our code of conduct, and the process for submitting pull requests to us.

License

This project is licensed under the BSD 2-Clause License - see the LICENSE.md file for details.
