qiaoguan / Deep Ctr Prediction

CTR prediction models for ad recommendation, based on deep learning


Projects that are alternatives to or similar to Deep Ctr Prediction

Rezero
Official PyTorch Repo for "ReZero is All You Need: Fast Convergence at Large Depth"
Stars: ✭ 317 (-49.52%)
Mutual labels:  resnet, transformer
Eeg Dl
A Deep Learning library for EEG Tasks (Signals) Classification, based on TensorFlow.
Stars: ✭ 165 (-73.73%)
Mutual labels:  resnet, transformer
Former
Simple transformer implementation from scratch in pytorch.
Stars: ✭ 500 (-20.38%)
Mutual labels:  transformer
Cifar Zoo
PyTorch implementation of CNNs for CIFAR benchmark
Stars: ✭ 584 (-7.01%)
Mutual labels:  resnet
Medicalzoopytorch
A pytorch-based deep learning framework for multi-modal 2D/3D medical image segmentation
Stars: ✭ 546 (-13.06%)
Mutual labels:  resnet
Tf Tutorials
A collection of deep learning tutorials using Tensorflow and Python
Stars: ✭ 524 (-16.56%)
Mutual labels:  resnet
Speech Transformer
A PyTorch implementation of Speech Transformer, an End-to-End ASR with Transformer network on Mandarin Chinese.
Stars: ✭ 565 (-10.03%)
Mutual labels:  transformer
Nmt Keras
Neural Machine Translation with Keras
Stars: ✭ 501 (-20.22%)
Mutual labels:  transformer
Wenet
Production First and Production Ready End-to-End Speech Recognition Toolkit
Stars: ✭ 617 (-1.75%)
Mutual labels:  transformer
Android Viewpager Transformers
A collection of view pager transformers
Stars: ✭ 546 (-13.06%)
Mutual labels:  transformer
React Native Svg Transformer
Import SVG files in your React Native project the same way that you would in a Web application.
Stars: ✭ 568 (-9.55%)
Mutual labels:  transformer
Video Classification
Tutorial for video classification/ action recognition using 3D CNN/ CNN+RNN on UCF101
Stars: ✭ 543 (-13.54%)
Mutual labels:  resnet
Mmclassification
OpenMMLab Image Classification Toolbox and Benchmark
Stars: ✭ 532 (-15.29%)
Mutual labels:  resnet
Awesome Bert Nlp
A curated list of NLP resources focused on BERT, attention mechanism, Transformer networks, and transfer learning.
Stars: ✭ 567 (-9.71%)
Mutual labels:  transformer
Rust Bert
Rust native ready-to-use NLP pipelines and transformer-based models (BERT, DistilBERT, GPT2,...)
Stars: ✭ 510 (-18.79%)
Mutual labels:  transformer
Typescript Is
Stars: ✭ 595 (-5.25%)
Mutual labels:  transformer
Lightseq
LightSeq: A High Performance Inference Library for Sequence Processing and Generation
Stars: ✭ 501 (-20.22%)
Mutual labels:  transformer
Athena
An open-source implementation of a sequence-to-sequence based speech processing engine
Stars: ✭ 542 (-13.69%)
Mutual labels:  transformer
Bert paper chinese translation
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding 论文的中文翻译 Chinese Translation!
Stars: ✭ 564 (-10.19%)
Mutual labels:  transformer
Awesome Fast Attention
list of efficient attention modules
Stars: ✭ 627 (-0.16%)
Mutual labels:  transformer

deep-ctr-prediction

DNN models for advertising algorithms (CTR prediction)

  • wide&deep (see official/wide_deep)

  • deep&cross

  • deepfm

  • ESMM

  • Deep Interest Network

  • ResNet

  • xDeepFM

  • AFM(Attentional FM)

  • Transformer

  • FiBiNET
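Several of the models above (deepfm, xDeepFM, AFM) share the factorization-machine (FM) second-order term. As a hedged illustration (not code from this repo), the pairwise interaction sum Σ_{i<j} ⟨v_i, v_j⟩ x_i x_j can be computed in O(n·k) instead of O(n²·k) via the identity 0.5 · Σ_f [(Σ_i v_{if} x_i)² − Σ_i v_{if}² x_i²]:

```python
def fm_second_order(xs, vs):
    """FM second-order term via the squared-sum trick.

    xs: list of feature values x_i; vs: list of k-dim embedding vectors v_i.
    Computes 0.5 * sum_f ((sum_i v_if*x_i)^2 - sum_i (v_if*x_i)^2),
    which equals the sum of <v_i, v_j> * x_i * x_j over all pairs i < j.
    """
    k = len(vs[0])
    total = 0.0
    for f in range(k):
        sum_vx = sum(v[f] * x for v, x in zip(vs, xs))
        sum_vx_sq = sum((v[f] * x) ** 2 for v, x in zip(vs, xs))
        total += sum_vx ** 2 - sum_vx_sq
    return 0.5 * total


def fm_naive(xs, vs):
    """Brute-force O(n^2 * k) reference: sum over all pairs i < j."""
    n, k = len(xs), len(vs[0])
    return sum(
        sum(vs[i][f] * vs[j][f] for f in range(k)) * xs[i] * xs[j]
        for i in range(n) for j in range(i + 1, n)
    )
```

The linear-time form is what makes FM-style cross terms cheap enough to sit alongside a deep tower in DeepFM; AFM replaces the plain sum over pairs with attention-weighted pair terms.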

All code is built with the tf.estimator API. Data is stored in TFRecord format (as key:value dictionaries) and read with the tf.data API to speed up IO, which makes the pipeline suitable for industrial applications. Feature engineering is defined in input_fn and the model in model_fn, so feature code and model code are fully separated: to change the features, modify only input_fn; to change the model, modify only model_fn. By default the data lives on Hadoop, but it can be stored locally as needed. For feature engineering and data processing, see Google's open-source wide&deep model (which does not use the TFRecord format; code in official/wide_deep).
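TensorFlow specifics aside, the contract this paragraph describes — input_fn owns feature engineering and yields (features, label) pairs, while model_fn only consumes named features — can be sketched in plain Python. All names below (fields, weights) are hypothetical illustrations, not taken from this repo:

```python
import math

# Raw records as key:value dictionaries, standing in for parsed TFRecord examples.
RAW_RECORDS = [
    {"age": 25.0, "clicks": 3.0, "label": 1.0},
    {"age": 40.0, "clicks": 0.0, "label": 0.0},
]


def input_fn(records):
    """Feature engineering lives here: select/transform fields, emit (features, label)."""
    for r in records:
        features = {"age_norm": r["age"] / 100.0, "clicks": r["clicks"]}
        yield features, r["label"]


def model_fn(features, weights, bias):
    """Model code lives here: it sees only the named features produced by input_fn."""
    logit = bias + sum(weights[name] * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-logit))  # predicted CTR (logistic output)


weights = {"age_norm": -0.5, "clicks": 1.2}  # illustrative, untrained values
preds = [model_fn(f, weights, bias=0.1) for f, _ in input_fn(RAW_RECORDS)]
```

Because the two functions communicate only through the feature dictionary, adding or changing a feature touches input_fn alone, and swapping the model (wide&deep, deepfm, ...) touches model_fn alone — the same separation tf.estimator enforces.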

Requirements

  • TensorFlow 1.10

References

[1] Heng-Tze Cheng, Levent Koc, et al. "Wide & Deep Learning for Recommender Systems." In 1st Workshop on Deep Learning for Recommender Systems, 2016.

[2] Huifeng Guo, et al. "DeepFM: A Factorization-Machine based Neural Network for CTR Prediction." In IJCAI, 2017.

[3] Ruoxi Wang, et al. "Deep & Cross Network for Ad Click Predictions." In ADKDD, 2017.

[4] Xiao Ma, et al. "Entire Space Multi-Task Model: An Effective Approach for Estimating Post-Click Conversion Rate." In SIGIR, 2018.

[5] Guorui Zhou, et al. "Deep Interest Network for Click-Through Rate Prediction." In KDD, 2018.

[6] Kaiming He, et al. "Deep Residual Learning for Image Recognition." In CVPR, 2016.

[7] Jianxun Lian, et al. "xDeepFM: Combining Explicit and Implicit Feature Interactions for Recommender Systems." In KDD, 2018.

[8] Jun Xiao, et al. "Attentional Factorization Machines: Learning the Weight of Feature Interactions via Attention Networks." In IJCAI, 2017.

[9] Ashish Vaswani, et al. "Attention Is All You Need." In NIPS, 2017.

[10] Tongwen Huang, et al. "FiBiNET: Combining Feature Importance and Bilinear Feature Interaction for Click-Through Rate Prediction." In RecSys, 2019.
