
easezyc / MetaHeac

Licence: other
This is an official implementation of "Learning to Expand Audience via Meta Hybrid Experts and Critics for Recommendation and Advertising" (KDD 2021).

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to MetaHeac

Meta Transfer Learning
TensorFlow and PyTorch implementation of "Meta-Transfer Learning for Few-Shot Learning" (CVPR2019)
Stars: ✭ 439 (+1119.44%)
Mutual labels:  transfer-learning, meta-learning
SIGIR2021 Conure
One Person, One Model, One World: Learning Continual User Representation without Forgetting
Stars: ✭ 23 (-36.11%)
Mutual labels:  recommendation, transfer-learning
Multitask Learning
Awesome Multitask Learning Resources
Stars: ✭ 361 (+902.78%)
Mutual labels:  transfer-learning, meta-learning
mailcoach-support
Questions and support for Mailcoach
Stars: ✭ 32 (-11.11%)
Mutual labels:  marketing, campaign
affiliate
Add affiliation tags to links automatically in the browser
Stars: ✭ 77 (+113.89%)
Mutual labels:  marketing, advertising
meta-learning-progress
Repository to track the progress in Meta-Learning (MtL), including the datasets and the current state-of-the-art for the most common MtL problems.
Stars: ✭ 26 (-27.78%)
Mutual labels:  transfer-learning, meta-learning
Transferlearning
Transfer learning / domain adaptation / domain generalization / multi-task learning etc. Papers, codes, datasets, applications, tutorials.-迁移学习
Stars: ✭ 8,481 (+23458.33%)
Mutual labels:  transfer-learning, meta-learning
pykale
Knowledge-Aware machine LEarning (KALE): accessible machine learning from multiple sources for interdisciplinary research, part of the 🔥PyTorch ecosystem
Stars: ✭ 381 (+958.33%)
Mutual labels:  transfer-learning, meta-learning
tag-manager
Website analytics, JavaScript error tracking + analytics, tag manager, data ingest endpoint creation (tracking pixels). GDPR + CCPA compliant.
Stars: ✭ 279 (+675%)
Mutual labels:  marketing, advertising
campaign-manager
The Campaign Management UI for RTB4Free, the open source bidder / DSP.
Stars: ✭ 24 (-33.33%)
Mutual labels:  campaign, advertising
WSDM2022-PTUPCDR
This is the official implementation of our paper Personalized Transfer of User Preferences for Cross-domain Recommendation (PTUPCDR), which has been accepted by WSDM2022.
Stars: ✭ 65 (+80.56%)
Mutual labels:  recommendation, transfer-learning
tamnun-ml
An easy to use open-source library for advanced Deep Learning and Natural Language Processing
Stars: ✭ 109 (+202.78%)
Mutual labels:  transfer-learning
Deep-Learning-Experiments-implemented-using-Google-Colab
Colab Compatible FastAI notebooks for NLP and Computer Vision Datasets
Stars: ✭ 16 (-55.56%)
Mutual labels:  transfer-learning
super-gradients
Easily train or fine-tune SOTA computer vision models with one open source training library
Stars: ✭ 429 (+1091.67%)
Mutual labels:  transfer-learning
LearningToCompare-Tensorflow
Tensorflow implementation for paper: Learning to Compare: Relation Network for Few-Shot Learning.
Stars: ✭ 17 (-52.78%)
Mutual labels:  meta-learning
translearn
Code implementation of the paper "With Great Training Comes Great Vulnerability: Practical Attacks against Transfer Learning", at USENIX Security 2018
Stars: ✭ 18 (-50%)
Mutual labels:  transfer-learning
TrainCaffeCustomDataset
Transfer learning in Caffe: example on how to train CaffeNet on custom dataset
Stars: ✭ 20 (-44.44%)
Mutual labels:  transfer-learning
mindware
An efficient open-source AutoML system for automating machine learning lifecycle, including feature engineering, neural architecture search, and hyper-parameter tuning.
Stars: ✭ 34 (-5.56%)
Mutual labels:  meta-learning
NaiveNASflux.jl
Your local Flux surgeon
Stars: ✭ 20 (-44.44%)
Mutual labels:  transfer-learning
django-gsheets
Django app for keeping models and google sheets synced
Stars: ✭ 50 (+38.89%)
Mutual labels:  marketing

Learning to Expand Audience via Meta Hybrid Experts and Critics for Recommendation and Advertising

This is an official implementation of "Learning to Expand Audience via Meta Hybrid Experts and Critics for Recommendation and Advertising", published at KDD 2021 (paper).

Introduction

In recommender systems and advertising platforms, marketers want to deliver products, content, or advertisements to potential audiences over media channels such as display, video, or social. Given a set of audiences or customers (seed users), the audience expansion technique (look-alike modeling) is a promising solution for identifying additional potential audiences who are similar to the seed users and likely to complete the business goal of the target campaign. However, look-alike modeling faces two challenges: (1) In practice, a company may run hundreds of marketing campaigns every day to promote content in completely different categories, e.g., sports, politics, society, so it is difficult to serve all campaigns with a single common model. (2) The seed set of a given campaign may cover only a limited number of users, so a customized model trained on such a seed set alone is prone to overfitting.

In this paper, to address these challenges, we propose a novel two-stage framework named Meta Hybrid Experts and Critics (MetaHeac), which has been deployed in the WeChat look-alike system. In the offline stage, a general model that captures the relationships among the various campaign tasks is trained from a meta-learning perspective on all existing campaigns. In the online stage, a customized model for a new campaign is learned from its seed set, starting from the general model. In both offline and online experiments, MetaHeac shows superior effectiveness for content marketing campaigns in recommender systems and for advertising campaigns on advertising platforms. Moreover, MetaHeac has been successfully deployed in WeChat for the promotion of both content and advertisements, leading to significant improvements in marketing quality.
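
As a rough illustration of the hybrid experts and critics described above, the following minimal PyTorch sketch mixes several shared expert networks with a learned gate and scores the mixed representation with several critic heads. The layer sizes, the softmax gate, and the simple averaging of the critic scores are illustrative assumptions, not the exact architecture in model.py.

import torch
import torch.nn as nn

class HybridExpertsCritics(nn.Module):
    # Illustrative sketch: num_expert shared experts, num_output critic heads.
    def __init__(self, input_dim, hidden_dim=64, num_expert=8, num_output=5):
        super().__init__()
        self.experts = nn.ModuleList(
            [nn.Sequential(nn.Linear(input_dim, hidden_dim), nn.ReLU())
             for _ in range(num_expert)]
        )
        self.gate = nn.Sequential(nn.Linear(input_dim, num_expert), nn.Softmax(dim=-1))
        self.critics = nn.ModuleList([nn.Linear(hidden_dim, 1) for _ in range(num_output)])

    def forward(self, x):
        expert_out = torch.stack([e(x) for e in self.experts], dim=1)  # (batch, E, hidden)
        weights = self.gate(x).unsqueeze(-1)                           # (batch, E, 1)
        mixed = (weights * expert_out).sum(dim=1)                      # (batch, hidden)
        scores = torch.stack([c(mixed) for c in self.critics], dim=1)  # (batch, C, 1)
        return torch.sigmoid(scores.mean(dim=1)).squeeze(-1)           # (batch,)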

Requirements

  • Python 3.6
  • PyTorch > 1.0
  • pandas
  • NumPy
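
If these packages are not already installed, the following pip command is an assumed way to set them up; choose the torch build that matches your CUDA version:

pip install torch pandas numpy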

File Structure

.
├── code
│   ├── main.py             # Entry function
│   ├── model.py            # Models
│   ├── metamodel.py        # Training Model from a meta-learning perspective
│   ├── readme.md
│   ├── run.py              # Training and Evaluating
│   └── utils.py            # Some auxiliary classes
└── data
    ├── process.py          # Preprocess the original data
    └── processed_data      # The folder to contain the processed data
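
The meta-learning training in metamodel.py corresponds to the offline stage described in the introduction. One common way to realize it is a first-order MAML-style loop: for each campaign task in a mini-batch, adapt a copy of the shared model on the task's support (seed) users, then use the adapted model's loss on the task's query users to update the shared model. The sketch below follows this recipe; the actual update rule, loss, and data layout in metamodel.py may differ.

import copy
import torch
import torch.nn.functional as F

def meta_train_step(model, meta_optimizer, tasks, inner_lr=0.01):
    # tasks: list of (support_x, support_y, query_x, query_y) tensors,
    # one tuple per campaign in the mini-batch (task_count tasks).
    meta_optimizer.zero_grad()
    for support_x, support_y, query_x, query_y in tasks:
        fast_model = copy.deepcopy(model)
        # Inner step: adapt to this campaign's support (seed) users.
        support_loss = F.binary_cross_entropy(fast_model(support_x), support_y)
        grads = torch.autograd.grad(support_loss, list(fast_model.parameters()))
        with torch.no_grad():
            for p, g in zip(fast_model.parameters(), grads):
                p.sub_(inner_lr * g)
        # Outer step: the adapted model's loss on the query users drives the meta-update.
        query_loss = F.binary_cross_entropy(fast_model(query_x), query_y)
        task_grads = torch.autograd.grad(query_loss, list(fast_model.parameters()))
        # First-order approximation: accumulate task gradients on the shared model.
        for p, g in zip(model.parameters(), task_grads):
            p.grad = g.clone() if p.grad is None else p.grad + g
    meta_optimizer.step()

In the online stage, the same inner-loop adaptation would be run once on a new campaign's seed set to obtain its customized model.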

Dataset

We use the Tencent Look-alike Dataset. You can download it from the following link: Tencent Look-alike Dataset. Then put the downloaded data in ./data.

The link to the preprocessed dataset: preprocessed data

You can use the following command to preprocess the dataset. The final data will be under ./data/processed_data.

python process.py
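
Putting the dataset steps together, the assumed workflow is as follows (the exact raw file names depend on the Tencent download); note that process.py lives in the data folder:

cd data
python process.py    # writes the processed files into ./data/processed_data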

Run

Parameter Configuration:

  • task_count: the number of tasks in a mini-batch (default: 5)
  • num_expert: the number of experts (default: 8)
  • num_output: the number of critics (default: 5)
  • seed: the random seed (default: 2020)
  • gpu: the index of the GPU to use (default: 0)
  • batchsize: the mini-batch size (default: 512)

You can run the model with the following command:

python main.py --task_count 5 --num_expert 8 --num_output 5 --batchsize 512
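
The parameters above map naturally onto a command-line interface like the following. This argparse sketch only illustrates how the documented flags and defaults line up; the actual argument handling in main.py may differ.

import argparse

parser = argparse.ArgumentParser(description="MetaHeac (illustrative flag parsing)")
parser.add_argument("--task_count", type=int, default=5, help="tasks per mini-batch")
parser.add_argument("--num_expert", type=int, default=8, help="number of experts")
parser.add_argument("--num_output", type=int, default=5, help="number of critics")
parser.add_argument("--seed", type=int, default=2020, help="random seed")
parser.add_argument("--gpu", type=int, default=0, help="index of the GPU to use")
parser.add_argument("--batchsize", type=int, default=512, help="mini-batch size")
args = parser.parse_args()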

Reference

Zhu, Yongchun, et al. "Learning to Expand Audience via Meta Hybrid Experts and Critics for Recommendation and Advertising." Proceedings of the 27th ACM SIGKDD Conference on Knowledge Discovery & Data Mining. 2021.

or in bibtex style:

@inproceedings{zhu2021learning,
  title={Learning to Expand Audience via Meta Hybrid Experts and Critics for Recommendation and Advertising},
  author={Zhu, Yongchun and Liu, Yudan and Xie, Ruobing and Zhuang, Fuzhen and Hao, Xiaobo and Ge, Kaikai and Zhang, Xu and Lin, Leyu and Cao, Juan},
  booktitle={Proceedings of the 27th ACM SIGKDD Conference on Knowledge Discovery \& Data Mining},
  pages={4005--4013},
  year={2021}
}