
zxlzr / Fewshotnlp

The source code of the papers "Improving Few-shot Text Classification via Pretrained Language Representations" and "When Low Resource NLP Meets Unsupervised Language Model: Meta-pretraining Then Meta-learning for Few-shot Text Classification".

Programming Languages

python

Projects that are alternatives to or similar to Fewshotnlp

Transferlearning
Transfer learning / domain adaptation / domain generalization / multi-task learning etc. Papers, code, datasets, applications, tutorials. (迁移学习: transfer learning)
Stars: ✭ 8,481 (+7274.78%)
Mutual labels:  meta-learning
G-Meta
Graph meta learning via local subgraphs (NeurIPS 2020)
Stars: ✭ 50 (-56.52%)
Mutual labels:  meta-learning
R2D2
[ICLR'19] Meta-learning with differentiable closed-form solvers
Stars: ✭ 96 (-16.52%)
Mutual labels:  meta-learning
MT-Net
Code accompanying the ICML-2018 paper "Gradient-Based Meta-Learning with Learned Layerwise Metric and Subspace"
Stars: ✭ 30 (-73.91%)
Mutual labels:  meta-learning
L2P-GNN
Code and datasets for the AAAI-2021 paper "Learning to Pre-train Graph Neural Networks"
Stars: ✭ 48 (-58.26%)
Mutual labels:  meta-learning
Neural Process Family
Code for the Neural Processes website and replication of 4 papers on NPs. PyTorch implementation.
Stars: ✭ 53 (-53.91%)
Mutual labels:  meta-learning
HCN-PrototypeLoss-PyTorch
Hierarchical Co-occurrence Network with Prototype Loss for Few-shot Learning (PyTorch)
Stars: ✭ 17 (-85.22%)
Mutual labels:  meta-learning
What I Have Read
Paper Lists, Notes and Slides, Focus on NLP. For summarization, please refer to https://github.com/xcfcode/Summarization-Papers
Stars: ✭ 110 (-4.35%)
Mutual labels:  meta-learning
MultiDigitMNIST
Combine multiple MNIST digits to create datasets with 100/1000 classes for few-shot learning/meta-learning
Stars: ✭ 48 (-58.26%)
Mutual labels:  meta-learning
pytorch-meta
A collection of extensions and data-loaders for few-shot learning & meta-learning in PyTorch
Stars: ✭ 1,239 (+977.39%)
Mutual labels:  meta-learning
Few Shot Text Classification
Few-shot binary text classification with Induction Networks and Word2Vec weights initialization
Stars: ✭ 32 (-72.17%)
Mutual labels:  meta-learning
MAML-TF
TensorFlow implementation of MAML
Stars: ✭ 44 (-61.74%)
Mutual labels:  meta-learning
Memory-Efficient MAML
Memory efficient MAML using gradient checkpointing
Stars: ✭ 60 (-47.83%)
Mutual labels:  meta-learning
MFE
Meta-Feature Extractor
Stars: ✭ 20 (-82.61%)
Mutual labels:  meta-learning
Gnn Meta Attack
Implementation of the paper "Adversarial Attacks on Graph Neural Networks via Meta Learning".
Stars: ✭ 99 (-13.91%)
Mutual labels:  meta-learning
Looper
A resource list for causality in statistics, data science and physics
Stars: ✭ 23 (-80%)
Mutual labels:  meta-learning
Meta Learning Bert
Meta learning with BERT as a learner
Stars: ✭ 52 (-54.78%)
Mutual labels:  meta-learning
Meta Blocks
A modular toolbox for meta-learning research with a focus on speed and reproducibility.
Stars: ✭ 110 (-4.35%)
Mutual labels:  meta-learning
MAXL
The implementation of "Self-Supervised Generalisation with Meta Auxiliary Learning" [NeurIPS 2019].
Stars: ✭ 101 (-12.17%)
Mutual labels:  meta-learning
learn2learn
A PyTorch Library for Meta-learning Research
Stars: ✭ 1,193 (+937.39%)
Mutual labels:  meta-learning

Meta-pretraining Then Meta-learning (MTM) Model for Few-Shot NLP Tasks



If you use the code, please cite the following paper:

@inproceedings{deng2020low,
  title={When Low Resource NLP Meets Unsupervised Language Model: Meta-Pretraining then Meta-Learning for Few-Shot Text Classification (Student Abstract)},
  author={Deng, Shumin and Zhang, Ningyu and Sun, Zhanlin and Chen, Jiaoyan and Chen, Huajun},
  booktitle={AAAI},
  pages={13773--13774},
  year={2020}
}
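The meta-learning stage of approaches like MTM trains on episodes: for each episode, N classes are sampled, K labeled examples per class form a support set, and the model is evaluated on a held-out query set. The sketch below illustrates this episodic setup with a simple nearest-prototype classifier over fixed embedding vectors. It is not the paper's implementation: function names (`sample_episode`, `prototype_predict`) are illustrative, and plain NumPy vectors stand in for the pretrained language representations the paper would use.

```python
import numpy as np

def sample_episode(data, n_way=3, k_shot=2, n_query=2, rng=None):
    """Sample an N-way K-shot episode from a dict {label: [embedding, ...]}.

    Returns a support set of shape (n_way, k_shot, dim), a query set of
    shape (n_way * n_query, dim), and the query labels (episode-local
    class indices 0..n_way-1).
    """
    rng = rng or np.random.default_rng(0)
    classes = rng.choice(sorted(data), size=n_way, replace=False)
    support, query, query_labels = [], [], []
    for i, c in enumerate(classes):
        idx = rng.permutation(len(data[c]))[: k_shot + n_query]
        examples = np.asarray(data[c])[idx]
        support.append(examples[:k_shot])   # K labeled shots per class
        query.append(examples[k_shot:])     # held-out evaluation examples
        query_labels += [i] * n_query
    return np.stack(support), np.concatenate(query), np.array(query_labels)

def prototype_predict(support, query):
    """Assign each query embedding to the nearest class prototype
    (the mean of that class's support embeddings)."""
    prototypes = support.mean(axis=1)                          # (n_way, dim)
    dists = np.linalg.norm(query[:, None] - prototypes[None], axis=-1)
    return dists.argmin(axis=1)
```

In a full pipeline, the embeddings would come from the meta-pretrained encoder, and the per-episode classification loss on the query set would drive the meta-learning updates.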