
RL-VIG / LibFewShot

License: MIT
LibFewShot: A Comprehensive Library for Few-shot Learning.

Programming Languages

Python

Projects that are alternatives of or similar to LibFewShot

FSL-Mate
FSL-Mate: A collection of resources for few-shot learning (FSL).
Stars: ✭ 1,346 (+113.99%)
Mutual labels:  meta-learning, few-shot-learning
SCL
📄 Spatial Contrastive Learning for Few-Shot Classification (ECML/PKDD 2021).
Stars: ✭ 42 (-93.32%)
Mutual labels:  image-classification, few-shot-learning
Meta-TTS
Official repository of https://arxiv.org/abs/2111.04040v1
Stars: ✭ 69 (-89.03%)
Mutual labels:  meta-learning, few-shot-learning
FUSION
PyTorch code for NeurIPSW 2020 paper (4th Workshop on Meta-Learning) "Few-Shot Unsupervised Continual Learning through Meta-Examples"
Stars: ✭ 18 (-97.14%)
Mutual labels:  meta-learning, few-shot-learning
sinkhorn-label-allocation
Sinkhorn Label Allocation is a label assignment method for semi-supervised self-training algorithms. The SLA algorithm is described in full in this ICML 2021 paper: https://arxiv.org/abs/2102.08622.
Stars: ✭ 49 (-92.21%)
Mutual labels:  image-classification, few-shot-learning
sib meta learn
Code of Empirical Bayes Transductive Meta-Learning with Synthetic Gradients
Stars: ✭ 56 (-91.1%)
Mutual labels:  meta-learning, few-shot-learning
CDFSL-ATA
[IJCAI 2021] Cross-Domain Few-Shot Classification via Adversarial Task Augmentation
Stars: ✭ 21 (-96.66%)
Mutual labels:  meta-learning, few-shot-learning
Awesome-Few-shot
Awesome Few-shot learning
Stars: ✭ 50 (-92.05%)
Mutual labels:  meta-learning, few-shot-learning
Skin Lesions Classification DCNNs
Transfer Learning with DCNNs (DenseNet, Inception V3, Inception-ResNet V2, VGG16) for skin lesions classification
Stars: ✭ 47 (-92.53%)
Mutual labels:  image-classification, fine-tuning
Meta Learning Papers
Meta Learning / Learning to Learn / One Shot Learning / Few Shot Learning
Stars: ✭ 2,420 (+284.74%)
Mutual labels:  meta-learning, few-shot-learning
simple-cnaps
Source codes for "Improved Few-Shot Visual Classification" (CVPR 2020), "Enhancing Few-Shot Image Classification with Unlabelled Examples" (WACV 2022), and "Beyond Simple Meta-Learning: Multi-Purpose Models for Multi-Domain, Active and Continual Few-Shot Learning" (Neural Networks 2022 - in submission)
Stars: ✭ 88 (-86.01%)
Mutual labels:  meta-learning, few-shot-learning
Metaoptnet
Meta-Learning with Differentiable Convex Optimization (CVPR 2019 Oral)
Stars: ✭ 412 (-34.5%)
Mutual labels:  image-classification, meta-learning
Learning-To-Compare-For-Text
Learning To Compare For Text , Few shot learning in text classification
Stars: ✭ 38 (-93.96%)
Mutual labels:  meta-learning, few-shot-learning
LearningToCompare-Tensorflow
Tensorflow implementation for paper: Learning to Compare: Relation Network for Few-Shot Learning.
Stars: ✭ 17 (-97.3%)
Mutual labels:  meta-learning, few-shot-learning
awesome-few-shot-meta-learning
awesome few shot / meta learning papers
Stars: ✭ 44 (-93%)
Mutual labels:  meta-learning, few-shot-learning
MeTAL
Official PyTorch implementation of "Meta-Learning with Task-Adaptive Loss Function for Few-Shot Learning" (ICCV2021 Oral)
Stars: ✭ 24 (-96.18%)
Mutual labels:  meta-learning, few-shot-learning
finetuner
Finetuning any DNN for better embedding on neural search tasks
Stars: ✭ 442 (-29.73%)
Mutual labels:  fine-tuning, few-shot-learning
Transferlearning
Transfer learning / domain adaptation / domain generalization / multi-task learning etc. Papers, codes, datasets, applications, tutorials.-迁移学习
Stars: ✭ 8,481 (+1248.33%)
Mutual labels:  meta-learning, few-shot-learning
backprop
Backprop makes it simple to use, finetune, and deploy state-of-the-art ML models.
Stars: ✭ 229 (-63.59%)
Mutual labels:  image-classification, fine-tuning
Multidigitmnist
Combine multiple MNIST digits to create datasets with 100/1000 classes for few-shot learning/meta-learning
Stars: ✭ 48 (-92.37%)
Mutual labels:  image-classification, meta-learning

LibFewShot

Make few-shot learning easy.

LibFewShot: A Comprehensive Library for Few-shot Learning. Wenbin Li, Ziyi Wang, Xuesong Yang, Chuanqi Dong, Pinzhuo Tian, Tiexin Qin, Jing Huo, Yinghuan Shi, Lei Wang, Yang Gao, Jiebo Luo. In arXiv 2022.

Supported Methods

Non-episodic methods (a.k.a. fine-tuning based methods)

Meta-learning based methods

Metric-learning based methods
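
To make the metric-learning family concrete: a Prototypical-Networks-style method classifies each query sample by its distance to class prototypes, i.e. the mean of the support embeddings for each class. The sketch below is a minimal NumPy mock of a single 5-way 1-shot episode, not LibFewShot's implementation; the random arrays stand in for features a backbone (e.g. Conv64F) would produce.

```python
import numpy as np

rng = np.random.default_rng(0)

way, shot, query, dim = 5, 1, 3, 64  # 5-way 1-shot, 3 queries per class

# Stand-in embeddings, as if produced by a feature-extraction backbone.
support = rng.normal(size=(way, shot, dim))   # support set: (way, shot, dim)
query_x = rng.normal(size=(way, query, dim))  # query set: (way, query, dim)
query_y = np.repeat(np.arange(way), query)    # ground-truth query labels

# Class prototypes: mean embedding of each class's support samples.
prototypes = support.mean(axis=1)             # (way, dim)

# Squared Euclidean distance from every query to every prototype.
q = query_x.reshape(-1, dim)                  # (way * query, dim)
dists = ((q[:, None, :] - prototypes[None, :, :]) ** 2).sum(-1)
pred = dists.argmin(axis=1)                   # nearest-prototype prediction

accuracy = (pred == query_y).mean()
print(f"episode accuracy: {accuracy:.2f}")
```

With random embeddings the accuracy is near chance; in a real episodic pipeline the backbone is trained so that same-class embeddings cluster around their prototype.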

Quick Installation

Please refer to install.md (安装, Chinese version) for installation instructions.

Complete tutorials can be found in the documentation (中文文档, Chinese version).

Reproduction

We provide validated configs in ./reproduce; please refer to ./reproduce/<Method_Name>/README.md for further information. The symbols mean the following:

📖 Accuracies reported in the original papers.

💻 Accuracies reproduced by ourselves.

⬇️ Hyperlinks to download the checkpoint folder (containing config.yaml, model_best.pth, and the train/test logs).

📋 Hyperlinks to the config file.

You can also find these checkpoints at model_zoo.
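
For orientation, a reproduction config typically pins the episode shape, backbone, and classifier. The fragment below is illustrative only: the field names are assumptions modeled on common few-shot configuration layouts, so consult the config.yaml shipped with each checkpoint for the real keys and values.

```yaml
# Hypothetical excerpt -- check the shipped config.yaml for the actual schema.
way_num: 5        # N-way: classes per episode
shot_num: 1       # K-shot: support samples per class
query_num: 15     # query samples per class
backbone:
  name: Conv64F   # feature-extraction network
classifier:
  name: ProtoNet  # few-shot method to run
```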

Datasets

Caltech-UCSD Birds-200-2011, Stanford Cars, Stanford Dogs, miniImageNet, and tieredImageNet are available on Google Drive and Baidu Netdisk (百度网盘, extraction code: yr1w).

Contributing

Contributions of any features or enhancements are welcome; please follow the PEP 8 coding style. See contributing.md (贡献代码, Chinese version) for the contributing guidelines.

License

This project is licensed under the MIT License. See LICENSE for more details.

Acknowledgement

LibFewShot is an open-source project designed to help few-shot learning researchers quickly understand the classic methods and their code structures. We welcome other contributors to use this framework to implement their own or other notable methods and add them to LibFewShot. This library may only be used for academic research. We welcome any feedback while you use LibFewShot and will do our best to continually improve the library.

Citation

If you use this code for your research, please cite our paper.

@article{li2021LibFewShot,
  title={LibFewShot: A Comprehensive Library for Few-shot Learning},
  author={Li, Wenbin and Wang, Ziyi and Yang, Xuesong and Dong, Chuanqi and Tian, Pinzhuo and Qin, Tiexin and Huo, Jing and Shi, Yinghuan and Wang, Lei and Gao, Yang and Luo, Jiebo},
  journal={arXiv preprint arXiv:2109.04898},
  year={2022}
}

Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].