
bertinetto / Cfnet

Licence: MIT
[CVPR'17] Training a Correlation Filter end-to-end allows lightweight networks of 2 layers (600 kB) to achieve high performance at high speed.

Programming Languages

matlab

Projects that are alternatives of or similar to Cfnet

Meta-TTS
Official repository of https://arxiv.org/abs/2111.04040v1
Stars: ✭ 69 (-86.09%)
Mutual labels:  meta-learning
Meta-SelfLearning
Meta Self-learning for Multi-Source Domain Adaptation: A Benchmark
Stars: ✭ 157 (-68.35%)
Mutual labels:  meta-learning
Metaoptnet
Meta-Learning with Differentiable Convex Optimization (CVPR 2019 Oral)
Stars: ✭ 412 (-16.94%)
Mutual labels:  meta-learning
maml-rl-tf2
Implementation of Model-Agnostic Meta-Learning (MAML) applied on Reinforcement Learning problems in TensorFlow 2.
Stars: ✭ 16 (-96.77%)
Mutual labels:  meta-learning
CDFSL-ATA
[IJCAI 2021] Cross-Domain Few-Shot Classification via Adversarial Task Augmentation
Stars: ✭ 21 (-95.77%)
Mutual labels:  meta-learning
Meta-SAC
Auto-tune the Entropy Temperature of Soft Actor-Critic via Metagradient - 7th ICML AutoML workshop 2020
Stars: ✭ 19 (-96.17%)
Mutual labels:  meta-learning
Learning2AdaptForStereo
Code for: "Learning To Adapt For Stereo" accepted at CVPR2019
Stars: ✭ 73 (-85.28%)
Mutual labels:  meta-learning
Awesome Papers Fewshot
Collection for Few-shot Learning
Stars: ✭ 466 (-6.05%)
Mutual labels:  meta-learning
dropclass speaker
DropClass and DropAdapt - repository for the paper accepted to Speaker Odyssey 2020
Stars: ✭ 20 (-95.97%)
Mutual labels:  meta-learning
Multitask Learning
Awesome Multitask Learning Resources
Stars: ✭ 361 (-27.22%)
Mutual labels:  meta-learning
Open-L2O
Open-L2O: A Comprehensive and Reproducible Benchmark for Learning to Optimize Algorithms
Stars: ✭ 108 (-78.23%)
Mutual labels:  meta-learning
PAML
Personalizing Dialogue Agents via Meta-Learning
Stars: ✭ 114 (-77.02%)
Mutual labels:  meta-learning
e-osvos
Implementation of "Make One-Shot Video Object Segmentation Efficient Again" and the semi-supervised fine-tuning "e-OSVOS" approach (NeurIPS 2020).
Stars: ✭ 31 (-93.75%)
Mutual labels:  meta-learning
MeTAL
Official PyTorch implementation of "Meta-Learning with Task-Adaptive Loss Function for Few-Shot Learning" (ICCV2021 Oral)
Stars: ✭ 24 (-95.16%)
Mutual labels:  meta-learning
Meta Transfer Learning
TensorFlow and PyTorch implementation of "Meta-Transfer Learning for Few-Shot Learning" (CVPR2019)
Stars: ✭ 439 (-11.49%)
Mutual labels:  meta-learning
maml-tensorflow
This repository implements the paper "Model-Agnostic Meta-Learning for Fast Adaptation of Deep Networks".
Stars: ✭ 17 (-96.57%)
Mutual labels:  meta-learning
resilient-swarm-communications-with-meta-graph-convolutional-networks
Meta graph convolutional neural network-assisted resilient swarm communications
Stars: ✭ 49 (-90.12%)
Mutual labels:  meta-learning
Meta Dataset
A dataset of datasets for learning to learn from few examples
Stars: ✭ 483 (-2.62%)
Mutual labels:  meta-learning
Reinforcement learning tutorial with demo
Reinforcement Learning Tutorial with Demo: DP (Policy and Value Iteration), Monte Carlo, TD Learning (SARSA, Q-Learning), Function Approximation, Policy Gradient, DQN, Imitation Learning, Meta-Learning, Papers, Courses, etc.
Stars: ✭ 442 (-10.89%)
Mutual labels:  meta-learning
Matchingnetworks
This repo provides PyTorch code that replicates the results of the Matching Networks for One Shot Learning paper on the Omniglot and MiniImageNet datasets.
Stars: ✭ 256 (-48.39%)
Mutual labels:  meta-learning

End-to-end representation learning for Correlation Filter based tracking

[Figure: pipeline]


Project page: http://www.robots.ox.ac.uk/~luca/cfnet.html


WARNING: we used Matlab 2015, MatConvNet v1.0beta24, CUDA 8.0 and cuDNN 5.1. Other configurations might work, but this is not guaranteed. In particular, we received several reports of problems with Matlab 2017.
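
The README does not prescribe a MatConvNet build command, but if you need to (re)compile MatConvNet against your own CUDA/cuDNN installation, MatConvNet's standard GPU build step looks like the sketch below. The installation paths are placeholders for your local setup.

  % Sketch: building MatConvNet with GPU and cuDNN support (paths are placeholders).
  % Run from the MatConvNet root directory inside MATLAB.
  addpath matlab
  vl_compilenn('enableGpu', true, ...
               'cudaRoot', '/usr/local/cuda-8.0', ...    % your CUDA installation
               'enableCudnn', true, ...
               'cudnnRoot', '/usr/local/cudnn-5.1');     % your cuDNN installation
  vl_setupnn;              % add MatConvNet to the MATLAB path
  vl_testnn('gpu', true)   % optional sanity check of the GPU build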


Getting started

[ Tracking only ] If you don't care about training, you can simply use one of our pretrained networks with our basic tracker.

  1. Prerequisites: GPU, CUDA (we used 7.5), cuDNN (we used v5.1), Matlab, MatConvNet.
  2. Clone the repository.
  3. Download the pretrained networks from here and unzip the archive in cfnet/pretrained.
  4. Go to cfnet/src/tracking/ and remove the trailing .example from env_paths_tracking.m.example and startup.m.example, editing the files as appropriate (see the sketch after this list).
  5. Be sure to have at least one video sequence in the appropriate format. The easiest thing to do is to download the validation set (from here) that we used for the tracking evaluation and then extract the validation folder in cfnet/data/.
  6. Start from one of the cfnet/src/tracking/run_*_evaluation.m entry points.
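
As a concrete illustration of steps 4 and 6, the MATLAB session below renames the template files and prepares a run. The file names come from the repository, but the fields you edit inside the templates and the exact run_*_evaluation.m script to launch depend on what ships with the code.

  % Illustrative MATLAB session for steps 4 and 6.
  cd cfnet/src/tracking
  movefile('env_paths_tracking.m.example', 'env_paths_tracking.m');
  movefile('startup.m.example', 'startup.m');
  edit env_paths_tracking.m   % point the paths at cfnet/pretrained and cfnet/data
  startup                     % sets up MatConvNet and the project paths
  % ...then call one of the run_*_evaluation.m entry points from this folder.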

[ Training and tracking ] Start here if you prefer to do it yourself and train your own networks.

  1. Prerequisites: GPU, CUDA (we used 7.5), cuDNN (we used v5.1), Matlab, MatConvNet.
  2. Clone the repository.
  3. Follow these step-by-step instructions, which will help you generate a curated dataset compatible with the rest of the code.
  4. If you did not generate your own metadata, download imdb_video_2016-10.mat (6.7 GB), which contains all the metadata, together with the dataset stats. Put them in cfnet/data/.
  5. Go to cfnet/src/training and remove the trailing .example from env_paths_training.m.example and startup.m.example, editing the files as appropriate.
  6. The various cfnet/train/run_experiment_*.m scripts are examples of how to start training. Default hyper-parameters are set at the start of experiment.m and are overridden by the custom values specified in run_experiment_*.m (see the sketch after this list).
  7. By default, training plots are saved in cfnet/src/training/data/. When you are happy, grab a network snapshot (net-epoch-X.mat) and save it somewhere (e.g. cfnet/pretrained/).
  8. Go to point 4 of Tracking only, follow the instructions, and enjoy the labour of your own GPUs!
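
To make step 6 more concrete, here is a minimal sketch of what a custom run_experiment_*.m override might look like. The option names and the call pattern are illustrative assumptions; the authoritative defaults and calling convention are in experiment.m and the shipped run_experiment_*.m files.

  % Hypothetical run_experiment_custom.m: overriding a few defaults from experiment.m.
  % The option names below are assumptions -- check the top of experiment.m for the real ones.
  opts = struct();
  opts.expDir = 'data/my_experiment';   % where plots and snapshots (net-epoch-X.mat) are written
  opts.train.gpus = 1;                  % GPU index to train on
  opts.train.numEpochs = 50;            % example: shorter schedule than the default
  % ...then pass these overrides to experiment.m, following the pattern of the
  % shipped run_experiment_*.m scripts.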