
AmazaspShumik / mtlearn

Licence: other
Multi-Task Learning package built with TensorFlow 2 (Multi-Gate Mixture of Experts, Cross-Stitch, Uncertainty Weighting)

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives to or similar to mtlearn

EasyRec
A framework for large scale recommendation algorithms.
Stars: ✭ 599 (+1231.11%)
Mutual labels:  multi-task-learning
scrabble-gan
Adversarial Generation of Handwritten Text Images
Stars: ✭ 49 (+8.89%)
Mutual labels:  tensorflow2
NARUTO-HandSignDetection
A model and sample program that detects NARUTO hand signs (the Rat through Boar signs, Mizunoe, and the clasped-hands seal) using object detection. This repository uses YOLOX.
Stars: ✭ 186 (+313.33%)
Mutual labels:  tensorflow2
text classifier
A text classification project built on TensorFlow 2.3, supporting a variety of classification models and related tricks.
Stars: ✭ 135 (+200%)
Mutual labels:  tensorflow2
groove2groove
Code for "Groove2Groove: One-Shot Music Style Transfer with Supervision from Synthetic Data"
Stars: ✭ 88 (+95.56%)
Mutual labels:  papers-with-code
Recurrent Interaction Network EMNLP2020
Code for the paper "Recurrent Interaction Network for Jointly Extracting Entities and Classifying Relations", accepted at EMNLP 2020.
Stars: ✭ 13 (-71.11%)
Mutual labels:  multitask-learning
amazon-sagemaker-mlops-workshop
MLOps workshop with Amazon SageMaker
Stars: ✭ 39 (-13.33%)
Mutual labels:  tensorflow2
tutel
Tutel MoE: An Optimized Mixture-of-Experts Implementation
Stars: ✭ 183 (+306.67%)
Mutual labels:  mixture-of-experts
ttt
A package for fine-tuning Transformers with TPUs, written in Tensorflow2.0+
Stars: ✭ 35 (-22.22%)
Mutual labels:  tensorflow2
CARLA
CARLA: A Python Library to Benchmark Algorithmic Recourse and Counterfactual Explanation Algorithms
Stars: ✭ 166 (+268.89%)
Mutual labels:  tensorflow2
torchMTL
A lightweight module for Multi-Task Learning in pytorch.
Stars: ✭ 84 (+86.67%)
Mutual labels:  multi-task-learning
Spectrum
Spectrum is an AI that uses machine learning to generate Rap song lyrics
Stars: ✭ 37 (-17.78%)
Mutual labels:  tensorflow2
TrackNet-Badminton-Tracking-tensorflow2
TrackNet for badminton tracking using tensorflow2
Stars: ✭ 37 (-17.78%)
Mutual labels:  tensorflow2
Deep-Learning
This repo provides deep-learning projects, mainly using TensorFlow 2.0
Stars: ✭ 22 (-51.11%)
Mutual labels:  tensorflow2
MNIST-multitask
6️⃣6️⃣6️⃣ Reproduce the ICLR '18 under-review paper "MULTI-TASK LEARNING ON MNIST IMAGE DATASETS"
Stars: ✭ 34 (-24.44%)
Mutual labels:  multi-task-learning
DeepSegmentor
A Pytorch implementation of DeepCrack and RoadNet projects.
Stars: ✭ 152 (+237.78%)
Mutual labels:  multi-task-learning
LIGHT-SERNET
Light-SERNet: A lightweight fully convolutional neural network for speech emotion recognition
Stars: ✭ 20 (-55.56%)
Mutual labels:  tensorflow2
PCC-Net
PCC Net: Perspective Crowd Counting via Spatial Convolutional Network
Stars: ✭ 63 (+40%)
Mutual labels:  multi-task-learning
tf-blazepose
BlazePose - Super fast human pose detection on Tensorflow 2.x
Stars: ✭ 139 (+208.89%)
Mutual labels:  tensorflow2
CS330-Stanford-Deep-Multi-Task-and-Meta-Learning
My notes and assignment solutions for Stanford CS330 (Fall 2019 & 2020) Deep Multi-Task and Meta Learning
Stars: ✭ 34 (-24.44%)
Mutual labels:  multitask-learning

mtlearn

The idea behind this repository is to build a small package containing the building blocks of many modern Multi-Task Learning algorithms, and then to use it to reproduce papers. This is still a work in progress.
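
One such building block is uncertainty weighting (Kendall, Gal & Cipolla, CVPR 2018), which balances task losses with learned homoscedastic-uncertainty terms. Below is a minimal TensorFlow 2 sketch of a common simplification of that loss; it is illustrative only, and the class name and signature are hypothetical rather than mtlearn's actual API.

import tensorflow as tf

class UncertaintyWeightedLoss(tf.keras.layers.Layer):
    # Combines per-task losses as sum_i exp(-s_i) * L_i + s_i, where
    # s_i = log(sigma_i^2) is a trainable log-variance for task i.
    def __init__(self, n_tasks, **kwargs):
        super().__init__(**kwargs)
        self.log_vars = self.add_weight(
            name="log_vars", shape=(n_tasks,),
            initializer="zeros", trainable=True)

    def call(self, task_losses):
        # task_losses: tensor of shape (n_tasks,) holding scalar losses
        precisions = tf.exp(-self.log_vars)  # exp(-s_i) = 1 / sigma_i^2
        return tf.reduce_sum(precisions * task_losses + self.log_vars)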

Installing & Upgrading the Package

pip install https://github.com/AmazaspShumik/mtlearn/archive/master.zip
pip install --upgrade https://github.com/AmazaspShumik/mtlearn/archive/master.zip
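
After installing, a quick import check verifies that the package is available (this assumes the top-level module is named mtlearn, matching the repository name):

python -c "import mtlearn; print('ok')"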

Papers Reproduced:

Modeling Task Relationships in Multi-task Learning with Multi-gate Mixture-of-Experts by Jiaqi Ma et al. (KDD 2018)
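
For reference, MMoE shares a pool of expert networks across tasks and gives each task its own softmax gate that mixes the experts' outputs before the task-specific towers. The following TensorFlow 2 sketch captures the core computation; the layer name and signature are hypothetical and do not reflect mtlearn's implementation.

import tensorflow as tf

class MMoE(tf.keras.layers.Layer):
    # Multi-gate Mixture-of-Experts (Ma et al., KDD 2018): each task
    # mixes a shared pool of experts via its own softmax gate.
    def __init__(self, n_experts, n_tasks, expert_units, **kwargs):
        super().__init__(**kwargs)
        self.experts = [tf.keras.layers.Dense(expert_units, activation="relu")
                        for _ in range(n_experts)]
        self.gates = [tf.keras.layers.Dense(n_experts, activation="softmax")
                      for _ in range(n_tasks)]

    def call(self, x):
        # Stack expert outputs: (batch, n_experts, expert_units)
        expert_out = tf.stack([e(x) for e in self.experts], axis=1)
        outputs = []
        for gate in self.gates:
            w = tf.expand_dims(gate(x), -1)                   # (batch, n_experts, 1)
            outputs.append(tf.reduce_sum(w * expert_out, 1))  # (batch, expert_units)
        return outputs  # one mixed representation per task, fed to task towers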

Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].