hiredscorelabs / tamnun-ml

License: other
An easy-to-use open-source library for advanced Deep Learning and Natural Language Processing

Programming Languages

Jupyter Notebook
11667 projects
Python
139335 projects - #7 most used programming language
Makefile
30231 projects

Projects that are alternatives to or similar to tamnun-ml

deep-learning
Projects include the application of transfer learning to build a convolutional neural network (CNN) that identifies the artist of a painting, the building of predictive models for Bitcoin price data using Long Short-Term Memory recurrent neural networks (LSTMs) and a tutorial explaining how to build two types of neural network using as input the…
Stars: ✭ 43 (-60.55%)
Mutual labels:  transfer-learning
LegoBrickClassification
Repository to identify Lego bricks automatically only using images
Stars: ✭ 57 (-47.71%)
Mutual labels:  transfer-learning
super-gradients
Easily train or fine-tune SOTA computer vision models with one open source training library
Stars: ✭ 429 (+293.58%)
Mutual labels:  transfer-learning
favorite-research-papers
Listing my favorite research papers 📝 from different fields as I read them.
Stars: ✭ 12 (-88.99%)
Mutual labels:  transfer-learning
MoeFlow
Repository for anime characters recognition website, powered by TensorFlow
Stars: ✭ 113 (+3.67%)
Mutual labels:  transfer-learning
CPCE-3D
Low-dose CT via Transfer Learning from a 2D Trained Network, in IEEE TMI 2018
Stars: ✭ 40 (-63.3%)
Mutual labels:  transfer-learning
WSDM2022-PTUPCDR
This is the official implementation of our paper Personalized Transfer of User Preferences for Cross-domain Recommendation (PTUPCDR), which has been accepted by WSDM2022.
Stars: ✭ 65 (-40.37%)
Mutual labels:  transfer-learning
AU Recognition
AU_Recognition based on CKPlus/CK database
Stars: ✭ 21 (-80.73%)
Mutual labels:  transfer-learning
paper annotations
A place to keep track of all the annotated papers.
Stars: ✭ 96 (-11.93%)
Mutual labels:  transfer-learning
NaiveNASflux.jl
Your local Flux surgeon
Stars: ✭ 20 (-81.65%)
Mutual labels:  transfer-learning
Keras-MultiClass-Image-Classification
Multiclass image classification using Convolutional Neural Network
Stars: ✭ 48 (-55.96%)
Mutual labels:  transfer-learning
aml-keras-image-recognition
A sample Azure Machine Learning project for Transfer Learning-based custom image recognition by utilizing Keras.
Stars: ✭ 14 (-87.16%)
Mutual labels:  transfer-learning
EntityTargetedActiveLearning
No description or website provided.
Stars: ✭ 17 (-84.4%)
Mutual labels:  transfer-learning
Context-Transformer
Context-Transformer: Tackling Object Confusion for Few-Shot Detection, AAAI 2020
Stars: ✭ 89 (-18.35%)
Mutual labels:  transfer-learning
Deep-Learning-Experiments-implemented-using-Google-Colab
Colab Compatible FastAI notebooks for NLP and Computer Vision Datasets
Stars: ✭ 16 (-85.32%)
Mutual labels:  transfer-learning
AB distillation
Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons (AAAI 2019)
Stars: ✭ 105 (-3.67%)
Mutual labels:  transfer-learning
task-transferability
Data and code for our paper "Exploring and Predicting Transferability across NLP Tasks", to appear at EMNLP 2020.
Stars: ✭ 35 (-67.89%)
Mutual labels:  transfer-learning
TrainCaffeCustomDataset
Transfer learning in Caffe: example on how to train CaffeNet on custom dataset
Stars: ✭ 20 (-81.65%)
Mutual labels:  transfer-learning
DAN
Code release of "Learning Transferable Features with Deep Adaptation Networks" (ICML 2015)
Stars: ✭ 149 (+36.7%)
Mutual labels:  transfer-learning
Open set domain adaptation
Tensorflow Implementation of open set domain adaptation by backpropagation
Stars: ✭ 27 (-75.23%)
Mutual labels:  transfer-learning

Tamnun ML

tamnun is a Python framework for machine learning and deep learning algorithms and methods, especially in the fields of Natural Language Processing and Transfer Learning. The aim of tamnun is to provide easy-to-use interfaces for building powerful models based on the most recent SOTA methods.

For more about tamnun, feel free to read the introduction to TamnunML on Medium.

Getting Started

tamnun depends on several other machine learning and deep learning frameworks, such as PyTorch and Keras. To install tamnun and all of its dependencies, run:

$ git clone https://github.com/hiredscorelabs/tamnun-ml
$ cd tamnun-ml
$ python setup.py install

Or install via PyPI:

$ pip install tamnun

Jump in and try out an example:

$ cd examples
$ python finetune_bert.py

Or take a look at the Jupyter notebooks here.

BERT

BERT stands for Bidirectional Encoder Representations from Transformers, a language model trained by Google and introduced in their paper. Here we use the excellent PyTorch-Pretrained-BERT library and wrap it to provide an easy-to-use scikit-learn interface for BERT fine-tuning. At the moment, the tamnun BERT classifier supports binary and multi-class classification. To fine-tune BERT on a specific task:

from tamnun.bert import BertClassifier, BertVectorizer
from sklearn.pipeline import make_pipeline

# BertVectorizer turns raw texts into BERT inputs; BertClassifier fine-tunes the model on them
clf = make_pipeline(BertVectorizer(), BertClassifier(num_of_classes=2)).fit(train_X, train_y)
predicted = clf.predict(test_X)

Please see this notebook for a full code example.
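
Since the fitted pipeline follows the scikit-learn interface, you can also get the raw class scores instead of hard labels; a minimal sketch, assuming the clf pipeline above has already been fitted (the Distiller section below relies on this same decision_function method):

# Raw (pre-softmax) class scores, one column per class
logits = clf.decision_function(test_X)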

Fitting (almost) any PyTorch Module using just one line

You can use the TorchEstimator object to fit any PyTorch module with just one line:

from torch import nn
from tamnun.core import TorchEstimator

# Wrap any torch.nn.Module; here, a single linear layer mapping 128 features to 2 classes
module = nn.Linear(128, 2)
clf = TorchEstimator(module, task_type='classification').fit(train_X, train_y)

See this file for a full example of fitting an nn.Linear module on the MNIST (handwritten digit classification) dataset.
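
The same one-liner should extend to deeper networks; a minimal sketch, assuming TorchEstimator accepts any nn.Module and exposes the scikit-learn-style fit/predict shown elsewhere in this README (the MLP below is made up for illustration):

from torch import nn
from tamnun.core import TorchEstimator

# A small hypothetical MLP: 128 input features -> 64 hidden units -> 2 classes
mlp = nn.Sequential(
    nn.Linear(128, 64),
    nn.ReLU(),
    nn.Linear(64, 2),
)
clf = TorchEstimator(mlp, task_type='classification').fit(train_X, train_y)
predicted = clf.predict(test_X)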

Distiller Transfer Learning

This module distills a very big model (such as BERT) into a much smaller one. Inspired by this paper.

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from tamnun.bert import BertClassifier, BertVectorizer
from tamnun.transfer import Distiller

# Teacher: a BERT pipeline; student: a simple bag-of-ngrams linear model
bert_clf = make_pipeline(BertVectorizer(do_truncate=True, max_len=3), BertClassifier(num_of_classes=2))
distilled_clf = make_pipeline(CountVectorizer(ngram_range=(1, 3)), LinearRegression())

# The student is trained to mimic the teacher's raw scores, using unlabeled texts as well
distiller = Distiller(teacher_model=bert_clf,
                      teacher_predict_func=bert_clf.decision_function,
                      student_model=distilled_clf).fit(train_texts, train_y, unlabeled_X=unlabeled_texts)

predicted_logits = distiller.transform(test_texts)

For a full BERT distillation example, see this notebook.
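
Note that transform returns the student's raw logits rather than class labels; a minimal sketch of converting them, assuming the logits come back as an (n_samples, n_classes) numpy array:

import numpy as np

# Take the highest-scoring class per sample; for the binary example above this yields 0/1 labels
predicted_classes = np.argmax(predicted_logits, axis=1)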

Support

Getting Help

You can ask questions and join the development discussion on GitHub Issues.

License

Apache License 2.0 (same as TensorFlow)
