
volotat / Diffmorph

License: MIT
Image morphing without reference points by applying warp maps and optimizing over them.

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to Diffmorph

sgd
An R package for large scale estimation with stochastic gradient descent
Stars: ✭ 55 (-77.91%)
Mutual labels:  gradient-descent
Learned-Turbo-type-Affine-Rank-Minimization
Code for Learned Turbo-type Affine Rank Minimization
Stars: ✭ 4 (-98.39%)
Mutual labels:  gradient-descent
descent
First-order optimization tools
Stars: ✭ 23 (-90.76%)
Mutual labels:  gradient-descent
ReinforcementLearning Sutton-Barto Solutions
Solutions and figures for problems from Reinforcement Learning: An Introduction by Sutton & Barto
Stars: ✭ 20 (-91.97%)
Mutual labels:  gradient-descent
ML-MCU
Code for the IoT Journal paper titled 'ML-MCU: A Framework to Train ML Classifiers on MCU-based IoT Edge Devices'
Stars: ✭ 28 (-88.76%)
Mutual labels:  gradient-descent
models-by-example
By-hand code for models and algorithms. An update to the 'Miscellaneous-R-Code' repo.
Stars: ✭ 43 (-82.73%)
Mutual labels:  gradient-descent
flatiron-school-data-science-curriculum-resources
Lesson material on data science and machine learning topics/concepts
Stars: ✭ 118 (-52.61%)
Mutual labels:  gradient-descent
Minimalistic-Multiple-Layer-Neural-Network-from-Scratch-in-Python
Minimalistic Multiple Layer Neural Network from Scratch in Python.
Stars: ✭ 24 (-90.36%)
Mutual labels:  gradient-descent
fmin adam
Matlab implementation of the Adam stochastic gradient descent optimisation algorithm
Stars: ✭ 38 (-84.74%)
Mutual labels:  gradient-descent
ML-Optimizers-JAX
Toy implementations of some popular ML optimizers using Python/JAX
Stars: ✭ 37 (-85.14%)
Mutual labels:  gradient-descent
sopt
sopt: A simple Python optimization library
Stars: ✭ 42 (-83.13%)
Mutual labels:  gradient-descent
pydata-london-2018
Slides and notebooks for my tutorial at PyData London 2018
Stars: ✭ 22 (-91.16%)
Mutual labels:  gradient-descent
Machine-Learning-in-Python-Workshop
My workshop on machine learning using python language to implement different algorithms
Stars: ✭ 89 (-64.26%)
Mutual labels:  gradient-descent
Regression
Multiple Regression Package for PHP
Stars: ✭ 88 (-64.66%)
Mutual labels:  gradient-descent
interactive-simple-linear-regression
A PureScript, browser-based implementation of simple linear regression.
Stars: ✭ 15 (-93.98%)
Mutual labels:  gradient-descent
machine learning course
Artificial intelligence/machine learning course at UCF in Spring 2020 (Fall 2019 and Spring 2019)
Stars: ✭ 47 (-81.12%)
Mutual labels:  gradient-descent
Deep-Learning-Coursera
Projects from the Deep Learning Specialization from deeplearning.ai provided by Coursera
Stars: ✭ 123 (-50.6%)
Mutual labels:  gradient-descent
least-squares-cpp
A single header-only C++ library for least squares fitting.
Stars: ✭ 46 (-81.53%)
Mutual labels:  gradient-descent
Text-Analysis
Explaining textual analysis tools in Python. Including Preprocessing, Skip Gram (word2vec), and Topic Modelling.
Stars: ✭ 48 (-80.72%)
Mutual labels:  gradient-descent
Image-Classifier
Final Project of the Udacity AI Programming with Python Nanodegree
Stars: ✭ 63 (-74.7%)
Mutual labels:  gradient-descent

Differentiable Morphing

Image morphing without reference points by applying warp maps and optimizing over them.

Differentiable Morphing is a machine learning algorithm that can morph any two images without reference points. It is called "differentiable morphing" because the neural network here is not used in the traditional data-to-label mapping sense, but as an easy way to solve an optimization problem in which one image is mapped to another via warp maps found by gradient descent. Once the maps are found, there is no need for the network itself.

Results

[Example results: three morphing animations]

Dependencies

TensorFlow 2.1.3 or above.

Usage

Install proper dependencies:

pip install -r requirements.txt

Use the program:

python morph.py -s images/img_1.jpg -t images/img_2.jpg

-s Source file
-t Target file

Optional parameters:
-e Number of epochs to train the maps during the training stage
-a Addition map multiplier
-m Multiplication map multiplier
-w Warp map multiplier
-add_first If true, the add map is applied to the source image before the mult map (might work better in some cases)
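
For example, a run that overrides the optional parameters might look like this (the values below are purely illustrative, not the project's defaults):

python morph.py -s images/img_1.jpg -t images/img_2.jpg -e 100 -a 1.0 -m 1.0 -w 1.0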

Idea

Suppose we want to produce one image from another while reusing as much information as possible: if the two given images share any similarities, we should make use of them.

[Figure: toy example]

After several trials I found that the best way to achieve this effect is the following formula.

out = W(src * mult_map + add_map, warp_map)

Here "Mult map" removes unnecessary parts of an image and shifts color balance, "Add map" creates new colors that are not present in original image and "Warp map" distort an image in some way to reproduce shifting, rotation and scaling of objects. W operation is dense_image_warp method that present in tensorflow and usually used for optical flow estimation tasks.

All maps are found by gradient descent using a very simple convolutional network (a rough sketch follows below). By applying an alpha scaling parameter to every map, we get a smooth transition from one image to another without any loss of useful data (at least for the given toy example).
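
As a rough illustration of that optimization, the training loop could look like the sketch below: a tiny conv net predicts the three maps from the source, and its weights are fit by gradient descent so that the mapped source matches the target. All names and layer sizes here are assumptions, not the project's actual architecture:

    import tensorflow as tf
    import tensorflow_addons as tfa

    # Tiny conv net predicting 3 mult + 3 add + 2 warp channels from the source.
    net = tf.keras.Sequential([
        tf.keras.layers.Conv2D(16, 3, padding="same", activation="relu"),
        tf.keras.layers.Conv2D(8, 3, padding="same"),
    ])
    optimizer = tf.keras.optimizers.Adam(1e-3)

    @tf.function
    def train_step(src, tgt):
        with tf.GradientTape() as tape:
            mult_map, add_map, warp_map = tf.split(net(src), [3, 3, 2], axis=-1)
            out = tfa.image.dense_image_warp(src * mult_map + add_map, warp_map)
            loss = tf.reduce_mean(tf.square(out - tgt))  # pixel-wise MSE to target
        grads = tape.gradient(loss, net.trainable_variables)
        optimizer.apply_gradients(zip(grads, net.trainable_variables))
        return loss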

[Animation: smooth transition between the two images]
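
A minimal sketch of how such alpha scaling could produce the intermediate frames, reusing apply_maps from the sketch above; blending the mult map toward 1 and the add and warp maps toward 0 (their identity values) is my reading of the idea, not the project's exact scheme:

    def morph_frames(src, mult_map, add_map, warp_map, n_frames=16):
        # alpha = 0 reproduces the source; alpha = 1 applies the full mapping.
        frames = []
        for i in range(n_frames):
            alpha = i / (n_frames - 1)
            frames.append(apply_maps(
                src,
                1.0 + alpha * (mult_map - 1.0),  # mult map blended toward identity (1)
                alpha * add_map,                 # add map blended toward 0
                alpha * warp_map,                # warp map blended toward 0
            ))
        return frames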

Thoughts

Notice that all of the produced maps generate a somewhat meaningful interpolation without any understanding of what exactly is present in the images. That suggests the warp operation might be very useful in image processing tasks. In some sense the warp operation can be thought of as a long-range convolution, because it can "grab" data from any point of an image and reshape it in some useful way. It might therefore be beneficial in classification tasks, where it might make networks less susceptible to small perturbations of the data. But it should be especially beneficial in generation tasks: it should be much easier to produce new data by combining and perturbing several examples of known data points than to learn a function that represents all data points at once.
