stephencwelch / Learningtosee

Projects that are alternatives to or similar to Learningtosee

Cnnvis Pytorch
Visualization of CNNs in PyTorch
Stars: ✭ 154 (+0%)
Mutual labels:  jupyter-notebook
Jupyter Vim Binding
Jupyter meets Vim. Vimmers will fall in love.
Stars: ✭ 1,965 (+1175.97%)
Mutual labels:  jupyter-notebook
Surgery Robot Detection Segmentation
Object detection and segmentation for a surgery robot using Mask R-CNN on Python 3, Keras, and TensorFlow.
Stars: ✭ 155 (+0.65%)
Mutual labels:  jupyter-notebook
Stock Market Prediction Challenge
This repo contains a solution to stock market prediction using neural networks and sentiment analysis.
Stars: ✭ 154 (+0%)
Mutual labels:  jupyter-notebook
Deepreinforcementlearning
A replica of the AlphaZero methodology for deep reinforcement learning in Python
Stars: ✭ 1,898 (+1132.47%)
Mutual labels:  jupyter-notebook
Your First Kaggle Submission
How to perform an exploratory data analysis on the Kaggle Titanic dataset and make a submission to the leaderboard.
Stars: ✭ 155 (+0.65%)
Mutual labels:  jupyter-notebook
Deep Viz Keras
Implementations of some popular saliency map techniques in Keras
Stars: ✭ 154 (+0%)
Mutual labels:  jupyter-notebook
Fcn For Semantic Segmentation
Implementation of FCN-8 and FCN-16 in Keras, using CRF as post-processing
Stars: ✭ 155 (+0.65%)
Mutual labels:  jupyter-notebook
Pyportfolioopt
Financial portfolio optimisation in Python, including the classical efficient frontier, Black-Litterman, and Hierarchical Risk Parity
Stars: ✭ 2,502 (+1524.68%)
Mutual labels:  jupyter-notebook
Spiking Neural Network Snn With Pytorch Where Backpropagation Engenders Stdp
What about coding a spiking neural network using an automatic differentiation framework? In SNNs there is a time axis: the network sees data throughout time, activations are replaced by spikes fired once a pre-activation threshold is crossed, and pre-activation values constantly fade if neurons aren't excited enough (see the sketch after this list).
Stars: ✭ 155 (+0.65%)
Mutual labels:  jupyter-notebook
Jupyter Server Proxy
Jupyter notebook server extension to proxy web services.
Stars: ✭ 153 (-0.65%)
Mutual labels:  jupyter-notebook
Neural Style Transfer
Implementation of Neural Style Transfer from the paper "A Neural Algorithm of Artistic Style" (http://arxiv.org/abs/1508.06576) in Keras 2.0+
Stars: ✭ 2,000 (+1198.7%)
Mutual labels:  jupyter-notebook
Stocks
Programs for stock prediction and evaluation
Stars: ✭ 155 (+0.65%)
Mutual labels:  jupyter-notebook
Matplotlib Label Lines
Label lines using matplotlib.
Stars: ✭ 154 (+0%)
Mutual labels:  jupyter-notebook
Mgwr
Multiscale Geographically Weighted Regression (MGWR)
Stars: ✭ 155 (+0.65%)
Mutual labels:  jupyter-notebook
Tensorflow Multi Dimensional Lstm
Multi-dimensional LSTM as described in Alex Graves' paper: https://arxiv.org/pdf/0705.2011.pdf
Stars: ✭ 154 (+0%)
Mutual labels:  jupyter-notebook
Davsod
Shifting More Attention to Video Salient Object Detection, CVPR 2019 (Best Paper finalist & Oral)
Stars: ✭ 155 (+0.65%)
Mutual labels:  jupyter-notebook
Tencent social ads2017 mobile app pcvr
Rank 20 solution for the Tencent Social Ads 2017 contest
Stars: ✭ 155 (+0.65%)
Mutual labels:  jupyter-notebook
Copulas
A library to model multivariate data using copulas.
Stars: ✭ 149 (-3.25%)
Mutual labels:  jupyter-notebook
Pytorchmedicalai
This is the hands-on deep learning tutorial series for the 2018/2019 Medical AI course by DeepOncology AI.
Stars: ✭ 155 (+0.65%)
Mutual labels:  jupyter-notebook
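
The SNN entry above describes leaky integrate-and-fire dynamics: data arrives along a time axis, a neuron fires a spike once its pre-activation crosses a threshold, and unexcited pre-activations fade away. As a rough, self-contained sketch of that idea in PyTorch (not the project's actual code; lif_step, decay, and threshold are illustrative names):

import torch

def lif_step(x, potential, weight, decay=0.9, threshold=1.0):
    """One time step of a leaky integrate-and-fire layer.

    x         : input at this time step, shape (batch, in_features)
    potential : membrane pre-activation carried across time, (batch, out_features)
    weight    : linear weights, (out_features, in_features)
    """
    # Pre-activation decays each step, so unexcited neurons fade toward zero.
    potential = decay * potential + x @ weight.t()
    # The "activation" is a spike: 1 where the threshold is crossed, else 0.
    spikes = (potential >= threshold).float()
    # Firing resets the membrane potential (soft reset by subtraction).
    potential = potential - spikes * threshold
    return spikes, potential

# Usage: iterate over the time axis, feeding the network data throughout time.
batch, in_f, out_f, steps = 4, 8, 3, 20
w = torch.randn(out_f, in_f) * 0.5
v = torch.zeros(batch, out_f)
for t in range(steps):
    s, v = lif_step(torch.rand(batch, in_f), v, w)

The soft reset (subtracting the threshold rather than zeroing the potential) is one common design choice; the linked project builds on dynamics like these to relate backpropagation to STDP.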

Learning To See

This repository contains companion code for the YouTube series Learning to See. Most of the code for the series was developed in Jupyter notebooks using Anaconda Python 2.7. The repository includes the majority of the key code shown in the series, plus some of the code used to create the animations. I'm happy to share more code upon request.

Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].