
v-iashin / CS231n

Licence: other
PyTorch/TensorFlow solutions for Stanford's CS231n: "CNNs for Visual Recognition"

Programming Languages

Jupyter Notebook
11667 projects

Projects that are alternatives of or similar to CS231n

CS231n
My solutions for Assignments of CS231n: Convolutional Neural Networks for Visual Recognition
Stars: ✭ 30 (-36.17%)
Mutual labels:  numpy, lstm, style-transfer
Deep Learning With Python
Example projects I completed to understand Deep Learning techniques with TensorFlow. Please note that I no longer maintain this repository.
Stars: ✭ 134 (+185.11%)
Mutual labels:  lstm, generative-adversarial-network, style-transfer
Learning-Lab-C-Library
This library provides a set of basic functions for different types of deep learning (and other) algorithms in C. This deep learning library will be constantly updated.
Stars: ✭ 20 (-57.45%)
Mutual labels:  lstm, backpropagation, adam-optimizer
Stock Price Predictor
This project seeks to utilize a deep learning model, the Long Short-Term Memory (LSTM) neural network, to predict stock prices.
Stars: ✭ 146 (+210.64%)
Mutual labels:  numpy, recurrent-neural-networks, lstm
Introduction-to-Deep-Learning-and-Neural-Networks-Course
Code snippets and solutions for the Introduction to Deep Learning and Neural Networks Course hosted in educative.io
Stars: ✭ 33 (-29.79%)
Mutual labels:  recurrent-neural-networks, generative-adversarial-network
gans-2.0
Generative Adversarial Networks in TensorFlow 2.0
Stars: ✭ 76 (+61.7%)
Mutual labels:  generative-adversarial-network, style-transfer
LSTM-Time-Series-Analysis
Using LSTM network for time series forecasting
Stars: ✭ 41 (-12.77%)
Mutual labels:  recurrent-neural-networks, lstm
favorite-research-papers
Listing my favorite research papers 📝 from different fields as I read them.
Stars: ✭ 12 (-74.47%)
Mutual labels:  generative-adversarial-network, style-transfer
Rnn lstm from scratch
How to build RNNs and LSTMs from scratch with NumPy.
Stars: ✭ 156 (+231.91%)
Mutual labels:  numpy, recurrent-neural-networks
Coursera-Machine-Learning-Andrew-NG
This is a repository of my assignments from the Coursera Machine Learning course by Stanford's Andrew Ng.
Stars: ✭ 34 (-27.66%)
Mutual labels:  solutions, assignment
Wasserstein2GenerativeNetworks
PyTorch implementation of "Wasserstein-2 Generative Networks" (ICLR 2021)
Stars: ✭ 38 (-19.15%)
Mutual labels:  generative-adversarial-network, style-transfer
deep-blueberry
If you've always wanted to learn about deep-learning but don't know where to start, then you might have stumbled upon the right place!
Stars: ✭ 17 (-63.83%)
Mutual labels:  recurrent-neural-networks, generative-adversarial-network
NLP-Specialization
NLP Specialization (Natural Language Processing) made by deeplearning.ai
Stars: ✭ 44 (-6.38%)
Mutual labels:  solutions, assignment
Music-Style-Transfer
Source code for "Transferring the Style of Homophonic Music Using Recurrent Neural Networks and Autoregressive Model"
Stars: ✭ 16 (-65.96%)
Mutual labels:  recurrent-neural-networks, style-transfer
Neural Network From Scratch
Ever wondered how to code your Neural Network using NumPy, with no frameworks involved?
Stars: ✭ 230 (+389.36%)
Mutual labels:  numpy, backpropagation
bitcoin-prediction
bitcoin prediction algorithms
Stars: ✭ 21 (-55.32%)
Mutual labels:  recurrent-neural-networks, svm-classifier
SpeakerDiarization RNN CNN LSTM
Speaker diarization is the problem of separating speakers in an audio recording. There could be any number of speakers, and the final result should state when each speaker starts and ends. In this project, we analyze a given audio file with 2 channels and 2 speakers (on separate channels).
Stars: ✭ 56 (+19.15%)
Mutual labels:  recurrent-neural-networks, lstm
keras-malicious-url-detector
Malicious URL detector using keras recurrent networks and scikit-learn classifiers
Stars: ✭ 24 (-48.94%)
Mutual labels:  recurrent-neural-networks, lstm
sequence-rnn-py
Sequence analysis using Recurrent Neural Networks (RNN) based on Keras
Stars: ✭ 28 (-40.43%)
Mutual labels:  recurrent-neural-networks, lstm
Teaching Monolith
Data science teaching materials
Stars: ✭ 126 (+168.09%)
Mutual labels:  numpy, backpropagation

CS231n

CS231n: "Convolutional Neural Networks for Visual Recognition"

My solutions to the assignments of the state-of-the-art course CS231n: "Convolutional Neural Networks for Visual Recognition". It was hard, but it is cool.

Framework

During the course, there was a choice between two frameworks: TensorFlow and PyTorch. I originally decided to follow the TensorFlow track, so at first no PyTorch solutions were provided. Now, the solutions are provided for both frameworks.

Content of the Assignments (Spring 2017)

There were three assignments during the Spring 2017 version of the course. All of them are completed.

  1. [Assignment #1]
  • understand the basic Image Classification pipeline and the data-driven approach (train/predict stages)
  • understand the train/val/test splits and the use of validation data for hyperparameter tuning.
  • develop proficiency in writing efficient vectorized code with NumPy (see the Softmax sketch after this list)
  • implement and apply a k-Nearest Neighbor (kNN) classifier
  • implement and apply a Multiclass Support Vector Machine (SVM) classifier
  • implement and apply a Softmax classifier
  • implement and apply a Two layer neural network classifier
  • understand the differences and tradeoffs between these classifiers
  • get a basic understanding of performance improvements from using higher-level representations than raw pixels (e.g. color histograms, Histogram of Gradient (HOG) features)
  2. [Assignment #2]
  • understand Neural Networks and how they are arranged in layered architectures
  • understand and be able to implement (vectorized) backpropagation
  • implement various update rules used to optimize Neural Networks (an Adam-style sketch follows the list)
  • implement batch normalization for training deep networks
  • implement dropout to regularize networks
  • effectively cross-validate and find the best hyperparameters for Neural Network architecture
  • understand the architecture of Convolutional Neural Networks and gain experience with training these models on data
  3. [Assignment #3]
  • understand the architecture of recurrent neural networks (RNNs) and how they operate on sequences by sharing weights over time
  • understand and implement both vanilla RNNs and Long Short-Term Memory (LSTM) RNNs (a single-step vanilla RNN sketch follows the list)
  • understand how to sample from an RNN language model at test-time
  • understand how to combine convolutional neural nets and recurrent nets to implement an image captioning system
  • understand how a trained convolutional network can be used to compute gradients with respect to the input image
  • implement different applications of image gradients, including saliency maps, fooling images, and class visualizations
  • understand and implement style transfer
  • understand how to train and implement a generative adversarial network (GAN) to produce images that look like a dataset
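
As a taste of the Assignment #1 material, here is a minimal, fully vectorized NumPy sketch of a Softmax classifier's loss and gradient. The function name, argument shapes, and regularization convention are illustrative assumptions, not necessarily the assignment's exact starter-code API.

```python
import numpy as np

def softmax_loss_vectorized(W, X, y, reg):
    """Fully vectorized Softmax loss and gradient (illustrative sketch).

    W: (D, C) weights, X: (N, D) data, y: (N,) integer labels, reg: L2 strength.
    """
    N = X.shape[0]
    scores = X.dot(W)                            # (N, C) class scores
    scores -= scores.max(axis=1, keepdims=True)  # shift for numerical stability
    exp_scores = np.exp(scores)
    probs = exp_scores / exp_scores.sum(axis=1, keepdims=True)

    # Cross-entropy loss averaged over the batch, plus L2 regularization.
    loss = -np.log(probs[np.arange(N), y]).mean() + reg * np.sum(W * W)

    # Gradient: dL/dscores = (probs - one_hot(y)) / N, then chain through X.
    dscores = probs.copy()
    dscores[np.arange(N), y] -= 1.0
    dscores /= N
    dW = X.T.dot(dscores) + 2 * reg * W
    return loss, dW
```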
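
For the Assignment #2 update rules, the sketch below shows a single Adam step applied to one parameter array and its gradient. The config-dictionary layout (keys such as learning_rate, beta1, m, v, t) is an assumption chosen to mirror common conventions, not the assignment's exact interface.

```python
import numpy as np

def adam_update(w, dw, config):
    """One Adam step for parameter w given gradient dw (illustrative sketch)."""
    config.setdefault('learning_rate', 1e-3)
    config.setdefault('beta1', 0.9)
    config.setdefault('beta2', 0.999)
    config.setdefault('epsilon', 1e-8)
    config.setdefault('m', np.zeros_like(w))
    config.setdefault('v', np.zeros_like(w))
    config.setdefault('t', 0)

    config['t'] += 1
    # Exponential moving averages of the gradient and its elementwise square.
    config['m'] = config['beta1'] * config['m'] + (1 - config['beta1']) * dw
    config['v'] = config['beta2'] * config['v'] + (1 - config['beta2']) * dw ** 2
    # Bias-corrected first and second moment estimates.
    m_hat = config['m'] / (1 - config['beta1'] ** config['t'])
    v_hat = config['v'] / (1 - config['beta2'] ** config['t'])
    next_w = w - config['learning_rate'] * m_hat / (np.sqrt(v_hat) + config['epsilon'])
    return next_w, config
```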
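
For the Assignment #3 recurrent networks, here is a sketch of a single vanilla RNN time step, forward and backward, in plain NumPy. Variable names and shapes follow common CS231n-style conventions but are used here only as an illustration of weight sharing over time.

```python
import numpy as np

def rnn_step_forward(x, prev_h, Wx, Wh, b):
    """Single vanilla RNN step: h_t = tanh(x_t Wx + h_{t-1} Wh + b).

    x: (N, D) inputs, prev_h: (N, H) previous hidden state,
    Wx: (D, H), Wh: (H, H), b: (H,).
    """
    next_h = np.tanh(x.dot(Wx) + prev_h.dot(Wh) + b)
    cache = (x, prev_h, Wx, Wh, next_h)
    return next_h, cache

def rnn_step_backward(dnext_h, cache):
    """Backward pass for one step, using d/dz tanh(z) = 1 - tanh(z)^2."""
    x, prev_h, Wx, Wh, next_h = cache
    dz = dnext_h * (1 - next_h ** 2)
    dx = dz.dot(Wx.T)
    dprev_h = dz.dot(Wh.T)
    dWx = x.T.dot(dz)
    dWh = prev_h.T.dot(dz)
    db = dz.sum(axis=0)
    return dx, dprev_h, dWx, dWh, db
```

The same two functions, applied repeatedly over the time dimension with shared Wx and Wh, give the full forward and backward passes over a sequence.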