aymanibrahim / dltf

License: MIT
Hands-on in-person workshop for Deep Learning with TensorFlow

Projects that are alternatives of or similar to dltf

Ad examples
A collection of anomaly detection methods (iid/point-based, graph, and time series), including active learning for anomaly detection/discovery, Bayesian rule mining, and descriptions for diversity/explanation/interpretability. Analysis of incorporating label feedback with ensemble and tree-based detectors. Includes adversarial attacks with Graph Convolutional Networks.
Stars: ✭ 641 (+4478.57%)
Mutual labels:  lstm, rnn, autoencoder
Natural Language Processing With Tensorflow
Natural Language Processing with TensorFlow, published by Packt
Stars: ✭ 222 (+1485.71%)
Mutual labels:  lstm, rnn
Rnn ctc
Recurrent Neural Network and Long Short Term Memory (LSTM) with Connectionist Temporal Classification implemented in Theano. Includes a Toy training example.
Stars: ✭ 220 (+1471.43%)
Mutual labels:  lstm, rnn
Caption generator
A modular library built on top of Keras and TensorFlow to generate a caption in natural language for any input image.
Stars: ✭ 243 (+1635.71%)
Mutual labels:  lstm, rnn
Haste
Haste: a fast, simple, and open RNN library
Stars: ✭ 214 (+1428.57%)
Mutual labels:  lstm, rnn
Sign Language Gesture Recognition
Sign Language Gesture Recognition From Video Sequences Using RNN And CNN
Stars: ✭ 214 (+1428.57%)
Mutual labels:  lstm, rnn
Pytorch Sentiment Analysis
Tutorials on getting started with PyTorch and TorchText for sentiment analysis.
Stars: ✭ 3,209 (+22821.43%)
Mutual labels:  lstm, rnn
Rnn For Joint Nlu
Pytorch implementation of "Attention-Based Recurrent Neural Network Models for Joint Intent Detection and Slot Filling" (https://arxiv.org/abs/1609.01454)
Stars: ✭ 176 (+1157.14%)
Mutual labels:  lstm, rnn
Nlstm
Nested LSTM Cell
Stars: ✭ 246 (+1657.14%)
Mutual labels:  lstm, rnn
Pytorch Seq2seq
Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Stars: ✭ 3,418 (+24314.29%)
Mutual labels:  lstm, rnn
Har Stacked Residual Bidir Lstms
Using deep stacked residual bidirectional LSTM cells (RNN) with TensorFlow, we do Human Activity Recognition (HAR). Classifying the type of movement amongst 6 categories or 18 categories on 2 different datasets.
Stars: ✭ 250 (+1685.71%)
Mutual labels:  lstm, rnn
Chameleon recsys
Source code of CHAMELEON - A Deep Learning Meta-Architecture for News Recommender Systems
Stars: ✭ 202 (+1342.86%)
Mutual labels:  lstm, rnn
Char Rnn Chinese
Multi-layer Recurrent Neural Networks (LSTM, GRU, RNN) for character-level language models in Torch. Based on the code at https://github.com/karpathy/char-rnn. Supports Chinese and other features.
Stars: ✭ 192 (+1271.43%)
Mutual labels:  lstm, rnn
Kprn
Reasoning Over Knowledge Graph Paths for Recommendation
Stars: ✭ 220 (+1471.43%)
Mutual labels:  lstm, rnn
Stylenet
A cute multi-layer LSTM that can perform like a human 🎶
Stars: ✭ 187 (+1235.71%)
Mutual labels:  lstm, rnn
Crnn Audio Classification
UrbanSound classification using Convolutional Recurrent Networks in PyTorch
Stars: ✭ 235 (+1578.57%)
Mutual labels:  lstm, rnn
Automatic speech recognition
End-to-end Automatic Speech Recognition for Mandarin and English in TensorFlow
Stars: ✭ 2,751 (+19550%)
Mutual labels:  lstm, rnn
Pytorch Kaldi
pytorch-kaldi is a project for developing state-of-the-art DNN/RNN hybrid speech recognition systems. The DNN part is managed by pytorch, while feature extraction, label computation, and decoding are performed with the kaldi toolkit.
Stars: ✭ 2,097 (+14878.57%)
Mutual labels:  lstm, rnn
Eeg Dl
A Deep Learning library for EEG Tasks (Signals) Classification, based on TensorFlow.
Stars: ✭ 165 (+1078.57%)
Mutual labels:  lstm, rnn
Lightnet
Efficient, transparent deep learning in hundreds of lines of code.
Stars: ✭ 243 (+1635.71%)
Mutual labels:  lstm, rnn

Quick Start

The workshop code is available as Jupyter notebooks. You can run the notebooks in the cloud (no installation required) by clicking the "launch binder" button:

Binder

Why

For people who struggle to get started with deep learning using TensorFlow

Description

This hands-on, in-person workshop is based on the Deep Learning with TensorFlow course by IBM Cognitive Class.

Learn how to get started with TensorFlow to capture relevant structure in images, sound, and text from unlabeled and unstructured data.

Outline

The workshop covers these core topics:

01 Intro Colab

Data Graph Tensors ReLU
  • HelloWorld with TensorFlow
  • Linear and Logistic Regression with TensorFlow
  • Activation Functions
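
The notebook builds these ideas up interactively; a minimal sketch of the kind of code involved, written here against the TensorFlow 2.x API (the workshop notebooks may target a different TensorFlow version), looks like this:

import tensorflow as tf

# "Hello World": build a tensor and evaluate it
hello = tf.constant("Hello, TensorFlow!")
print(hello.numpy())

# Linear regression y = w*x + b fitted with gradient descent
x = tf.constant([[1.0], [2.0], [3.0], [4.0]])
y = tf.constant([[2.0], [4.0], [6.0], [8.0]])
w = tf.Variable(0.0)
b = tf.Variable(0.0)
optimizer = tf.keras.optimizers.SGD(learning_rate=0.01)
for _ in range(200):
    with tf.GradientTape() as tape:
        loss = tf.reduce_mean(tf.square(w * x + b - y))   # mean squared error
    grads = tape.gradient(loss, [w, b])
    optimizer.apply_gradients(zip(grads, [w, b]))
print(w.numpy(), b.numpy())   # approaches w = 2, b = 0

# An activation function such as ReLU maps negative inputs to zero
print(tf.nn.relu([-1.0, 0.0, 2.0]).numpy())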

02 Convolutional Neural Networks (CNN) Colab


  • Introduction to Convolutional Networks
  • Convolution and Feature Learning
  • Convolution with Python and TensorFlow
  • MNIST Dataset
  • Multilayer Perceptron with TensorFlow
  • Convolutional Network with TensorFlow
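
As a rough illustration (not the exact notebook code), a small convolutional network for MNIST can be sketched with the TensorFlow Keras API:

import tensorflow as tf

# Load MNIST, scale pixel values to [0, 1], and add a channel dimension
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train = x_train[..., None] / 255.0
x_test = x_test[..., None] / 255.0

# Convolution + pooling learn local image features; dense layers classify them
model = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28, 1)),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=1, validation_data=(x_test, y_test))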

03 Recurrent Neural Networks (RNN) Colab

Sequential Data Recurrent Model LSTM
  • Recurrent Neural Network Model
  • Long Short-Term Memory
  • Recursive Neural Tensor Network Theory
  • Applying Recurrent Networks to Language Modelling
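
For illustration only, a toy character-level language model built around an LSTM might be sketched as follows (the notebook examples differ in data and detail):

import tensorflow as tf

# Embed characters, run an LSTM over the sequence, and predict the next
# character at every time step
vocab_size, embed_dim, seq_len = 64, 32, 20
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(vocab_size, embed_dim),
    tf.keras.layers.LSTM(128, return_sequences=True),
    tf.keras.layers.Dense(vocab_size, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# Random integer sequences stand in for encoded text
x = tf.random.uniform((8, seq_len), maxval=vocab_size, dtype=tf.int32)
y = tf.random.uniform((8, seq_len), maxval=vocab_size, dtype=tf.int32)
model.fit(x, y, epochs=1)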

04 Unsupervised Learning Colab

Forward Pass Backward Pass Quality Assessment
  • Applications of Unsupervised Learning
  • Restricted Boltzmann Machine
  • Training a Restricted Boltzmann Machine
  • Recommendation System with a Restricted Boltzmann Machine
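
As an illustrative simplification (biases omitted; not the notebook implementation), a single contrastive-divergence (CD-1) training step of a Restricted Boltzmann Machine looks like this in NumPy:

import numpy as np

# Forward pass (visible -> hidden), backward pass (reconstruction), and a
# weight update from the difference of the two correlations
rng = np.random.default_rng(0)
n_visible, n_hidden, lr = 6, 3, 0.1
W = rng.normal(0.0, 0.01, size=(n_visible, n_hidden))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

v0 = rng.integers(0, 2, size=(1, n_visible)).astype(float)  # a binary training sample
h0 = sigmoid(v0 @ W)          # forward pass: hidden unit probabilities
v1 = sigmoid(h0 @ W.T)        # backward pass: reconstruct the visible units
h1 = sigmoid(v1 @ W)          # hidden probabilities for the reconstruction
W += lr * (v0.T @ h0 - v1.T @ h1)   # reconstruction quality drives the update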

05 Autoencoders Colab

Encode/Decode Architecture Autoencoder vs RBM
  • Introduction to Autoencoders and Applications
  • Autoencoder Structure
  • Deep Belief Network
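
For example, a minimal dense autoencoder on MNIST (an illustrative sketch, not the exact notebook code) compresses each 784-pixel image to a 32-value code and reconstructs it:

import tensorflow as tf

# Flatten MNIST images to 784-dimensional vectors in [0, 1]
(x_train, _), _ = tf.keras.datasets.mnist.load_data()
x_train = x_train.reshape(-1, 784) / 255.0

autoencoder = tf.keras.Sequential([
    tf.keras.Input(shape=(784,)),
    tf.keras.layers.Dense(32, activation="relu"),      # encoder: compress to a 32-d code
    tf.keras.layers.Dense(784, activation="sigmoid"),  # decoder: reconstruct the image
])
autoencoder.compile(optimizer="adam", loss="binary_crossentropy")
autoencoder.fit(x_train, x_train, epochs=1, batch_size=256)  # input is also the target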

Prerequisites

Pre-workshop

You will need a laptop that can access the internet.

1: Installation

Install Miniconda or the (larger) Anaconda distribution

Install Python using Miniconda

OR Install Python using Anaconda

2: Setup

2.1: Download workshop code & materials

Clone the repository

git clone git@github.com:aymanibrahim/dltf.git

OR Download the repository as a .zip file

2.2: Change directory to dltf

Change current directory to dltf directory

cd dltf

2.3: Install Python with required packages

Install Python with the required packages into an environment named dltf as per environment.yml YAML file.

conda env create -f environment.yml

When conda asks if you want to proceed, type "y" and press Enter.

3: Activate environment

Switch from the default environment (base) to the dltf environment.

conda activate dltf

4: Install & Enable ipywidgets extensions

Install ipywidgets JupyterLab extension

jupyter labextension install @jupyter-widgets/jupyterlab-manager

Enable widgetsnbextension

jupyter nbextension enable --py widgetsnbextension --sys-prefix
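
These extensions let the notebooks render interactive widgets; a minimal, hypothetical example of what they enable looks like this:

from ipywidgets import interact

# An interactive slider: the function reruns whenever the slider moves
def preview(epochs=5):
    print(f"would train for {epochs} epochs")

interact(preview, epochs=(1, 20))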

5: Check installation

Use the check_environment.py script to make sure everything was installed correctly. Open a terminal and change its directory (cd) so that your working directory is the dltf workshop directory you cloned or downloaded, then enter the following:

python check_environment.py

If everything is OK, you will get the following message:

Your workshop environment is set up
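
The exact checks are defined by check_environment.py in the repository; as a rough, hypothetical illustration, such a script typically just imports the key packages and reports their versions:

import importlib

# Try importing the packages the workshop depends on and print their versions
for name in ["tensorflow", "numpy", "matplotlib"]:
    module = importlib.import_module(name)
    print(name, getattr(module, "__version__", "unknown"))
print("Your workshop environment is set up")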

6: Start JupyterLab

Start JupyterLab using:

jupyter lab

JupyterLab will open automatically in your browser.

You may access JupyterLab by entering the notebook server’s URL into the browser.

7: Stop JupyterLab

Press CTRL + C in the terminal to stop JupyterLab.

8: Deactivate environment

Switch from the dltf environment back to the previous environment.

conda deactivate

Workshop Instructor

Ayman Ibrahim

References

Contributing

Thanks for your interest in contributing! There are many ways to contribute to this project. Get started here.

License

Workshop Code

License: MIT

Workshop Materials

Creative Commons License

Deep Learning with TensorFlow Workshop by Ayman Ibrahim is licensed under a Creative Commons Attribution 4.0 International License. Based on the IBM Cognitive Class Deep Learning with TensorFlow course by Saeed Aghabozorgi, PhD, Rafael Belo da Silva, Erich Natsubori Sato, and Walter Gomes de Amorim Junior.
