meowoodie / Rnn Rbm In Theano

An implementation of RNN-RBM & GBRBM.

Projects that are alternatives of or similar to Rnn Rbm In Theano

Rnn ctc
Recurrent Neural Network and Long Short-Term Memory (LSTM) with Connectionist Temporal Classification implemented in Theano. Includes a toy training example.
Stars: ✭ 220 (+3566.67%)
Mutual labels:  rnn, theano
Theano Kaldi Rnn
THEANO-KALDI-RNNs is a project implementing various Recurrent Neural Networks (RNNs) for RNN-HMM speech recognition. The Theano Code is coupled with the Kaldi decoder.
Stars: ✭ 31 (+416.67%)
Mutual labels:  rnn, theano
sequence-rnn-py
Sequence analyzing using Recurrent Neural Networks (RNN) based on Keras
Stars: ✭ 28 (+366.67%)
Mutual labels:  theano, rnn
Rnn Theano
Some RNN code implemented in Theano, including the most basic RNN, LSTM, and some attention models, such as MLSTM from the literature
Stars: ✭ 31 (+416.67%)
Mutual labels:  rnn, theano
Deepjazz
Deep learning driven jazz generation using Keras & Theano!
Stars: ✭ 2,766 (+46000%)
Mutual labels:  rnn, theano
cudnn rnn theano benchmarks
No description or website provided.
Stars: ✭ 22 (+266.67%)
Mutual labels:  theano, rnn
theano-recurrence
Recurrent Neural Networks (RNN, GRU, LSTM) and their Bidirectional versions (BiRNN, BiGRU, BiLSTM) for word & character level language modelling in Theano
Stars: ✭ 40 (+566.67%)
Mutual labels:  theano, rnn
Wtte Rnn
WTTE-RNN: a framework for churn and time-to-event prediction
Stars: ✭ 654 (+10800%)
Mutual labels:  rnn
Rnn Time Series Anomaly Detection
RNN based Time-series Anomaly detector model implemented in Pytorch.
Stars: ✭ 718 (+11866.67%)
Mutual labels:  rnn
Habu
Hacking Toolkit
Stars: ✭ 635 (+10483.33%)
Mutual labels:  network-analysis
Parrot
RNN-based generative models for speech.
Stars: ✭ 601 (+9916.67%)
Mutual labels:  theano
Cs224n
CS224n: Natural Language Processing with Deep Learning Assignments Winter, 2017
Stars: ✭ 656 (+10833.33%)
Mutual labels:  rnn
Tf Rnn Attention
Tensorflow implementation of attention mechanism for text classification tasks.
Stars: ✭ 735 (+12150%)
Mutual labels:  rnn
Ad examples
A collection of anomaly detection methods (iid/point-based, graph and time series) including active learning for anomaly detection/discovery, bayesian rule-mining, description for diversity/explanation/interpretability. Analysis of incorporating label feedback with ensemble and tree-based detectors. Includes adversarial attacks with Graph Convolutional Network.
Stars: ✭ 641 (+10583.33%)
Mutual labels:  rnn
Bmon
bandwidth monitor and rate estimator
Stars: ✭ 787 (+13016.67%)
Mutual labels:  network-analysis
Nfstream
NFStream: a Flexible Network Data Analysis Framework.
Stars: ✭ 622 (+10266.67%)
Mutual labels:  network-analysis
Pgrouting
Repository contains pgRouting library. Development branch is "develop", stable branch is "master"
Stars: ✭ 804 (+13300%)
Mutual labels:  network-analysis
Seq2seq Chatbot
Chatbot in 200 lines of code using TensorLayer
Stars: ✭ 777 (+12850%)
Mutual labels:  rnn
Notes Python
Python notes in Chinese
Stars: ✭ 6,127 (+102016.67%)
Mutual labels:  theano
Tensorflow cookbook
Code for Tensorflow Machine Learning Cookbook
Stars: ✭ 5,984 (+99633.33%)
Mutual labels:  rnn

Decoding Feature Vectors of Network Packets

What's this?

This project implements GBRBM & RNN-RBM to decode a feature vector of network packets into a binary array, in order to discriminate anomalous packets in network traffic. It mainly focuses on converting a real-valued vector into a binary vector in such a way that vectors containing anomalous values are kept away from the normal vectors in the binary vector space.
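The decoding step described above — mapping a real-valued feature vector to a binary code — can be sketched with the hidden-unit activation of a Gaussian-Bernoulli RBM. This is a minimal illustration with random, untrained weights (the names and sizes here are assumptions, not the project's implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gbrbm_encode(v, W, b_hidden, sigma=1.0):
    """Encode a real-valued visible vector v into a binary hidden vector.

    In a Gaussian-Bernoulli RBM the hidden activation probability is
    sigmoid(W v / sigma^2 + b); sampling or thresholding it yields the
    binary code.
    """
    p_h = sigmoid(W @ (v / sigma**2) + b_hidden)
    return (p_h > 0.5).astype(int)  # deterministic thresholding for illustration

# Toy sizes: 6-dimensional input, 8 hidden units (the real model uses far
# more hidden units); W and b are random stand-ins for trained parameters.
W = rng.normal(scale=0.5, size=(8, 6))
b = np.zeros(8)
normal_v = np.zeros(6)
anomaly_v = np.full(6, 5.0)  # an input with anomalously large magnitudes

code_normal = gbrbm_encode(normal_v, W, b)
code_anomaly = gbrbm_encode(anomaly_v, W, b)
```

In the project the weights would come from training the GBRBM; here they only demonstrate the shape of the mapping.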

The following two figures show that there is a more distinct separation between the anomalous data (blue) and the normal data after decoding. The figures plot the distribution of the data points after dimensionality reduction with t-SNE.

[Figure: generated data after t-SNE in 3D] [Figure: decoded data after t-SNE in 3D]

How to use this?

I have implemented an experiment component to run a variety of different experiments; you can also implement your own experiments in experiment.py.

# use gbrbm to decode a dataset which contains 1000 tuples
# and five of them are anomaly data points.
exp_gbrbm("N6_n1000_t5_e1_gbrbm_h500", T=[0, 1, 2, 3, 4, 5])

The results will be generated in the directory data/N6_n1000_t5_e1_gbrbm_h500/, including data files and plots.
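The kind of dataset the experiment above expects — many normal points plus a handful of anomalous ones — might be generated along these lines. This is a hypothetical sketch; the project's own data generator may differ:

```python
import numpy as np

rng = np.random.default_rng(42)

def make_dataset(n=1000, dim=6, n_anomalies=5, anomaly_scale=10.0):
    """Generate n real-valued feature vectors of the given dimensionality.

    The last n_anomalies points are inflated so that they contain
    out-of-range values (hypothetical generator, for illustration only).
    """
    X = rng.normal(size=(n, dim))
    labels = np.zeros(n, dtype=int)
    X[-n_anomalies:] *= anomaly_scale  # inflate a few points to act as anomalies
    labels[-n_anomalies:] = 1
    return X, labels

X, y = make_dataset()
```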

Results

I have tried many combinations of parameters and obtained a good result with a generated dataset of 1,500 points, 5 of which were anomalous. The original feature vectors are 6-dimensional real vectors; the decoded binary vectors are equal in number and 2000-dimensional. I used GBRBM to decode the 1,500 points from 6-dimensional real vectors into 2000-dimensional binary vectors, then visualized both sets with t-SNE. Surprisingly, the 2000-dimensional binary vectors show a distinct border between normal data and anomalous data when viewed in 2D space.

  • 6-dimensional real vectors (original points) in 2D space. [Figure: generated data after t-SNE in 2D]

  • 2000-dimensional binary vectors (decoded points) in 2D space. [Figure: decoded data after t-SNE in 2D]
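One way to quantify the separation described above, assuming the decoded codes are available as 0/1 arrays, is to compare Hamming distances among normal codes against distances to an anomalous code. The codes below are synthetic stand-ins, not the project's actual output:

```python
import numpy as np

def hamming(a, b):
    """Fraction of positions where two binary codes disagree."""
    return float(np.mean(a != b))

rng = np.random.default_rng(1)
dim = 2000
# Hypothetical decoded codes: normal codes set few bits, the anomalous
# code sets most bits, mimicking a well-separated binary space.
normal_codes = [(rng.random(dim) < 0.1).astype(int) for _ in range(3)]
anomaly_code = (rng.random(dim) < 0.9).astype(int)

within = hamming(normal_codes[0], normal_codes[1])
between = hamming(normal_codes[0], anomaly_code)
# A distinct border means anomalous codes sit much farther from the
# normal codes than normal codes sit from each other.
```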

Reference

My published paper for this work: https://www.jstage.jst.go.jp/article/transinf/E100.D/8/E100.D_2016ICP0005/_article
