
wzyonggege / Rnn_poetry_generator

Classical Chinese poetry generation based on an RNN

Programming Languages

python

Projects that are alternatives to or similar to Rnn_poetry_generator

Cnn lstm for text classify
CNN, LSTM, NBOW, and fastText for Chinese text classification
Stars: ✭ 90 (-37.06%)
Mutual labels:  lstm, rnn
Text predictor
Char-level RNN LSTM text generator📄.
Stars: ✭ 99 (-30.77%)
Mutual labels:  lstm, rnn
Word Rnn Tensorflow
Multi-layer Recurrent Neural Networks (LSTM, RNN) for word-level language models in Python using TensorFlow.
Stars: ✭ 1,297 (+806.99%)
Mutual labels:  lstm, rnn
Pytorch Sentiment Analysis Classification
A PyTorch Tutorials of Sentiment Analysis Classification (RNN, LSTM, Bi-LSTM, LSTM+Attention, CNN)
Stars: ✭ 80 (-44.06%)
Mutual labels:  lstm, rnn
Lstm Crypto Price Prediction
Predicting price trends in crypto markets using an LSTM-RNN for use in a trading bot
Stars: ✭ 136 (-4.9%)
Mutual labels:  lstm, rnn
Copper price forecast
Copper price (time series) prediction using BPNN and LSTM
Stars: ✭ 81 (-43.36%)
Mutual labels:  lstm, rnn
Pytorch Learners Tutorial
PyTorch tutorial for learners
Stars: ✭ 97 (-32.17%)
Mutual labels:  lstm, rnn
Char rnn lm zh
A Chinese language model, implemented following the official PyTorch documentation
Stars: ✭ 57 (-60.14%)
Mutual labels:  lstm, rnn
Lstms.pth
PyTorch implementations of LSTM Variants (Dropout + Layer Norm)
Stars: ✭ 111 (-22.38%)
Mutual labels:  lstm, rnn
Ml Ai Experiments
All my experiments with AI and ML
Stars: ✭ 107 (-25.17%)
Mutual labels:  lstm, rnn
Machine Learning
My Attempt(s) In The World Of ML/DL....
Stars: ✭ 78 (-45.45%)
Mutual labels:  lstm, rnn
Linear Attention Recurrent Neural Network
A recurrent attention module consisting of an LSTM cell which can query its own past cell states by means of windowed multi-head attention. The formulas are derived from the BN-LSTM and the Transformer Network. The LARNN cell with attention can be easily used inside a loop on the cell state, just like any other RNN. (LARNN)
Stars: ✭ 119 (-16.78%)
Mutual labels:  lstm, rnn
Hred Attention Tensorflow
An extension of the Hierarchical Recurrent Encoder-Decoder for Generative Context-Aware Query Suggestion; our implementation is in TensorFlow and uses an attention mechanism.
Stars: ✭ 68 (-52.45%)
Mutual labels:  lstm, rnn
Lstm chem
Implementation of the paper - Generative Recurrent Networks for De Novo Drug Design.
Stars: ✭ 87 (-39.16%)
Mutual labels:  lstm, rnn
Bitcoin Price Prediction Using Lstm
Bitcoin price prediction (time series) using an LSTM recurrent neural network
Stars: ✭ 67 (-53.15%)
Mutual labels:  lstm, rnn
Pytorch Pos Tagging
A tutorial on how to implement models for part-of-speech tagging using PyTorch and TorchText.
Stars: ✭ 96 (-32.87%)
Mutual labels:  lstm, rnn
Deepseqslam
The Official Deep Learning Framework for Route-based Place Recognition
Stars: ✭ 49 (-65.73%)
Mutual labels:  lstm, rnn
Time Attention
Implementation of RNN for Time Series prediction from the paper https://arxiv.org/abs/1704.02971
Stars: ✭ 52 (-63.64%)
Mutual labels:  lstm, rnn
See Rnn
RNN and general weights, gradients, & activations visualization in Keras & TensorFlow
Stars: ✭ 102 (-28.67%)
Mutual labels:  lstm, rnn
Pytorch Rnn Text Classification
Word Embedding + LSTM + FC
Stars: ✭ 112 (-21.68%)
Mutual labels:  lstm, rnn

RNN_poetry_generator

Classical Chinese poetry generation based on an RNN

Environment

  • python3.6
  • tensorflow 1.2.0
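
Since the project targets TensorFlow 1.2, the model presumably uses the graph-mode RNN API of that era. The following is a minimal sketch of a char-level LSTM graph under that assumption; the vocabulary size, layer sizes, and variable names are illustrative and not taken from poetry_gen.py.

import tensorflow as tf

# Illustrative hyperparameters; the real values in poetry_gen.py may differ.
vocab_size, embed_dim, hidden_dim = 6000, 128, 128

# [batch, time] tensor of character ids from the poetry corpus.
inputs = tf.placeholder(tf.int32, [None, None], name="inputs")

# Learn a dense embedding per character, then run the sequence through an LSTM.
embedding = tf.get_variable("embedding", [vocab_size, embed_dim])
embedded = tf.nn.embedding_lookup(embedding, inputs)
cell = tf.contrib.rnn.BasicLSTMCell(hidden_dim)
outputs, final_state = tf.nn.dynamic_rnn(cell, embedded, dtype=tf.float32)

# Project hidden states to next-character logits; training would minimize
# cross-entropy against the corpus shifted by one character.
logits = tf.layers.dense(outputs, vocab_size)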

Usage

  • Train:

python poetry_gen.py --mode train

  • Generate:

python poetry_gen.py or python poetry_gen.py --mode sample

  • Generate an acrostic poem (see the sketch after the example output below):

python poetry_gen.py --mode sample --head 明月别枝惊鹊

Generated acrostic poem for head ---> 明月别枝惊鹊

明年襟宠任,月出画床帘。别有平州伯性悔,枝边折得李桑迷。惊腰每异年三杰,鹊出交钟玉笛频。
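
A minimal sketch of how --head acrostic ("藏头诗") generation typically works in char-RNN generators: each character of --head is forced as the first character of a line, and the model then free-runs until it emits a line delimiter. The stub sample_next below merely stands in for the trained network; none of these names come from poetry_gen.py.

import random

LINE_ENDS = "，。"
TOY_VOCAB = list("春风花月夜山水云天人，。")  # stand-in vocabulary

def sample_next(context):
    # Stand-in for the RNN: a real model would feed `context` through
    # the network and sample from its softmax over the vocabulary.
    return random.choice(TOY_VOCAB)

def generate_acrostic(head, max_line_len=7):
    lines = []
    for head_char in head:
        line = head_char                      # force the acrostic character
        while len(line) < max_line_len:
            nxt = sample_next("".join(lines) + line)
            if nxt in LINE_ENDS:              # model chose to end the line
                break
            line += nxt
        lines.append(line)
    return "，".join(lines) + "。"

print(generate_acrostic("明月别枝惊鹊"))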

Help

python poetry_gen.py --help

usage: poetry_gen.py [-h] [--mode MODE] [--head HEAD]

optional arguments:
  -h, --help   show this help message and exit
  --mode MODE  usage: train or sample, sample is default
  --head HEAD  generate an acrostic poem
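
The flags above imply an argparse interface roughly like the following sketch; the actual defaults and dispatch in poetry_gen.py may differ.

import argparse

def parse_args():
    parser = argparse.ArgumentParser(prog="poetry_gen.py")
    parser.add_argument("--mode", default="sample",
                        help="usage: train or sample, sample is default")
    parser.add_argument("--head", default="",
                        help="characters to use as the head of an acrostic poem")
    return parser.parse_args()

args = parse_args()
if args.mode == "train":
    pass  # train the char-RNN on the poetry corpus
else:
    pass  # sample a poem, optionally constrained by args.head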
