
zheng5yu9 / chinese_ancient_poetry

Licence: other
seq2seq attention tensorflow textrank context

Programming Languages

python

Projects that are alternatives of or similar to chinese ancient poetry

RNNSearch
An implementation of attention-based neural machine translation using Pytorch
Stars: ✭ 43 (+43.33%)
Mutual labels:  seq2seq, attention
Text Classification Models Pytorch
Implementation of State-of-the-art Text Classification Models in Pytorch
Stars: ✭ 379 (+1163.33%)
Mutual labels:  seq2seq, attention
Base-On-Relation-Method-Extract-News-DA-RNN-Model-For-Stock-Prediction--Pytorch
A dual-stage attention (DA-RNN) model for stock prediction based on a relational news-extraction method
Stars: ✭ 33 (+10%)
Mutual labels:  seq2seq, attention
chatbot
A Chinese chatbot based on deep learning, with detailed tutorials and thoroughly commented code; a good choice for learning.
Stars: ✭ 94 (+213.33%)
Mutual labels:  seq2seq, attention
Pytorch Seq2seq
Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Stars: ✭ 3,418 (+11293.33%)
Mutual labels:  seq2seq, attention
transformer
A PyTorch Implementation of "Attention Is All You Need"
Stars: ✭ 28 (-6.67%)
Mutual labels:  seq2seq, attention
Seq2seq Summarizer
Pointer-generator reinforced seq2seq summarization in PyTorch
Stars: ✭ 306 (+920%)
Mutual labels:  seq2seq, attention
classifier multi label seq2seq attention
multi-label, classifier, text classification, multi-label text classification, BERT, ALBERT, multi-label-classification, seq2seq, attention, beam search
Stars: ✭ 26 (-13.33%)
Mutual labels:  seq2seq, attention
Pointer Networks Experiments
Sorting numbers with pointer networks
Stars: ✭ 53 (+76.67%)
Mutual labels:  seq2seq, attention
Neural sp
End-to-end ASR/LM implementation with PyTorch
Stars: ✭ 408 (+1260%)
Mutual labels:  seq2seq, attention
tensorflow-chatbot-chinese
Web chatbot | tensorflow implementation of seq2seq model with bahdanau attention and Word2Vec pretrained embedding
Stars: ✭ 50 (+66.67%)
Mutual labels:  seq2seq, attention
Chinese Chatbot
A Chinese chatbot trained on 100,000 dialogue pairs, using an attention mechanism; it generates a meaningful reply to most common questions. The trained model is uploaded and can be run directly.
Stars: ✭ 124 (+313.33%)
Mutual labels:  seq2seq, attention
Encoder decoder
Four styles of encoder-decoder models in Python, Theano, Keras, and Seq2Seq
Stars: ✭ 269 (+796.67%)
Mutual labels:  seq2seq, attention
Nlp Tutorials
Simple implementations of NLP models. Tutorials are written in Chinese on my website https://mofanpy.com
Stars: ✭ 394 (+1213.33%)
Mutual labels:  seq2seq, attention
Multiturndialogzoo
Multi-turn dialogue baselines written in PyTorch
Stars: ✭ 106 (+253.33%)
Mutual labels:  seq2seq, attention
Deep Time Series Prediction
Seq2Seq, Bert, Transformer, WaveNet for time series prediction.
Stars: ✭ 183 (+510%)
Mutual labels:  seq2seq, attention
Tensorflow Shakespeare
Neural machine translation between the writings of Shakespeare and modern English using TensorFlow
Stars: ✭ 244 (+713.33%)
Mutual labels:  seq2seq
Naver-AI-Hackathon-Speech
2019 Clova AI Hackathon : Speech - Rank 12 / Team Kai.Lib
Stars: ✭ 26 (-13.33%)
Mutual labels:  seq2seq
Mead Baseline
Deep-Learning Model Exploration and Development for NLP
Stars: ✭ 238 (+693.33%)
Mutual labels:  seq2seq
Debug seq2seq
[unmaintained] Make seq2seq for keras work
Stars: ✭ 233 (+676.67%)
Mutual labels:  seq2seq

Approach:

1. Model input data:

    Use four-line, seven-character classical poems (quatrains) as the corpus
    
    Use TextRank to extract the highest-weight words from each line as that line's training intent [optimization point]
    
    Use up to the three preceding lines as context
    
    ^$ serves as the delimiter between poem lines
    
    word2vec vectors trained with gensim serve as pretrained character embeddings [optimization point]
    
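The TextRank step above can be sketched without any NLP dependencies: build a co-occurrence graph over the characters of one line and run PageRank on it. The real project presumably segments words first (e.g. with a library such as jieba); the character-level version below is an illustrative, dependency-free approximation:

```python
from collections import defaultdict

def textrank_keywords(line, window=2, damping=0.85, iters=50, topk=2):
    """Rank the characters of one poem line by a minimal TextRank:
    characters co-occurring within `window` positions share an
    undirected edge, then PageRank scores the graph."""
    chars = list(line)
    graph = defaultdict(set)
    for i, c in enumerate(chars):
        for j in range(i + 1, min(i + window + 1, len(chars))):
            if chars[j] != c:
                graph[c].add(chars[j])
                graph[chars[j]].add(c)
    score = {c: 1.0 for c in graph}
    for _ in range(iters):
        # Standard PageRank update over the co-occurrence graph.
        score = {c: (1 - damping) + damping *
                 sum(score[n] / len(graph[n]) for n in graph[c])
                 for c in graph}
    return sorted(score, key=score.get, reverse=True)[:topk]

print(textrank_keywords("孤云独去闲"))
```

Characters near the middle of the line accumulate more edges and therefore higher scores, which is the behaviour the keyword-extraction step relies on.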
2. Model architecture:

    Seq2Seq with Bahdanau attention as the network structure
    
    Each character's pinyin is added to the input data [optimization point]
    
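A minimal sketch of what the Bahdanau (additive) attention inside the TensorFlow graph computes: score_i = v · tanh(W_q q + W_k k_i), softmaxed into attention weights, then a weighted sum of the encoder states. The matrices, dimensions, and inputs below are made up for illustration:

```python
import math
import random

def bahdanau_attention(query, keys, w_q, w_k, v):
    """Additive attention: score each encoder state k_i against the
    decoder state `query`, softmax the scores, and mix the keys
    into a context vector. Weight matrices are lists of lists."""
    def matvec(m, x):
        return [sum(mi * xi for mi, xi in zip(row, x)) for row in m]

    wq_q = matvec(w_q, query)
    scores = []
    for k in keys:
        hidden = [math.tanh(a + b) for a, b in zip(wq_q, matvec(w_k, k))]
        scores.append(sum(vi * hi for vi, hi in zip(v, hidden)))
    # Numerically stable softmax over encoder positions.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    weights = [e / total for e in exps]
    # Context vector: attention-weighted sum of the encoder states.
    context = [sum(w * k[d] for w, k in zip(weights, keys))
               for d in range(len(keys[0]))]
    return weights, context

random.seed(0)
dim, units = 4, 3
rnd = lambda r, c: [[random.uniform(-1, 1) for _ in range(c)] for _ in range(r)]
w_q, w_k = rnd(units, dim), rnd(units, dim)
v = [random.uniform(-1, 1) for _ in range(units)]
query = [0.1, -0.2, 0.3, 0.0]
keys = [[random.uniform(-1, 1) for _ in range(dim)] for _ in range(5)]
weights, context = bahdanau_attention(query, keys, w_q, w_k, v)
print([round(w, 3) for w in weights])
```

The weights always sum to 1, so the context vector stays on the same scale as the encoder states regardless of sequence length.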
3. Training:

    Standard training; at every epoch, shuffle the data order and train again
    
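The per-epoch reshuffling can be sketched as a small generator (a hypothetical helper; the real loop in entrance.py also batches the data and runs the optimiser on each batch):

```python
import random

def epochs(pairs, n_epochs, seed=42):
    """Yield every training pair once per epoch, reshuffling the
    order at the start of each epoch, as the training step above
    describes. Shuffling is done on a copy so the caller's list
    is left untouched."""
    rng = random.Random(seed)
    data = list(pairs)
    for _ in range(n_epochs):
        rng.shuffle(data)
        for pair in data:
            yield pair

print(list(epochs(["a", "b", "c", "d"], 1)))
```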
4. Testing:

    From the ranked output, select the rhyming line-final character with the highest probability
    
    Lower the weights of characters that have already appeared
    
    Raise the weights of rhyming line-final characters
Loss curve: tensorboard/lr_and_loss.png

Entry point:

entrance.py

Pretraining:

    python entrance.py -p
    
Training:

    python entrance.py -t
    
Testing:

    python entrance.py -i
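The three modes suggest flag handling along these lines (a sketch only; the help strings and the mutual exclusivity are assumptions, not taken from entrance.py):

```python
import argparse

def build_parser():
    """Argument parser for the three modes entrance.py exposes:
    -p pretrain, -t train, -i interactive testing."""
    parser = argparse.ArgumentParser(prog="entrance.py")
    mode = parser.add_mutually_exclusive_group(required=True)
    mode.add_argument("-p", action="store_true",
                      help="pretrain character embeddings")
    mode.add_argument("-t", action="store_true",
                      help="train the seq2seq model")
    mode.add_argument("-i", action="store_true",
                      help="interactive poem generation")
    return parser

args = build_parser().parse_args(["-t"])
print(args.t)  # True
```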

Example: enter 4 keywords separated by spaces: 怀归 归心 孤云 望远

Generated poem:

    山不风一春水花
    
    毛荡带修庙胜发
    
    极禅苗焚索渠节
    
    就烛危卷经离华
    
Enter 4 keywords separated by spaces: 怀古 今古 江山 望远

Generated poem:

    山不风一春水花
    
    毛荡带修庙胜发
    
    极禅苗焚索烂渠
    
    汝底后难莫何他