
santhoshkolloju / Abstractive Summarization With Transfer Learning

Abstractive summarization using BERT as encoder and Transformer decoder

Programming Languages

python

Projects that are alternatives of or similar to Abstractive Summarization With Transfer Learning

Nlp Paper
NLP Paper
Stars: ✭ 484 (+35.2%)
Mutual labels:  transfer-learning, transformer
Haystack
🔍 Haystack is an open source NLP framework that leverages Transformer models. It enables developers to implement production-ready neural search, question answering, semantic document search and summarization for a wide range of applications.
Stars: ✭ 3,409 (+852.23%)
Mutual labels:  transfer-learning, summarization
Awesome Bert Nlp
A curated list of NLP resources focused on BERT, attention mechanism, Transformer networks, and transfer learning.
Stars: ✭ 567 (+58.38%)
Mutual labels:  transfer-learning, transformer
Question generation
Neural question generation using transformers
Stars: ✭ 356 (-0.56%)
Mutual labels:  nlg, transformer
MinTL
MinTL: Minimalist Transfer Learning for Task-Oriented Dialogue Systems
Stars: ✭ 61 (-82.96%)
Mutual labels:  transformer, transfer-learning
Nlp Papers
Papers and Book to look at when starting NLP 📚
Stars: ✭ 111 (-68.99%)
Mutual labels:  nlg, summarization
Bert Keras
Keras implementation of BERT with pre-trained weights
Stars: ✭ 820 (+129.05%)
Mutual labels:  transfer-learning, transformer
Getting Things Done With Pytorch
Jupyter Notebook tutorials on solving real-world problems with Machine Learning & Deep Learning using PyTorch. Topics: Face detection with Detectron 2, Time Series anomaly detection with LSTM Autoencoders, Object Detection with YOLO v5, Build your first Neural Network, Time Series forecasting for Coronavirus daily cases, Sentiment Analysis with BERT.
Stars: ✭ 738 (+106.15%)
Mutual labels:  transfer-learning, transformer
verseagility
Ramp up your custom natural language processing (NLP) task, allowing you to bring your own data, use your preferred frameworks and bring models into production.
Stars: ✭ 23 (-93.58%)
Mutual labels:  transformer, summarization
Context-Transformer
Context-Transformer: Tackling Object Confusion for Few-Shot Detection, AAAI 2020
Stars: ✭ 89 (-75.14%)
Mutual labels:  transformer, transfer-learning
Onnxt5
Summarization, translation, sentiment-analysis, text-generation and more at blazing speed using a T5 version implemented in ONNX.
Stars: ✭ 143 (-60.06%)
Mutual labels:  summarization, transformer
SIGIR2021 Conure
One Person, One Model, One World: Learning Continual User Representation without Forgetting
Stars: ✭ 23 (-93.58%)
Mutual labels:  transformer, transfer-learning
Scientificsummarizationdatasets
Datasets I have created for scientific summarization, and a trained BertSum model
Stars: ✭ 100 (-72.07%)
Mutual labels:  summarization, transformer
Flow Forecast
Deep learning PyTorch library for time series forecasting, classification, and anomaly detection (originally for flood forecasting).
Stars: ✭ 368 (+2.79%)
Mutual labels:  transfer-learning, transformer
TitleStylist
Source code for our "TitleStylist" paper at ACL 2020
Stars: ✭ 72 (-79.89%)
Mutual labels:  transformer, summarization
Filipino-Text-Benchmarks
Open-source benchmark datasets and pretrained transformer models in the Filipino language.
Stars: ✭ 22 (-93.85%)
Mutual labels:  transformer, transfer-learning
AITQA
resources for the IBM Airlines Table-Question-Answering Benchmark
Stars: ✭ 12 (-96.65%)
Mutual labels:  transformer, transfer-learning
Pyhgt
Code for "Heterogeneous Graph Transformer" (WWW'20), which is based on pytorch_geometric
Stars: ✭ 313 (-12.57%)
Mutual labels:  transformer
Typescript Plugin Styled Components
TypeScript transformer for improving the debugging experience of styled-components
Stars: ✭ 330 (-7.82%)
Mutual labels:  transformer
Laravel5 Jsonapi
Laravel 5 JSON API Transformer Package
Stars: ✭ 313 (-12.57%)
Mutual labels:  transformer

Abstractive summarization using BERT as encoder and Transformer decoder

I have used a text generation library called Texar. It is a beautiful library with a lot of abstractions; I would describe it as the scikit-learn of text generation problems.

The main idea behind this architecture is transfer learning from BERT, a pretrained masked language model: the encoder is replaced with the BERT encoder, while the decoder is trained from scratch. A conceptual sketch of this encoder-decoder pairing follows.
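To make the architecture concrete, here is a minimal sketch. Note that this is not the repo's code (the project is built on Texar/TensorFlow); it uses Hugging Face transformers and torch.nn purely to illustrate the same idea, a pretrained BERT encoder feeding a Transformer decoder trained from scratch. All class and parameter names below are illustrative assumptions, and decoder positional encodings are omitted for brevity.

import torch
import torch.nn as nn
from transformers import BertModel

class BertSummarizer(nn.Module):
    def __init__(self, vocab_size, d_model=768, n_layers=6, n_heads=8):
        super().__init__()
        # Encoder: pretrained BERT weights (this is the transfer-learning part).
        self.encoder = BertModel.from_pretrained("bert-base-uncased")
        # Decoder: randomly initialized, trained from scratch.
        layer = nn.TransformerDecoderLayer(d_model, n_heads, batch_first=True)
        self.decoder = nn.TransformerDecoder(layer, n_layers)
        self.embed = nn.Embedding(vocab_size, d_model)
        self.out = nn.Linear(d_model, vocab_size)

    def forward(self, src_ids, src_mask, tgt_ids):
        # Encode the story once; the decoder cross-attends over this memory.
        memory = self.encoder(input_ids=src_ids, attention_mask=src_mask).last_hidden_state
        tgt = self.embed(tgt_ids)
        # Causal mask keeps each summary position from seeing future tokens.
        causal = nn.Transformer.generate_square_subsequent_mask(tgt_ids.size(1)).to(tgt.device)
        hidden = self.decoder(tgt, memory, tgt_mask=causal)
        return self.out(hidden)  # logits over the wordpiece vocabulary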

One of the advantages of using Transformer networks is that training is much faster than with LSTM-based models, because Transformers eliminate the sequential processing of tokens and attend over the whole sequence in parallel.

Transformer-based models also generate more grammatically correct and coherent sentences.

To run the model, first download and unpack the pretrained BERT-Base (uncased) checkpoint:

wget https://storage.googleapis.com/bert_models/2018_10_18/uncased_L-12_H-768_A-12.zip 
unzip uncased_L-12_H-768_A-12.zip

Place the story and summary files under the data folder with the following names (each story and its summary must be on a single line; see the sample text provided). A quick sanity check on this layout is sketched after the list.

- train_story.txt
- train_summ.txt
- eval_story.txt
- eval_summ.txt
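This helper is not part of the repo; it only checks the layout described above, using the file names from the list:

import os

for split in ("train", "eval"):
    story_path = os.path.join("data", f"{split}_story.txt")
    summ_path = os.path.join("data", f"{split}_summ.txt")
    with open(story_path) as f_story, open(summ_path) as f_summ:
        stories = f_story.read().splitlines()
        summaries = f_summ.read().splitlines()
    # Every one-line story should have a matching one-line summary.
    assert len(stories) == len(summaries), f"{split}: line counts differ"
    print(split, len(stories), "pairs")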

Step 1: Run preprocessing: python preprocess.py

This creates two TFRecord files under the data folder.
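For intuition, here is a hedged sketch of the kind of serialization such a preprocessing step typically performs; the feature names and dummy ids are assumptions, not necessarily what preprocess.py actually writes:

import tensorflow as tf

def make_example(src_ids, tgt_ids):
    # Each story/summary pair becomes one tf.train.Example of int64 features.
    def int64_list(values):
        return tf.train.Feature(int64_list=tf.train.Int64List(value=values))
    return tf.train.Example(features=tf.train.Features(feature={
        "src_input_ids": int64_list(src_ids),  # BERT wordpiece ids for the story
        "tgt_input_ids": int64_list(tgt_ids),  # wordpiece ids for the summary
    }))

with tf.io.TFRecordWriter("data/train.tf_record") as writer:
    # Dummy ids: [CLS]=101, some token, [SEP]=102.
    writer.write(make_example([101, 2023, 102], [101, 2023, 102]).SerializeToString())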

Step 2: Run training: python main.py

Configurations for the model can be changed in the config.py file.
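As an illustration only, a config.py for this kind of model typically exposes knobs like the following; the variable names and values here are assumptions, not the repo's actual settings:

max_seq_length_src = 512  # BERT encoder input limit
max_seq_length_tgt = 100  # length budget for generated summaries
batch_size = 4
lr = 2e-5                 # small learning rate, typical when fine-tuning BERT
beam_width = 5            # beam search width used at inference time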

Step 3: Inference. Run the command python inference.py. This starts a Flask server. Use Postman (or any HTTP client) to send a POST request to http://your_ip_address:1118/results with two form parameters, story and summary. A minimal Python client is sketched below.
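A Python client equivalent to the Postman request might look like this (the requests library is not part of the repo; the endpoint and form-field names are taken from the README above):

import requests

resp = requests.post(
    "http://your_ip_address:1118/results",  # replace with the server's address
    data={
        "story": "text of the article to summarize ...",
        "summary": "reference summary for the story",
    },
)
print(resp.text)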
