
ibatra / Bert Keyword Extractor

Deep Keyphrase Extraction using BERT

Projects that are alternatives of or similar to Bert Keyword Extractor

Log Anomaly Detector
Log Anomaly Detection - Machine learning to detect abnormal events logs
Stars: ✭ 169 (-1.17%)
Mutual labels:  jupyter-notebook
Shape Detection
🟣 Object detection of abstract shapes with neural networks
Stars: ✭ 170 (-0.58%)
Mutual labels:  jupyter-notebook
Covid19 Severity Prediction
Extensive and accessible COVID-19 data + forecasting for counties and hospitals. 📈
Stars: ✭ 170 (-0.58%)
Mutual labels:  jupyter-notebook
Imagination Augmented Agents
Building Agents with Imagination: pytorch step-by-step implementation
Stars: ✭ 170 (-0.58%)
Mutual labels:  jupyter-notebook
Car Damage Detective
Assessing car damage with convolutional neural networks for a personal auto claims expedition use case
Stars: ✭ 169 (-1.17%)
Mutual labels:  jupyter-notebook
Convolutional autoencoder
Code for a convolutional autoencoder written in Python with Theano, Lasagne, and nolearn
Stars: ✭ 170 (-0.58%)
Mutual labels:  jupyter-notebook
Photorealistic Style Transfer
High-Resolution Network for Photorealistic Style Transfer
Stars: ✭ 170 (-0.58%)
Mutual labels:  jupyter-notebook
Deep Learning With Python Notebooks
Jupyter notebooks for the code samples of the book "Deep Learning with Python"
Stars: ✭ 14,243 (+8229.24%)
Mutual labels:  jupyter-notebook
Word vectors game of thrones Live
This is the code for the "How to Make Word Vectors from Game of Thrones (LIVE)" video by Siraj Raval on YouTube
Stars: ✭ 170 (-0.58%)
Mutual labels:  jupyter-notebook
Vietocr
Transformer OCR
Stars: ✭ 170 (-0.58%)
Mutual labels:  jupyter-notebook
Torch Two Sample
A PyTorch library for two-sample tests
Stars: ✭ 170 (-0.58%)
Mutual labels:  jupyter-notebook
Wbc Classification
Classifying White Blood Cells with CNNs
Stars: ✭ 170 (-0.58%)
Mutual labels:  jupyter-notebook
Ml Training Camp
Stars: ✭ 171 (+0%)
Mutual labels:  jupyter-notebook
Deep Learning Genomics Primer
Contains files for the deep learning in genomics primer.
Stars: ✭ 170 (-0.58%)
Mutual labels:  jupyter-notebook
Shap
A game theoretic approach to explain the output of any machine learning model.
Stars: ✭ 14,917 (+8623.39%)
Mutual labels:  jupyter-notebook
Mimic extract
MIMIC-Extract: A Data Extraction, Preprocessing, and Representation Pipeline for MIMIC-III
Stars: ✭ 168 (-1.75%)
Mutual labels:  jupyter-notebook
Segnet
A Deep Convolutional Encoder-Decoder Architecture for Image Segmentation
Stars: ✭ 170 (-0.58%)
Mutual labels:  jupyter-notebook
Awesome Python Applications
💿 Free software that works great, and also happens to be open-source Python.
Stars: ✭ 13,275 (+7663.16%)
Mutual labels:  jupyter-notebook
Dive Into Dl Pytorch
This project ports the original MXNet implementation of the book "Dive into Deep Learning" (《动手学深度学习》) to PyTorch.
Stars: ✭ 14,234 (+8223.98%)
Mutual labels:  jupyter-notebook
Tutorials
MONAI Tutorials
Stars: ✭ 170 (-0.58%)
Mutual labels:  jupyter-notebook

Deep Keyphrase Extraction using BERT

I have used the BERT token classification model to extract keywords from a sentence. Feel free to clone and use it. If you face any problems, kindly post them in the Issues section.

Special credits to the BERT authors Jacob Devlin, Ming-Wei Chang, Kenton Lee, and Kristina Toutanova (original repo), and to Hugging Face for the PyTorch version (original repo).

Requirements

You need:

pytorch 1.0
python 3.6
pytorch-pretrained-bert 0.4.0
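Assuming pip is available, the dependencies above might be installed like this (the exact pins are my reading of the requirements, not commands from the repo):

```shell
# Install the pinned PyTorch and BERT wrapper versions under Python 3.6
pip install torch==1.0.0 pytorch-pretrained-bert==0.4.0
```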

Usage

The keyword-extractor.py script can be used to extract keywords from a sentence and accepts the following arguments:

optional arguments:
  -h, --help         show this help message and exit
  --sentence SEN        sentence to extract keywords from
  --path LOAD        path to load model from

Example:

python keyword-extractor.py --sentence "BERT is a great model." --path "model.pt"           
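The flags above can be wired up with a small argparse setup. This is a hedged sketch of how keyword-extractor.py likely parses its arguments; the actual script may differ:

```python
import argparse


def build_parser():
    # Mirrors the documented flags of keyword-extractor.py
    parser = argparse.ArgumentParser(
        description="Extract keywords from a sentence with a fine-tuned BERT model"
    )
    parser.add_argument("--sentence", metavar="SEN", required=True,
                        help="sentence to extract keywords from")
    parser.add_argument("--path", metavar="LOAD", required=True,
                        help="path to load model from")
    return parser


if __name__ == "__main__":
    args = build_parser().parse_args()
    print(args.sentence, args.path)
```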

Training

You can also train it from scratch using BERT's pre-trained model. The main.py script can be utilized for training and accepts the following arguments:

optional arguments:
  -h, --help         show this help message and exit
  --data DATA        location of the data corpus
  --lr LR            initial learning rate
  --epochs EPOCHS    upper epoch limit
  --batch_size N     batch size
  --seq_len N        sequence length
  --save SAVE        path to save the final model

Example:

python main.py --data "maui-semeval2010-train" --lr 2e-5 --batch_size 32 --save "model.pt" --epochs 3      

This model has been trained on the SemEval 2010 dataset (scientific publications). You can swap it with your own custom dataset.
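Under the hood, a token classification model labels each WordPiece token as keyword or not; the labels then have to be merged back into phrases. A minimal sketch of that post-processing step (the function name and the 0/1 label scheme are my assumptions, not the repo's exact code):

```python
def labels_to_keyphrases(tokens, labels):
    """Merge consecutive tokens labeled 1 (keyword) into keyphrases,
    re-attaching WordPiece continuation pieces ("##...")."""
    phrases, current = [], []
    for token, label in zip(tokens, labels):
        if label == 1:
            if token.startswith("##") and current:
                current[-1] += token[2:]  # glue subword onto the previous piece
            else:
                current.append(token)
        elif current:
            phrases.append(" ".join(current))
            current = []
    if current:
        phrases.append(" ".join(current))
    return phrases


tokens = ["bert", "is", "a", "key", "##word", "extract", "##or"]
labels = [1, 0, 0, 1, 1, 1, 1]
print(labels_to_keyphrases(tokens, labels))  # ['bert', 'keyword extractor']
```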

Code explanations

I have provided an explanation of keyphrase extraction in the form of a Python notebook, which you can view here.

Hyper-parameter Tuning

I ran ablation experiments following the BERT paper; the results are below. I suggest using the parameters in row 4 (learning rate 3e-5, 4 epochs). All training was done with a batch size of 32.

| Learning Rate | Number of Epochs | Validation Loss | Validation Accuracy | F1-Score     |
|---------------|------------------|-----------------|---------------------|--------------|
| 3.00E-05      | 3                | 0.05294724515   | 98.30%              | 0.5318559557 |
| 5.00E-05      | 3                | 0.04899719357   | 98.47%              | 0.56218628   |
| 2.00E-05      | 3                | 0.05733459462   | 98.15%              | 0.4390547264 |
| 3.00E-05      | 4                | 0.05020467712   | 98.48%              | 0.5528169014 |
| 5.00E-05      | 4                | 0.05194576555   | 98.43%              | 0.5780836421 |
| 2.00E-05      | 4                | 0.05373481681   | 98.25%              | 0.5019740553 |