
ebenso / TextSummarizer

License: MIT License
TextRank implementation for C#

Projects that are alternatives of or similar to TextSummarizer

TextRank-node
No description or website provided.
Stars: ✭ 21 (-27.59%)
Mutual labels:  textrank, text-summarization, textrank-algorithm
Textrank4zh
🌳 Automatically extracts keywords and summaries from Chinese text
Stars: ✭ 2,518 (+8582.76%)
Mutual labels:  textrank, textrank-algorithm
Persian-Summarization
Statistical and Semantical Text Summarizer in Persian Language
Stars: ✭ 38 (+31.03%)
Mutual labels:  text-summarization, textrank-algorithm
Brief
In a nutshell, this is a Text Summarizer
Stars: ✭ 29 (+0%)
Mutual labels:  text-summarization, text-summarizer
TextRankPlus
A deep-learning-based Chinese NLP toolkit
Stars: ✭ 36 (+24.14%)
Mutual labels:  textrank, textrank-algorithm
allsummarizer
Multilingual automatic text summarizer using statistical approach and extraction
Stars: ✭ 28 (-3.45%)
Mutual labels:  text-summarization
PlanSum
[AAAI2021] Unsupervised Opinion Summarization with Content Planning
Stars: ✭ 25 (-13.79%)
Mutual labels:  text-summarization
frog
Frog is an integration of memory-based natural language processing (NLP) modules developed for Dutch. All NLP modules are based on Timbl, the Tilburg memory-based learning software package.
Stars: ✭ 70 (+141.38%)
Mutual labels:  pos-tagger
NLP-paper
🎨🎨 NLP (natural language processing) tutorial 🎨🎨 https://dataxujing.github.io/NLP-paper/
Stars: ✭ 23 (-20.69%)
Mutual labels:  textrank
weibo-summary
Chinese Microblog (Weibo) Automatic Summary System
Stars: ✭ 28 (-3.45%)
Mutual labels:  textrank
gazeta
Gazeta: Dataset for automatic summarization of Russian news
Stars: ✭ 25 (-13.79%)
Mutual labels:  text-summarization
Intelligent Document Finder
Document Search Engine Tool
Stars: ✭ 45 (+55.17%)
Mutual labels:  text-summarization
datalinguist
Stanford CoreNLP in idiomatic Clojure.
Stars: ✭ 93 (+220.69%)
Mutual labels:  pos-tagger
xl-sum
This repository contains the code, data, and models of the paper titled "XL-Sum: Large-Scale Multilingual Abstractive Summarization for 44 Languages" published in Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021.
Stars: ✭ 160 (+451.72%)
Mutual labels:  text-summarization
Text-Summarization
Abstractive and Extractive Text summarization using Transformers.
Stars: ✭ 38 (+31.03%)
Mutual labels:  text-summarization
pytorch-translm
An implementation of transformer-based language model for sentence rewriting tasks such as summarization, simplification, and grammatical error correction.
Stars: ✭ 22 (-24.14%)
Mutual labels:  text-summarization
Keywords-Abstract-TFIDF-TextRank4ZH
Extracts keywords and summaries from Chinese text using tf-idf, TextRank4ZH, and other methods
Stars: ✭ 26 (-10.34%)
Mutual labels:  textrank
NLP Toolkit
Library of state-of-the-art models (PyTorch) for NLP tasks
Stars: ✭ 92 (+217.24%)
Mutual labels:  text-summarization
nltk-maxent-pos-tagger
maximum entropy based part-of-speech tagger for NLTK
Stars: ✭ 45 (+55.17%)
Mutual labels:  pos-tagger
Text-Summarization-Repo
A repository organizing major research topics in text summarization, must-read papers, and available models and data, together with recommended resources.
Stars: ✭ 213 (+634.48%)
Mutual labels:  text-summarization


TextSummarizer

This is a C# implementation of automatic text summarization and keyword extraction based on the TextRank algorithm [1]. The original paper can be found here. The project came out of an initiative to improve the open-source library ecosystem for C# and is inspired by one of the popular TextRank implementations for Python.
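To illustrate what the algorithm does, here is a minimal sketch of TextRank sentence ranking following the paper: sentences become graph nodes, edges are weighted by word overlap, and a PageRank-style iteration scores each sentence. This is written in Python for brevity and is not this library's C# API; the example sentences are made up.

```python
# Minimal sketch of TextRank sentence ranking (Mihalcea & Tarau, 2004).
# Not this library's API; constants (d = 0.85) follow the PageRank convention.
import math
import re

def similarity(s1, s2):
    """Word-set overlap, normalized by log sentence lengths (as in the paper)."""
    w1, w2 = set(s1), set(s2)
    overlap = len(w1 & w2)
    if overlap == 0 or len(w1) <= 1 or len(w2) <= 1:
        return 0.0
    return overlap / (math.log(len(w1)) + math.log(len(w2)))

def textrank(sentences, d=0.85, iterations=30):
    """Score sentences by iterating on the weighted similarity graph."""
    n = len(sentences)
    tokens = [re.findall(r"\w+", s.lower()) for s in sentences]
    weights = [[similarity(tokens[i], tokens[j]) if i != j else 0.0
                for j in range(n)] for i in range(n)]
    scores = [1.0] * n
    for _ in range(iterations):
        new_scores = []
        for i in range(n):
            rank = 0.0
            for j in range(n):
                if weights[j][i] > 0:
                    # Contribution of j, normalized by j's total edge weight.
                    rank += weights[j][i] / sum(weights[j]) * scores[j]
            new_scores.append((1 - d) + d * rank)
        scores = new_scores
    return scores

sentences = [
    "TextRank is a graph-based ranking model for text processing.",
    "The graph-based ranking model scores sentences by their similarity.",
    "Cats sleep most of the day.",
]
scores = textrank(sentences)
best = sentences[scores.index(max(scores))]
```

A summary is then produced by taking the top-scoring sentences in their original order; keyword extraction works the same way, but with words as nodes and co-occurrence within a window as edges.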

Documentation

Detailed documentation is coming very shortly! 😃

References

[1] R. Mihalcea and P. Tarau, "TextRank: Bringing Order into Text", Proceedings of the 2004 Conference on Empirical Methods in Natural Language Processing (EMNLP 2004), 2004. [URL]

License

TextSummarizer is Open Source software released under the MIT License.

Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].