
prajjwal1 / Fluence

License: Apache-2.0
A deep learning library based on PyTorch, focused on low-resource language research and robustness

Programming Languages

Python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to Fluence

Awesome Fast Attention
List of efficient attention modules
Stars: ✭ 627 (+1061.11%)
Mutual labels:  attention
Isab Pytorch
An implementation of (Induced) Set Attention Block, from the Set Transformers paper
Stars: ✭ 21 (-61.11%)
Mutual labels:  attention
Biblosa Pytorch
Re-implementation of Bi-Directional Block Self-Attention for Fast and Memory-Efficient Sequence Modeling (T. Shen et al., ICLR 2018) in PyTorch.
Stars: ✭ 43 (-20.37%)
Mutual labels:  attention
Nlp paper study
Close readings of top-conference papers and reproductions of their code
Stars: ✭ 691 (+1179.63%)
Mutual labels:  attention
Cell Detr
Official and maintained implementation of the paper Attention-Based Transformers for Instance Segmentation of Cells in Microstructures [BIBM 2020].
Stars: ✭ 26 (-51.85%)
Mutual labels:  attention
Defactonlp
DeFactoNLP: An Automated Fact-checking System that uses Named Entity Recognition, TF-IDF vector comparison and Decomposable Attention models.
Stars: ✭ 30 (-44.44%)
Mutual labels:  attention
Simplecvreproduction
Reproductions of simple CV projects, including attention modules, classification, object detection, segmentation, keypoint detection, tracking 😄, etc.
Stars: ✭ 602 (+1014.81%)
Mutual labels:  attention
Text Classification Keras
📚 Text classification library with Keras
Stars: ✭ 53 (-1.85%)
Mutual labels:  attention
Nlp tensorflow project
NLP projects implemented in TensorFlow, e.g. classification, chatbot, NER, attention, QA, etc.
Stars: ✭ 27 (-50%)
Mutual labels:  attention
Attentions
PyTorch implementations of various attention mechanisms for deep learning researchers.
Stars: ✭ 39 (-27.78%)
Mutual labels:  attention
Tf Rnn Attention
TensorFlow implementation of an attention mechanism for text-classification tasks.
Stars: ✭ 735 (+1261.11%)
Mutual labels:  attention
Pytorch Gat
My implementation of the original GAT paper (Veličković et al.). I've additionally included the playground.py file for visualizing the Cora dataset, GAT embeddings, an attention mechanism, and entropy histograms. I've supported both Cora (transductive) and PPI (inductive) examples!
Stars: ✭ 908 (+1581.48%)
Mutual labels:  attention
Attentive Neural Processes
Implementation of "Recurrent Attentive Neural Processes" to forecast power usage (with an LSTM baseline and MC Dropout)
Stars: ✭ 33 (-38.89%)
Mutual labels:  attention
Text Classification
Implementations of papers for the text-classification task on DBpedia
Stars: ✭ 682 (+1162.96%)
Mutual labels:  attention
Sentences pair similarity calculation siamese lstm
A Keras implementation of an attention-based Siamese Manhattan LSTM
Stars: ✭ 48 (-11.11%)
Mutual labels:  attention
Vad
Voice activity detection (VAD) toolkit including DNN, bDNN, LSTM and ACAM based VAD. We also provide our directly recorded dataset.
Stars: ✭ 622 (+1051.85%)
Mutual labels:  attention
Banglatranslator
Bangla Machine Translator
Stars: ✭ 21 (-61.11%)
Mutual labels:  attention
Pointer Networks Experiments
Sorting numbers with pointer networks
Stars: ✭ 53 (-1.85%)
Mutual labels:  attention
Time Attention
Implementation of an RNN for time-series prediction from the paper https://arxiv.org/abs/1704.02971
Stars: ✭ 52 (-3.7%)
Mutual labels:  attention
Attentioncluster
TensorFlow Implementation of "Attention Clusters: Purely Attention Based Local Feature Integration for Video Classification"
Stars: ✭ 33 (-38.89%)
Mutual labels:  attention




Fluence is a PyTorch-based deep learning library focused on providing computationally efficient, low-resource methods and algorithms for NLP. Although the main focus is supporting transformers for NLP tasks, it can be extended to other domains and architectures as well. It is currently in the pre-alpha stage.

List of implemented papers

- Adaptive Methods
- Debiasing


Why Fluence?

Fluence is targeted towards two main goals:

  1. Compute efficiency: methods for low-resource research (a minimal sketch follows this list).
  2. Robustness: algorithms that either enhance our understanding of current methods or show where SoTA methods fail.
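
As an illustration of the first goal, the sketch below shows an adaptive attention span mask in the spirit of Sukhbaatar et al. (2019), one family of adaptive methods in this space. The class and parameter names are illustrative assumptions, not Fluence's actual API.

```python
import torch
import torch.nn as nn

class AdaptiveSpanMask(nn.Module):
    """Soft, learnable cutoff on how far back attention may reach (hypothetical sketch)."""

    def __init__(self, max_span: int, ramp: int = 32):
        super().__init__()
        self.max_span = max_span
        self.ramp = ramp
        # Learnable fraction of the maximum span (per layer here; per head in the paper).
        self.span_frac = nn.Parameter(torch.tensor(0.5))

    def forward(self, attn_weights: torch.Tensor) -> torch.Tensor:
        # attn_weights: (batch, heads, query_len, key_len), already softmax-normalized.
        key_len = attn_weights.size(-1)
        span = self.span_frac.clamp(0.0, 1.0) * self.max_span
        # Distance of each key position from the most recent position.
        distance = torch.arange(key_len - 1, -1, -1, device=attn_weights.device, dtype=torch.float32)
        # Soft mask: 1 inside the learned span, linearly ramping down to 0 beyond it.
        mask = ((span + self.ramp - distance) / self.ramp).clamp(0.0, 1.0)
        masked = attn_weights * mask
        # Renormalize so each attention row still sums to 1.
        return masked / masked.sum(dim=-1, keepdim=True).clamp_min(1e-8)

# Usage with dummy post-softmax attention weights:
weights = torch.softmax(torch.randn(2, 4, 16, 16), dim=-1)
pruned = AdaptiveSpanMask(max_span=512)(weights)
```

Heads that learn a short span can skip computing scores for distant keys entirely, which is where the compute savings come from.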

It is as straightforward to use as HF Transformers and integrates fully with PyTorch. Please note that the current modules (meta-trainer, siamese-trainer), which rely on the inherited Trainer, work with transformers==3.0; newer versions ship a modified Trainer.
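
If you rely on those trainers, pinning the dependency avoids the Trainer mismatch. This is a plain pip pin, not anything Fluence-specific:

```bash
pip3 install --user "transformers==3.0.*"
```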

Installing

For the stable version:

```bash
pip3 install --user fluence
```

For the development version (recommended):

```bash
git clone https://github.com/prajjwal1/fluence
cd fluence
python3 setup.py install --user
```
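
Either way, a quick import check confirms the installation; this assumes only the package name:

```bash
python3 -c "import fluence"
```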

Overview

The library contains implementations of the following approaches (many more to come):
| Module           | Method with documentation      |
| ---------------- | ------------------------------ |
| fluence.adaptive | Adaptive Methods               |
| fluence.datasets | Datasets                       |
| fluence.optim    | Optimizers                     |
| fluence.sampling | Importance Sampling            |
| fluence.models   | Siamese Methodology, Debiasing |
| fluence.prune    | Pruning                        |
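
To give a flavor of what a module like fluence.sampling covers, here is a minimal sketch of loss-proportional importance sampling for minibatch selection. The function is a hypothetical illustration, not Fluence's actual API.

```python
import torch

def importance_sample(losses: torch.Tensor, k: int) -> torch.Tensor:
    """Draw k example indices with probability proportional to per-example loss."""
    probs = losses / losses.sum()  # normalize losses into a sampling distribution
    return torch.multinomial(probs, num_samples=k, replacement=False)

# Usage: score a large candidate pool cheaply, then train on the sampled subset.
candidate_losses = torch.rand(256)  # stand-in for real per-example losses
chosen = importance_sample(candidate_losses, k=32)
```

Sampling high-loss examples more often concentrates compute where the model is still wrong, which fits the library's low-resource focus.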

Documentation

Please head to this link to learn how you can integrate fluence into your workflow. Since this is an early release, there may be bugs; please file an issue if you encounter one. The docs are a work in progress.

Contribution

You can contribute by filing an issue or sending a pull request (if you encounter a bug or want a feature added). Please check out the contributing guide for more details.

Tests

Fluence comes with an extensive test suite for high test coverage.

```bash
pytest tests/ -v
```

Author: Prajjwal Bhargava (@prajjwal_1)
