
google / deepconsensus

License: BSD-3-Clause
DeepConsensus uses gap-aware sequence transformers to correct errors in Pacific Biosciences (PacBio) Circular Consensus Sequencing (CCS) data.

Programming Languages

Python: 139,335 projects (#7 most used programming language)
Jupyter Notebook: 11,667 projects
Shell: 77,523 projects
Dockerfile: 14,818 projects

Projects that are alternatives to or similar to deepconsensus

Introduction-to-Deep-Learning-and-Neural-Networks-Course
Code snippets and solutions for the Introduction to Deep Learning and Neural Networks course hosted on educative.io
Stars: ✭ 33 (-73.39%)
Mutual labels:  transformers
RETRO-pytorch
Implementation of RETRO, DeepMind's retrieval-based attention net, in PyTorch
Stars: ✭ 473 (+281.45%)
Mutual labels:  transformers
transformers-interpret
Model explainability that works seamlessly with 🤗 transformers. Explain your transformers model in just 2 lines of code.
Stars: ✭ 861 (+594.35%)
Mutual labels:  transformers
X-Transformer
X-Transformer: Taming Pretrained Transformers for eXtreme Multi-label Text Classification
Stars: ✭ 127 (+2.42%)
Mutual labels:  transformers
BottleneckTransformers
Bottleneck Transformers for Visual Recognition
Stars: ✭ 231 (+86.29%)
Mutual labels:  transformers
text-classification-transformers
Easy text classification for everyone: BERT-based models via Hugging Face transformers (KR/EN)
Stars: ✭ 32 (-74.19%)
Mutual labels:  transformers
oreilly-bert-nlp
This repository contains code for the O'Reilly Live Online Training for BERT
Stars: ✭ 19 (-84.68%)
Mutual labels:  transformers
anonymisation
Anonymization of legal cases (Fr) based on Flair embeddings
Stars: ✭ 85 (-31.45%)
Mutual labels:  transformers
Chinese-Minority-PLM
CINO: Pre-trained Language Models for Chinese Minority Languages
Stars: ✭ 133 (+7.26%)
Mutual labels:  transformers
elastic transformers
Making BERT stretchy. Semantic Elasticsearch with Sentence Transformers
Stars: ✭ 153 (+23.39%)
Mutual labels:  transformers
transformers-lightning
A collection of Models, Datasets, DataModules, Callbacks, Metrics, Losses and Loggers to better integrate pytorch-lightning with transformers.
Stars: ✭ 45 (-63.71%)
Mutual labels:  transformers
code-transformer
Implementation of the paper "Language-agnostic representation learning of source code from structure and context".
Stars: ✭ 130 (+4.84%)
Mutual labels:  transformers
transformer generalization
The official repository for our paper "The Devil is in the Detail: Simple Tricks Improve Systematic Generalization of Transformers". We significantly improve the systematic generalization of transformer models on a variety of datasets using simple tricks and careful considerations.
Stars: ✭ 58 (-53.23%)
Mutual labels:  transformers
Ask2Transformers
A framework for textual-entailment-based zero-shot text classification
Stars: ✭ 102 (-17.74%)
Mutual labels:  transformers
wechsel
Code for WECHSEL: Effective initialization of subword embeddings for cross-lingual transfer of monolingual language models.
Stars: ✭ 39 (-68.55%)
Mutual labels:  transformers
ginza-transformers
Use custom tokenizers in spacy-transformers
Stars: ✭ 15 (-87.9%)
Mutual labels:  transformers
backprop
Backprop makes it simple to use, finetune, and deploy state-of-the-art ML models.
Stars: ✭ 229 (+84.68%)
Mutual labels:  transformers
Text-Summarization
Abstractive and extractive text summarization using Transformers.
Stars: ✭ 38 (-69.35%)
Mutual labels:  transformers
Transformers-Tutorials
This repository contains demos I made with the Transformers library by HuggingFace.
Stars: ✭ 2,828 (+2180.65%)
Mutual labels:  transformers
pysentimiento
A Python multilingual toolkit for Sentiment Analysis and Social NLP tasks
Stars: ✭ 274 (+120.97%)
Mutual labels:  transformers

DeepConsensus

DeepConsensus uses gap-aware sequence transformers to correct errors in Pacific Biosciences (PacBio) Circular Consensus Sequencing (CCS) data.

This results in a greater yield of high-quality reads. See the yield metrics page for results on three full SMRT Cells with different chemistries and read-length distributions.

Usage

See the quick start for how to run DeepConsensus, along with guidance on how to shard and parallelize most effectively.
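
For orientation, here is a minimal sketch of a DeepConsensus invocation. The flag names follow the v0.3 quick start, but they may differ between releases, and the BAM paths and model checkpoint directory are placeholders for your own data; treat the quick start as the authoritative command.

# Sketch of a minimal run (flags per the v0.3 quick start; paths are placeholders):
deepconsensus run \
  --subreads_to_ccs=subreads_to_ccs.bam \
  --ccs_bam=ccs.bam \
  --checkpoint=model/checkpoint \
  --output=output.fastq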

ccs settings matter

To get the most out of DeepConsensus, we highly recommend running ccs with the parameters given in the quick start. By default, ccs filters out reads below a predicted quality of 20, and DeepConsensus cannot rescue reads that have already been discarded. The runtime of ccs is low enough that this extra step is worthwhile whenever you use DeepConsensus.
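
Concretely, the key change is lowering ccs's predicted-quality filter so that lower-quality reads are kept for DeepConsensus to polish. A sketch, assuming the --min-rq=0.88 value used in the quick start (check the quick start for the exact parameters for your version):

# Keep reads below the default Q20 filter so DeepConsensus can rescue them:
ccs --min-rq=0.88 -j "$(nproc)" subreads.bam ccs.bam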

Compute setup

The recommended compute setup for DeepConsensus is to shard each SMRT Cell into at least 500 shards, each of which can run on a 16-CPU machine (or smaller). We find that having more than 16 CPUs available for each shard does not significantly improve runtime. See the runtime metrics page for more information.
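
ccs has built-in chunking, which makes this sharding straightforward. A sketch, assuming ccs's --chunk i/N option and a shard count of 500; how shards are scheduled across machines is up to your cluster setup:

# Process shard i of N independently (e.g. one 16-CPU machine or job per shard):
N=500
i=1  # this shard's index, 1..N
ccs --min-rq=0.88 --chunk="${i}"/"${N}" -j 16 subreads.bam "ccs.${i}.bam"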

Where does DeepConsensus fit into my pipeline?

After a PacBio sequencing run, DeepConsensus is meant to be run on the subreads to create new corrected reads in FASTQ format that can take the place of the CCS/HiFi reads for downstream analyses.
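
Put together, the upstream steps are: run ccs on the subreads, align the subreads back to the draft CCS reads, then run DeepConsensus on the pair. A sketch, assuming the actc aligner used in the quick start (paths are placeholders):

# 1. Draft consensus with a relaxed quality filter (see above):
ccs --min-rq=0.88 subreads.bam ccs.bam
# 2. Align the raw subreads to the draft CCS reads:
actc subreads.bam ccs.bam subreads_to_ccs.bam
# 3. Polish with DeepConsensus (see the run command sketched above);
#    the resulting FASTQ replaces the CCS/HiFi reads downstream.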

For variant-calling downstream

For context, we are the team that created and maintains both DeepConsensus and DeepVariant. For variant calling with DeepVariant, we tested different models and found that performance is best with DeepVariant v1.4 using the normal PacBio model rather than the model trained on DeepConsensus v0.1 output. We plan to include DeepConsensus v0.3 outputs when training the next DeepVariant model, so if a DeepVariant version later than v1.4 is available when you read this, we recommend using that latest version.

For assembly downstream

We have confirmed that v0.3 outperforms v0.2 in terms of downstream assembly contiguity and accuracy. See the assembly metrics page for details.

How to cite

If you are using DeepConsensus in your work, please cite:

DeepConsensus: Gap-Aware Sequence Transformers for Sequence Correction

How DeepConsensus works

DeepConsensus overview diagram

Watch How DeepConsensus Works for a quick overview.

See this notebook to inspect some example model inputs and outputs.

Installation

From pip package

If you're on a GPU machine:

pip install deepconsensus[gpu]==0.3.1
# To make sure the `deepconsensus` CLI works, set the PATH:
export PATH="/home/${USER}/.local/bin:${PATH}"

If you're on a CPU machine:

pip install deepconsensus[cpu]==0.3.1
# To make sure the `deepconsensus` CLI works, set the PATH:
export PATH="/home/${USER}/.local/bin:${PATH}"
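
After either install, a quick way to confirm the CLI is on your PATH is to ask it for help; this assumes the run subcommand shown in the quick start.

# Sanity check: should print the available flags rather than "command not found":
deepconsensus run --help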

From Docker image

For GPU:

sudo docker pull google/deepconsensus:0.3.1-gpu

For CPU:

sudo docker pull google/deepconsensus:0.3.1

From source

git clone https://github.com/google/deepconsensus.git
cd deepconsensus
source install.sh

If you have a GPU, run `source install-gpu.sh` instead. Currently, the only difference is that the GPU version installs tensorflow-gpu instead of intel-tensorflow.

(Optional) After running `source install.sh`, you can run all unit tests with:

./run_all_tests.sh

Disclaimer

This is not an official Google product.

NOTE: the content of this research code repository (i) is not intended to be a medical device; and (ii) is not intended for clinical use of any kind, including but not limited to diagnosis or prognosis.

Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].