
KlugerLab / ALRA

License: MIT
Imputation method for scRNA-seq based on low-rank approximation

Programming Languages

r

Projects that are alternatives of or similar to ALRA

TotalLeastSquares.jl
Solve many kinds of least-squares and matrix-recovery problems
Stars: ✭ 23 (-52.08%)
Mutual labels:  imputation, matrix-completion
Ailearnnotes
Artificial Intelligence Learning Notes.
Stars: ✭ 195 (+306.25%)
Mutual labels:  dropout
Theano lstm
🔬 Nano size Theano LSTM module
Stars: ✭ 310 (+545.83%)
Mutual labels:  dropout
Icellr
Single (i) Cell R package (iCellR) is an interactive R package for working with high-throughput single-cell sequencing technologies (i.e., scRNA-seq, scVDJ-seq, ST, and CITE-seq).
Stars: ✭ 80 (+66.67%)
Mutual labels:  dropout
Dropblock
Implementation of DropBlock: A regularization method for convolutional networks in PyTorch.
Stars: ✭ 466 (+870.83%)
Mutual labels:  dropout
Lstms.pth
PyTorch implementations of LSTM Variants (Dropout + Layer Norm)
Stars: ✭ 111 (+131.25%)
Mutual labels:  dropout
dropclass speaker
DropClass and DropAdapt - repository for the paper accepted to Speaker Odyssey 2020
Stars: ✭ 20 (-58.33%)
Mutual labels:  dropout
cardelino
Clone identification from single-cell data
Stars: ✭ 49 (+2.08%)
Mutual labels:  scrna-seq
Tensorflow Mnist Cnn
MNIST classification using a Convolutional Neural Network. Various techniques such as data augmentation, dropout, and batch normalization are implemented.
Stars: ✭ 182 (+279.17%)
Mutual labels:  dropout
Deep Learning 101
The tools and syntax you need to code neural networks from day one.
Stars: ✭ 59 (+22.92%)
Mutual labels:  dropout
Cplxmodule
Complex-valued neural networks for pytorch and Variational Dropout for real and complex layers.
Stars: ✭ 51 (+6.25%)
Mutual labels:  dropout
Satania.moe
Satania IS the BEST waifu, no really, she is, if you don't believe me, this website will convince you
Stars: ✭ 486 (+912.5%)
Mutual labels:  dropout
Daguan 2019 rank9
datagrand 2019 information extraction competition rank9
Stars: ✭ 121 (+152.08%)
Mutual labels:  dropout
Tensorflow Tutorial
Tensorflow tutorial from basic to hard; Chinese-language AI lessons by 莫烦Python.
Stars: ✭ 4,122 (+8487.5%)
Mutual labels:  dropout
Targeted Dropout
Complementary code for the Targeted Dropout paper
Stars: ✭ 251 (+422.92%)
Mutual labels:  dropout
Deepnet
Implementation of CNNs, RNNs, and many deep learning techniques in plain Numpy.
Stars: ✭ 285 (+493.75%)
Mutual labels:  dropout
Svhn Cnn
Google Street View House Numbers (SVHN) dataset, classified with a CNN.
Stars: ✭ 44 (-8.33%)
Mutual labels:  dropout
Sdr Densenet Pytorch
Stochastic Delta Rule implemented in Pytorch on DenseNet
Stars: ✭ 102 (+112.5%)
Mutual labels:  dropout
Sierra
Discover differential transcript usage from polyA-captured single cell RNA-seq data
Stars: ✭ 37 (-22.92%)
Mutual labels:  scrna-seq
matrix-completion
Lightweight Python library for in-memory matrix completion.
Stars: ✭ 94 (+95.83%)
Mutual labels:  matrix-completion

Adaptively-thresholded Low Rank Approximation (ALRA)

Introduction

ALRA is a method for imputing missing values in single-cell RNA-sequencing data, described in the preprint "Zero-preserving imputation of scRNA-seq data using low-rank approximation," available here. Given a scRNA-seq expression matrix, ALRA first computes its rank-k approximation using randomized SVD. Next, each gene is thresholded by the magnitude of its most negative value, so that entries that should be zero stay zero. Finally, the matrix is rescaled.
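The three steps can be sketched in a few lines of NumPy. This is an illustrative simplification, not the authors' R implementation: alra_sketch is a hypothetical helper, a plain SVD stands in for randomized SVD, and the rescaling step is reduced to matching the mean of each gene's nonzero values.

```python
import numpy as np

def alra_sketch(A, k):
    """Illustrative sketch of ALRA's three steps (hypothetical helper,
    not the authors' R code). A: cells x genes, nonnegative, normalized."""
    # Step 1: rank-k approximation. The real method uses randomized SVD
    # (the rsvd package); a plain SVD is used here for simplicity.
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    A_k = (U[:, :k] * s[:k]) @ Vt[:k, :]

    # Step 2: threshold each gene (column) at the magnitude of its most
    # negative value, restoring to zero the entries that the low-rank
    # smoothing pulled away from zero.
    thresh = np.abs(np.minimum(A_k.min(axis=0), 0.0))
    A_thr = np.where(A_k > thresh, A_k, 0.0)

    # Step 3 (simplified): rescale each gene so the mean of its nonzero
    # entries matches the mean of the originally observed nonzero entries.
    for j in range(A.shape[1]):
        obs = A[A[:, j] > 0, j]
        imp = A_thr[A_thr[:, j] > 0, j]
        if obs.size and imp.size:
            A_thr[:, j] *= obs.mean() / imp.mean()
    return A_thr
```

The full method additionally chooses k automatically from the spacing of the singular values and matches both the mean and standard deviation of each gene's nonzero values; see the preprint for details.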

ALRA schematic

This repository contains code for running ALRA in R. The only prerequisite is the randomized SVD package rsvd, which can be installed with install.packages('rsvd').

The functions now have a use.mkl flag for users who have installed rpca-mkl, which allows dramatic speedups over the default version. Note that rpca-mkl is still under development and is not on CRAN, so it is not a required package; users who have already installed it can enable it by setting use.mkl to TRUE.

Usage

Please be sure to pass ALRA a matrix where the cells are rows and genes are columns.

ALRA can be used as follows:

# Let A_norm be a normalized expression matrix where cells are rows and genes are columns.
# We use library and log normalization, but other approaches may also work well.
result.completed <- alra(A_norm)
A_norm_completed <- result.completed[[3]]  # the completed (imputed) matrix is the third element of the returned list

See alra_test.R for a complete example.

ALRA is integrated into Seurat v3.0 (currently at the pre-release stage) as the function RunALRA(). If you use Seurat v2, alraSeurat2.R provides a simple function to run ALRA on a Seurat v2 object.

ALRA is supported on OS X, Linux, and Windows. It has been tested on macOS Mojave (10.14) and Ubuntu 16.04. Installation should take no more than a minute or two.
