
CODARcode / MGARD

License: Apache-2.0
MGARD: MultiGrid Adaptive Reduction of Data

Programming Languages

C++
36643 projects - #6 most used programming language
Cuda
1817 projects
CMake
9771 projects

Projects that are alternatives of or similar to MGARD

zstd-rs
zstd-decoder in pure rust
Stars: ✭ 148 (+825%)
Mutual labels:  compression
power-gzip
POWER9 gzip engine documentation and code samples
Stars: ✭ 16 (+0%)
Mutual labels:  compression
compressstream-explainer
Compression Streams Explained
Stars: ✭ 22 (+37.5%)
Mutual labels:  compression
Spring
FASTQ compression
Stars: ✭ 71 (+343.75%)
Mutual labels:  compression
fasterai1
FasterAI: A repository for making smaller and faster models with the FastAI library.
Stars: ✭ 34 (+112.5%)
Mutual labels:  compression
fpzip
Cython bindings for fpzip, a floating point image compression algorithm.
Stars: ✭ 24 (+50%)
Mutual labels:  compression
NBT
A java implementation of the NBT protocol, including a way to implement custom tags.
Stars: ✭ 128 (+700%)
Mutual labels:  compression
supersnappy
Dependency-free and performant Nim Snappy implementation.
Stars: ✭ 55 (+243.75%)
Mutual labels:  compression
go7z
A native Go 7z archive reader.
Stars: ✭ 46 (+187.5%)
Mutual labels:  compression
salvador
A free, open-source compressor for the ZX0 format
Stars: ✭ 35 (+118.75%)
Mutual labels:  compression
laravel-Packer
CSS, Javascript and Images packer/processors to Laravel
Stars: ✭ 57 (+256.25%)
Mutual labels:  compression
nason
🗜 Ultra tiny serializer / encoder with plugin-support. Useful to build binary files containing images, strings, numbers and more!
Stars: ✭ 30 (+87.5%)
Mutual labels:  compression
ParallelUtilities.jl
Fast and easy parallel mapreduce on HPC clusters
Stars: ✭ 28 (+75%)
Mutual labels:  reduction
web-config
A Rollup configuration to build modern web applications with sweet features as for example SCSS imports, Service Worker generation with Workbox, Karma testing, live reloading, coping resources, chunking, treeshaking, Typescript, license extraction, filesize visualizer, JSON import, budgets, build progress, minifying and compression with brotli a…
Stars: ✭ 17 (+6.25%)
Mutual labels:  compression
VszLib
7-zip VB6 Helper
Stars: ✭ 35 (+118.75%)
Mutual labels:  compression
lossyless
Generic image compressor for machine learning. Pytorch code for our paper "Lossy compression for lossless prediction".
Stars: ✭ 81 (+406.25%)
Mutual labels:  compression
Natours
An awesome tour booking web app written in NodeJS, Express, MongoDB 🗽
Stars: ✭ 94 (+487.5%)
Mutual labels:  compression
Binary-Neural-Networks
Implemented here a Binary Neural Network (BNN) achieving nearly state-of-art results but recorded a significant reduction in memory usage and total time taken during training the network.
Stars: ✭ 55 (+243.75%)
Mutual labels:  reduction
TranscodingStreams.jl
Simple, consistent interfaces for any codec.
Stars: ✭ 71 (+343.75%)
Mutual labels:  compression
qlZipInfo
MacOSX QuickLook Generator for zip, jar, tar, tar.gz (.tgz), tar.bz2 (.tbz2/.tbz), tar.Z. xar (.xar, .pkg), debian (.deb), RedHat Package Manager (.rpm), 7zip (.7z), xz, Microsoft cabinet (.cab), gzip (.gz), lha, BinHex 4.0 (.hqx), and Stuffit (.sit) archives, and ISO9660 images
Stars: ✭ 47 (+193.75%)
Mutual labels:  compression

MGARD

MGARD (MultiGrid Adaptive Reduction of Data) is a technique for multilevel lossy compression of scientific data based on the theory of multigrid methods. We encourage you to open a GitHub issue if you run into any problems using MGARD or have questions or suggestions.

Building and Installing

To build and install MGARD, run the following from the root of the repository. You will need CMake and Protobuf.

$ cmake -S . -B build -D CMAKE_INSTALL_PREFIX=<location to install MGARD>
$ cmake --build build
$ cmake --install build
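If CMake cannot locate a dependency such as Protobuf automatically, you can point it at an installation with the standard CMAKE_PREFIX_PATH variable. This is ordinary CMake usage rather than an MGARD-specific option, and the path below is only a placeholder.

$ cmake -S . -B build -D CMAKE_INSTALL_PREFIX=<location to install MGARD> -D CMAKE_PREFIX_PATH=<location of Protobuf installation>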

Documentation

To build the documentation, run cmake with -D MGARD_ENABLE_DOCS=ON. You will need Doxygen. The documentation will be installed to ${CMAKE_INSTALL_PREFIX}/share/doc/MGARD/ by default. Open index.html in a browser to read it.
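For example, a documentation-enabled build can be configured and installed with the same commands as above, with the documentation option added.

$ cmake -S . -B build -D MGARD_ENABLE_DOCS=ON -D CMAKE_INSTALL_PREFIX=<location to install MGARD>
$ cmake --build build
$ cmake --install build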

Benchmarks

To build the benchmarks, run cmake with -D MGARD_ENABLE_BENCHMARKS=ON. You will need Google Benchmark. You can then run the benchmarks with build/bin/benchmarks.
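Concretely, a benchmark build and run might look like the following; the set of benchmarks reported by Google Benchmark depends on your build.

$ cmake -S . -B build -D MGARD_ENABLE_BENCHMARKS=ON
$ cmake --build build
$ build/bin/benchmarks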

Including and Linking

The API consists of a header file compress.hpp providing declarations for function templates mgard::compress and mgard::decompress. See the header for documentation of these templates.
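As an illustration, the sketch below compresses and decompresses a small 2D array using the hierarchy-based overloads declared in compress.hpp. The class and member names used here (TensorMeshHierarchy, CompressedDataset, DecompressedDataset, size(), data()) reflect recent MGARD releases and may differ in your version; treat this as a sketch and consult the installed header for the authoritative signatures.

#include <cmath>
#include <cstddef>
#include <iostream>
#include <vector>

#include <mgard/compress.hpp>

int main() {
  // A small synthetic 2D dataset; the sizes and values are arbitrary.
  const std::size_t rows = 100;
  const std::size_t cols = 100;
  std::vector<double> u(rows * cols);
  for (std::size_t i = 0; i < rows; ++i) {
    for (std::size_t j = 0; j < cols; ++j) {
      u[cols * i + j] = std::sin(0.1 * i) * std::cos(0.1 * j);
    }
  }

  // Mesh hierarchy describing the grid on which the data is defined.
  const mgard::TensorMeshHierarchy<2, double> hierarchy({rows, cols});

  // Smoothness parameter and absolute error tolerance for the compression.
  const double s = 0;
  const double tolerance = 1e-3;

  // Compress, report the compressed size, and decompress.
  const mgard::CompressedDataset<2, double> compressed =
      mgard::compress(hierarchy, u.data(), s, tolerance);
  std::cout << "compressed size: " << compressed.size() << " bytes" << std::endl;

  const mgard::DecompressedDataset<2, double> decompressed =
      mgard::decompress(compressed);
  // decompressed.data() points to the reconstructed values.
  return 0;
}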

To use MGARD in your project, you will need to tell your compiler where to find the MGARD headers (by default, ${CMAKE_INSTALL_PREFIX}/include/mgard/) and library (by default, ${CMAKE_INSTALL_PREFIX}/lib/). If you're using CMake, you can call find_package(mgard) and add a dependency to the mgard::mgard imported target. See the examples directory for a basic example.
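A minimal CMakeLists.txt for such a project might look like the sketch below. The project and target names are placeholders; only find_package(mgard) and the mgard::mgard imported target come from MGARD itself.

cmake_minimum_required(VERSION 3.19)
project(mgard_example LANGUAGES CXX)

# Locate the installed MGARD package. If it is not in a default search
# location, pass -D CMAKE_PREFIX_PATH=<MGARD install prefix> to cmake.
find_package(mgard REQUIRED)

add_executable(example main.cpp)

# The imported target carries the include directories and link libraries.
target_link_libraries(example PRIVATE mgard::mgard)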

Command Line Interface

To build the command line interface, run cmake with -D MGARD_ENABLE_CLI=ON. You will need TCLAP. A convenience executable called mgard will be built and installed to ${CMAKE_INSTALL_PREFIX}/bin/ by default. You can get help with the CLI by running the following commands.

$ mgard --help
$ man mgard

This executable is an experimental part of the API.

Accelerated and portable compression

MGARD-X is designed for portable compression on NVIDIA GPUs, AMD GPUs, and CPUs. See the detailed user guide here.

CUDA accelerated compression

MGARD-GPU is designed to accelerate compression specifically on NVIDIA GPUs. See the detailed user guide here.

Fine-grain progressive data reconstruction

MDR and MDR-X enable fine-grain data refactoring and progressive data reconstruction. See the detailed user guide here.

References

MGARD's theoretical foundation and software implementation are discussed in the following papers. Reference [2] covers the simplest case and is a natural starting point. Reference [6] covers the design and implementation on heterogeneous GPU systems.

  1. Ben Whitney. Multilevel Techniques for Compression and Reduction of Scientific Data. PhD thesis, Brown University, 2018.
  2. Mark Ainsworth, Ozan Tugluk, Ben Whitney, and Scott Klasky. Multilevel Techniques for Compression and Reduction of Scientific Data—The Univariate Case. Computing and Visualization in Science 19, 65–76, 2018.
  3. Mark Ainsworth, Ozan Tugluk, Ben Whitney, and Scott Klasky. Multilevel Techniques for Compression and Reduction of Scientific Data—The Multivariate Case. SIAM Journal on Scientific Computing 41 (2), A1278–A1303, 2019.
  4. Mark Ainsworth, Ozan Tugluk, Ben Whitney, and Scott Klasky. Multilevel Techniques for Compression and Reduction of Scientific Data—Quantitative Control of Accuracy in Derived Quantities. SIAM Journal on Scientific Computing 41 (4), A2146–A2171, 2019.
  5. Mark Ainsworth, Ozan Tugluk, Ben Whitney, and Scott Klasky. Multilevel Techniques for Compression and Reduction of Scientific Data—The Unstructured Case. SIAM Journal on Scientific Computing, 42 (2), A1402–A1427, 2020.
  6. Jieyang Chen et al. Accelerating Multigrid-based Hierarchical Scientific Data Refactoring on GPUs. 35th IEEE International Parallel & Distributed Processing Symposium, May 17–21, 2021.