
cpraveen / Dflo

Discontinuous Galerkin solver for compressible flows

Projects that are alternatives of or similar to Dflo

mpi-parallelization
Examples for MPI Spawning and Splitting, and the differences between two implementations
Stars: ✭ 16 (-48.39%)
Mutual labels:  mpi
Ucx
Unified Communication X (mailing list - https://elist.ornl.gov/mailman/listinfo/ucx-group)
Stars: ✭ 471 (+1419.35%)
Mutual labels:  mpi
Mpimemu
MPI Memory Consumption Utilities
Stars: ✭ 17 (-45.16%)
Mutual labels:  mpi
Kernels
This is a set of simple programs that can be used to explore the features of a parallel platform.
Stars: ✭ 287 (+825.81%)
Mutual labels:  mpi
Faasm
High-performance stateful serverless runtime based on WebAssembly
Stars: ✭ 403 (+1200%)
Mutual labels:  mpi
John
John the Ripper jumbo - advanced offline password cracker, which supports hundreds of hash and cipher types, and runs on many operating systems, CPUs, GPUs, and even some FPGAs
Stars: ✭ 5,656 (+18145.16%)
Mutual labels:  mpi
pyccel
Python extension language using accelerators
Stars: ✭ 189 (+509.68%)
Mutual labels:  mpi
Pp Mm A03
Parallel Processing - Matrix Multiplication (Cannon, DNS, LUdecomp)
Stars: ✭ 12 (-61.29%)
Mutual labels:  mpi
Libtommath
LibTomMath is a free open source portable number theoretic multiple-precision integer library written entirely in C.
Stars: ✭ 438 (+1312.9%)
Mutual labels:  mpi
Kratos
Kratos Multiphysics (A.K.A Kratos) is a framework for building parallel multi-disciplinary simulation software. Modularity, extensibility and HPC are the main objectives. Kratos has BSD license and is written in C++ with extensive Python interface.
Stars: ✭ 558 (+1700%)
Mutual labels:  mpi
Sevendayshpc
Become a supercomputer programmer in one week!
Stars: ✭ 346 (+1016.13%)
Mutual labels:  mpi
Amgcl
C++ library for solving large sparse linear systems with algebraic multigrid method
Stars: ✭ 390 (+1158.06%)
Mutual labels:  mpi
Elmerfem
Official git repository of Elmer FEM software
Stars: ✭ 523 (+1587.1%)
Mutual labels:  mpi
Mpich
Official MPICH Repository
Stars: ✭ 275 (+787.1%)
Mutual labels:  mpi
Edge
Extreme-scale Discontinuous Galerkin Environment (EDGE)
Stars: ✭ 18 (-41.94%)
Mutual labels:  mpi
Torsten
library of C++ functions that support applications of Stan in Pharmacometrics
Stars: ✭ 38 (+22.58%)
Mutual labels:  mpi
Easylambda
distributed dataflows with functional list operations for data processing with C++14
Stars: ✭ 475 (+1432.26%)
Mutual labels:  mpi
Prpl
parallel Raster Processing Library (pRPL) is a MPI-enabled C++ programming library that provides easy-to-use interfaces to parallelize raster/image processing algorithms
Stars: ✭ 15 (-51.61%)
Mutual labels:  mpi
Esmpy Tutorial
Basic tutorial for ESMPy Python package
Stars: ✭ 22 (-29.03%)
Mutual labels:  mpi
Ohpc
OpenHPC Integration, Packaging, and Test Repo
Stars: ✭ 544 (+1654.84%)
Mutual labels:  mpi

dflo

Discontinuous Galerkin solver for compressible flows. Some features of the code are:

  • Cartesian and quadrilateral cells
  • Qk basis: nodal Lagrange polynomials on Gauss points
  • Pk basis: Legendre polynomials
  • TVB limiter
  • Positivity preserving limiter
  • Flux functions: Lax-Friedrichs, Roe, HLLC, KFVS

### Getting dflo

git clone https://github.com/cpraveen/dflo

To compile the code in "src"

You must first compile deal.II with the Trilinos library, UMFPACK, and thread support. Serial versions of these libraries are sufficient, since the code in "src" is serial.

  • cd dflo/src
  • Set the environment variable DEAL_II_DIR to your deal.II installation directory. It is convenient to set this in your .bashrc file.
  • cmake .
  • make

By default, this compiles a DEBUG version of the code, which is good for code development but runs very slowly. To compile an optimized version, run

make release
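Putting the steps above together, a complete serial build might look like the following sketch (the deal.II installation path is an example; adjust it for your system):

```shell
# Example path; point DEAL_II_DIR at your actual deal.II installation
export DEAL_II_DIR=$HOME/deal.II

git clone https://github.com/cpraveen/dflo
cd dflo/src
cmake .        # configure; picks up DEAL_II_DIR from the environment
make release   # switch from the default DEBUG mode to optimized mode
make           # build the dflo executable
```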

To compile the code in "src-mpi"

This is the MPI version, which uses p4est, so you must install p4est and compile deal.II with p4est support. This version does not require Trilinos.

To install p4est, see this page:

https://www.dealii.org/developer/external-libs/p4est.html
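The page above describes deal.II's helper script for building p4est in both debug and optimized configurations. A rough sketch of those steps follows; the p4est version and install path are examples only:

```shell
# Download a p4est release tarball (version number is an example)
wget https://p4est.github.io/release/p4est-2.8.tar.gz

# deal.II ships a setup script for p4est in doc/external-libs of its
# source tree; it builds DEBUG and FAST variants and installs them
./dealii/doc/external-libs/p4est-setup.sh p4est-2.8.tar.gz $HOME/p4est
```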

Obtain the latest version of deal.II from GitHub

git clone https://github.com/dealii/dealii

Change into the dealii directory and create a build directory

cd dealii
mkdir build
cd build

Configure deal.II. A basic setup is given in the file dealii_mpi.sh; run it inside the build directory

./dealii_mpi.sh
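The contents of dealii_mpi.sh are not shown here, but a minimal configure script for this purpose might look like the following sketch. The flags below are real deal.II CMake options; the install and p4est paths are placeholders, and the actual script in dflo may set additional options:

```shell
# Hypothetical sketch of a minimal dealii_mpi.sh; paths are examples
cmake \
  -DCMAKE_INSTALL_PREFIX=$HOME/deal.II \
  -DDEAL_II_WITH_MPI=ON \
  -DDEAL_II_WITH_P4EST=ON \
  -DP4EST_DIR=$HOME/p4est \
  ..
```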

Now compile and install deal.II

make all
make install

After this you can compile dflo.

Running the code

Many test cases are included in the examples directory. In most examples, a geo file is provided to generate the mesh. You need Gmsh to generate the mesh as follows

gmsh -2 grid.geo

Then run dflo

dflo input.prm

Some parts of dflo are parallelized with threads. You can specify the number of threads to use when starting dflo

dflo input.prm 4

which will run with 4 threads.

To run the MPI version, do

mpirun -np 4 dflo input.prm
