
dice-group / IGUANA

License: AGPL-3.0
IGUANA is a benchmark execution framework for querying HTTP endpoints and CLI applications such as triple stores. Contact: [email protected]

Programming Languages

java

Projects that are alternatives of or similar to IGUANA

semagrow
A SPARQL query federator of heterogeneous data sources
Stars: ✭ 27 (+22.73%)
Mutual labels:  sparql, rdf, triplestore
viziquer
Tool for Search in Structured Semantic Data
Stars: ✭ 12 (-45.45%)
Mutual labels:  sparql, rdf, triplestore
trio
Datatype agnostic triple store & query engine API
Stars: ✭ 78 (+254.55%)
Mutual labels:  sparql, rdf, triplestore
best
🏆 Delightful Benchmarking & Performance Testing
Stars: ✭ 73 (+231.82%)
Mutual labels:  benchmark, performance-analysis, performance-testing
tentris
Tentris is a tensor-based RDF triple store with SPARQL support.
Stars: ✭ 34 (+54.55%)
Mutual labels:  sparql, rdf, triplestore
LinkedDataHub
The Knowledge Graph notebook. Apache license.
Stars: ✭ 150 (+581.82%)
Mutual labels:  sparql, rdf, triplestore
ont-api
ONT-API (OWL-API over Apache Jena)
Stars: ✭ 20 (-9.09%)
Mutual labels:  sparql, rdf
rdf2x
RDF2X converts big RDF datasets to the relational database model, CSV, JSON and ElasticSearch.
Stars: ✭ 43 (+95.45%)
Mutual labels:  sparql, rdf
benchmark-malloc
Trace memory allocations and collect stats
Stars: ✭ 18 (-18.18%)
Mutual labels:  benchmark, performance-analysis
CSV2RDF
Streaming, transforming, SPARQL-based CSV to RDF converter. Apache license.
Stars: ✭ 48 (+118.18%)
Mutual labels:  sparql, rdf
graph-explorer
Graph Explorer can be used to explore RDF graphs in SPARQL endpoints or on the web.
Stars: ✭ 30 (+36.36%)
Mutual labels:  sparql, rdf
SEPA
Get notifications about changes in your SPARQL endpoint.
Stars: ✭ 21 (-4.55%)
Mutual labels:  sparql, rdf
docker-google-lighthouse-puppeteer
Google Lighthouse + Puppeteer / Docker Image
Stars: ✭ 29 (+31.82%)
Mutual labels:  performance-analysis, performance-testing
dvc-bench
Benchmarks for DVC
Stars: ✭ 17 (-22.73%)
Mutual labels:  benchmarks, performance-testing
rdflib-hdt
A Store back-end for rdflib to allow for reading and querying HDT documents
Stars: ✭ 18 (-18.18%)
Mutual labels:  sparql, rdf
autobench
Benchmark your application on CI
Stars: ✭ 16 (-27.27%)
Mutual labels:  benchmark, performance-analysis
stardog-language-servers
Language Servers for Stardog Languages
Stars: ✭ 19 (-13.64%)
Mutual labels:  sparql, rdf
LSQ
Linked SPARQL Queries (LSQ): Framework for RDFizing triple store (web) logs and performing SPARQL query extraction, analysis and benchmarking in order to produce datasets of Linked SPARQL Queries
Stars: ✭ 23 (+4.55%)
Mutual labels:  sparql, rdf
sparql-micro-service
SPARQL micro-services: A lightweight approach to query Web APIs with SPARQL
Stars: ✭ 22 (+0%)
Mutual labels:  sparql, rdf
Sessel
Document RDFizer for CouchDB
Stars: ✭ 22 (+0%)
Mutual labels:  sparql, rdf


IGUANA

IGUANA Logo

ABOUT

The Semantic Web is becoming more important and its data is growing each day. Triple stores are the backbone here, managing these data. Hence it is very important that a triple store scales with the data and can handle several users. Previous benchmark approaches could not provide a realistic scenario on realistic data and could not easily be adjusted to your needs. Additionally, question answering and natural language processing systems are becoming more and more popular and thus need to be stress-tested as well. Furthermore, it was impossible to compare results across different benchmarks.

Iguana is an integrated suite for benchmarking the read/write performance of HTTP endpoints and CLI applications, which solves all these issues. It provides an environment which ...

  • ... is highly configurable
  • ... provides a realistic scenario benchmark
  • ... works on every dataset
  • ... works on SPARQL HTTP endpoints
  • ... works on HTTP GET & POST endpoints
  • ... works on CLI applications
  • ... is easily extendable

For further information, visit

iguana-benchmark.eu

Documentation

Getting Started

Prerequisites

You need to install Java 11 or greater. On Ubuntu you can install it with the following command:

sudo apt-get install openjdk-11-jre
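To confirm the installed version meets the requirement, you can parse the major version out of the `java -version` output. The sketch below operates on a sample version string so it is self-contained; on a real machine, swap in the actual command as noted in the comment:

```shell
# Iguana requires Java 11+. `java -version` prints a line like the sample below
# to stderr; on a real machine use: version_line=$(java -version 2>&1 | head -n 1)
version_line='openjdk version "11.0.2" 2019-01-15'
# The quoted field is the version string; its first dot-separated part is the major version.
major=$(echo "$version_line" | awk -F '"' '{split($2, v, "."); print v[1]}')
if [ "$major" -ge 11 ]; then
  echo "Java $major: OK"
else
  echo "Java $major: too old for Iguana"
fi
```

Note that pre-9 releases report versions like `1.8.0_292`, so their major version parses as 1 and is correctly flagged as too old.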

Iguana Modules

Iguana consists of two modules

  1. corecontroller: This will benchmark the systems
  2. resultprocessor: This will calculate the metrics and save the raw benchmark results

corecontroller

The corecontroller will benchmark your system. It should be started on the same machine the benchmarked system is running on.

resultprocessor

The resultprocessor will calculate the metrics. By default it stores its results in an N-Triples file, but you may configure it to write the results directly to a triple store. On the processing side, it calculates various metrics.

Per run metrics:

  • Query Mixes Per Hour (QMPH)
  • Number of Queries Per Hour (NoQPH)
  • Number of Queries (NoQ)
  • Average Queries Per Second (AvgQPS)

Per query metrics:

  • Queries Per Second (QPS)
    • Number of successful and failed queries
    • result size
    • queries per second
    • sum of execution times
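The per-run throughput metrics are simple ratios over the wall-clock runtime. As a quick sketch of the arithmetic (the numbers are illustrative, not Iguana output):

```shell
# Illustrative arithmetic behind QMPH and NoQPH (not Iguana's actual code).
# Suppose a mix of 3 queries completed 10 full passes in 60 seconds:
queries=30   # total queries executed (10 mixes x 3 queries)
mixes=10     # complete query mixes
seconds=60   # wall-clock runtime
echo "QMPH:  $(( mixes   * 3600 / seconds ))"   # query mixes per hour -> 600
echo "NoQPH: $(( queries * 3600 / seconds ))"   # queries per hour     -> 1800
```

AvgQPS is the analogous per-second ratio averaged over the queries of a run, and QPS is the same figure reported for each individual query.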

You can change these in the Iguana Benchmark suite config.

If you use the basic configuration, it will save all mentioned metrics to a file called results_{{DATE_RP_STARTED}}.nt.

Setup Iguana

Download

Download the release zip iguana-x.y.z.zip from the latest GitHub release:

mkdir iguana
wget https://github.com/dice-group/IGUANA/releases/download/v3.3.0/iguana-3.3.0.zip
unzip iguana-3.3.0.zip

It contains the following files:

  • iguana.corecontroller-X.Y.Z.jar
  • start-iguana.sh
  • example-suite.yml

Run Your Benchmarks

Create a Configuration

You can use the basic configuration we provide and modify it to your needs. For further information, please visit our configuration and Stresstest wiki pages. For detailed, step-by-step instructions, please see our tutorial.
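As a rough sketch, a minimal suite file might look like the following. The field names below (connections, endpoint, tasks, worker settings, and the class names) are illustrative assumptions, not guaranteed to match the current schema; consult the configuration and Stresstest wiki pages for the authoritative structure.

```yaml
# Hypothetical minimal benchmark suite -- field names are illustrative;
# check the configuration wiki for the actual schema.
connections:
  - name: "my-triple-store"
    endpoint: "http://localhost:3030/ds/sparql"  # SPARQL HTTP endpoint to benchmark
tasks:
  - className: "Stresstest"        # the benchmark task to run
    configuration:
      timeLimit: 60000             # run for 60 s (milliseconds)
      workers:
        - threads: 2               # two parallel query workers
          className: "SPARQLWorker"
          queriesFile: "queries.txt"
```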

Execute the Benchmark

Use the start script

./start-iguana.sh example-suite.yml

Iguana will now execute the example benchmark suite configured in the example-suite.yml file.

How to Cite

@InProceedings{10.1007/978-3-319-68204-4_5,
author="Conrads, Felix
and Lehmann, Jens
and Saleem, Muhammad
and Morsey, Mohamed
and Ngonga Ngomo, Axel-Cyrille",
editor="d'Amato, Claudia
and Fernandez, Miriam
and Tamma, Valentina
and Lecue, Freddy
and Cudr{\'e}-Mauroux, Philippe
and Sequeda, Juan
and Lange, Christoph
and Heflin, Jeff",
title="Iguana: A Generic Framework for Benchmarking the Read-Write Performance of Triple Stores",
booktitle="The Semantic Web -- ISWC 2017",
year="2017",
publisher="Springer International Publishing",
address="Cham",
pages="48--65",
abstract="The performance of triples stores is crucial for applications driven by RDF. Several benchmarks have been proposed that assess the performance of triple stores. However, no integrated benchmark-independent execution framework for these benchmarks has yet been provided. We propose a novel SPARQL benchmark execution framework called Iguana. Our framework complements benchmarks by providing an execution environment which can measure the performance of triple stores during data loading, data updates as well as under different loads and parallel requests. Moreover, it allows a uniform comparison of results on different benchmarks. We execute the FEASIBLE and DBPSB benchmarks using the Iguana framework and measure the performance of popular triple stores under updates and parallel user requests. We compare our results (See https://doi.org/10.6084/m9.figshare.c.3767501.v1) with state-of-the-art benchmarking results and show that our benchmark execution framework can unveil new insights pertaining to the performance of triple stores.",
isbn="978-3-319-68204-4"
}