NyanKiyoshi / Pytest Django Queries

Licence: MIT
Generate performance reports from your django database performance tests.

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to Pytest Django Queries

Pytest Benchmark
py.test fixture for benchmarking code
Stars: ✭ 730 (+1251.85%)
Mutual labels:  benchmark, benchmarking, pytest
Processhacker
A free, powerful, multi-purpose tool that helps you monitor system resources, debug software and detect malware.
Stars: ✭ 6,285 (+11538.89%)
Mutual labels:  performance-monitoring, benchmarking
Jsbench Me
jsbench.me - JavaScript performance benchmarking playground
Stars: ✭ 50 (-7.41%)
Mutual labels:  benchmark, benchmarking
Lzbench
lzbench is an in-memory benchmark of open-source LZ77/LZSS/LZMA compressors
Stars: ✭ 490 (+807.41%)
Mutual labels:  benchmark, benchmarking
p3arsec
Parallel Patterns Implementation of PARSEC Benchmark Applications
Stars: ✭ 12 (-77.78%)
Mutual labels:  benchmarking, benchmark
best
🏆 Delightful Benchmarking & Performance Testing
Stars: ✭ 73 (+35.19%)
Mutual labels:  benchmarking, benchmark
Tufte
Simple profiling and performance monitoring for Clojure/Script
Stars: ✭ 401 (+642.59%)
Mutual labels:  performance-monitoring, benchmarking
Bench Scripts
A compilation of Linux server benchmarking scripts.
Stars: ✭ 873 (+1516.67%)
Mutual labels:  benchmark, benchmarking
Dana
Test/benchmark regression and comparison system with dashboard
Stars: ✭ 46 (-14.81%)
Mutual labels:  benchmark, benchmarking
Pytest Patterns
A couple of examples showing how pytest and its plugins can be combined to solve real-world needs.
Stars: ✭ 24 (-55.56%)
Mutual labels:  benchmark, pytest
Unchase.FluentPerformanceMeter
🔨 Make the exact performance measurements of the public methods for public classes using this NuGet Package with fluent interface. Requires .Net Standard 2.0+. It is an Open Source project under Apache-2.0 License.
Stars: ✭ 33 (-38.89%)
Mutual labels:  benchmarking, benchmark
Pibench
Benchmarking framework for index structures on persistent memory
Stars: ✭ 46 (-14.81%)
Mutual labels:  benchmark, benchmarking
CARLA
CARLA: A Python Library to Benchmark Algorithmic Recourse and Counterfactual Explanation Algorithms
Stars: ✭ 166 (+207.41%)
Mutual labels:  benchmarking, benchmark
Web Tooling Benchmark
JavaScript benchmark for common web developer workloads
Stars: ✭ 290 (+437.04%)
Mutual labels:  benchmark, benchmarking
bench
⏱️ Reliable performance measurement for Go programs. All in one design.
Stars: ✭ 33 (-38.89%)
Mutual labels:  benchmarking, benchmark
beapi-bench
Tool for benchmarking apis. Uses ApacheBench(ab) to generate data and gnuplot for graphing. Adding new features almost daily
Stars: ✭ 16 (-70.37%)
Mutual labels:  benchmarking, benchmark
python-pytest-harvest
Store data created during your `pytest` tests execution, and retrieve it at the end of the session, e.g. for applicative benchmarking purposes.
Stars: ✭ 44 (-18.52%)
Mutual labels:  benchmark, pytest
LuaJIT-Benchmarks
LuaJIT Benchmark tests
Stars: ✭ 20 (-62.96%)
Mutual labels:  benchmarking, benchmark
Benchmarkdotnet
Powerful .NET library for benchmarking
Stars: ✭ 7,138 (+13118.52%)
Mutual labels:  benchmark, benchmarking
Sysbench Docker Hpe
Sysbench Dockerfiles and Scripts for VM and Container benchmarking MySQL
Stars: ✭ 14 (-74.07%)
Mutual labels:  benchmark, benchmarking

pytest-django-queries

Generate performance reports from your django database performance tests (inspired by coverage.py).

Usage

Install pytest-django-queries, write your pytest tests, and either mark any test whose queries should be counted with the count_queries marker or use the count_queries fixture.

Note: to use the latest development build, use pip install --pre pytest-django-queries

import pytest


@pytest.mark.count_queries
def test_query_performances():
    Model.objects.all()


# Or...
def test_another_query_performances(count_queries):
    Model.objects.all()

Each test file and/or package is treated as a category, and each test inside a category contributes its data to it; see Visualising Results for more details.

You will find the full documentation here.

Recommendation when Using Fixtures

You may end up using fixtures that generate queries you don't want counted in the results, or you may simply want to use the pytest-django plugin alongside pytest-django-queries, which will add unwanted queries to your results.

In that case, place the count_queries fixture last, so it executes after the others.

At the same time, you might still want to use the power of pytest markers to separate the query-counting tests from the other tests. In that case, tell the marker not to automatically inject the count_queries fixture into your test, like this:

import pytest


@pytest.mark.count_queries(autouse=False)
def test_retrieve_main_menu(fixture_making_queries, count_queries):
    pass

Note the keyword argument autouse=False and that the count_queries fixture is placed last.

Using pytest-django alongside pytest-django-queries

We recommend the following when using pytest-django:

import pytest


@pytest.mark.django_db
@pytest.mark.count_queries(autouse=False)
def test_retrieve_main_menu(any_fixture, other_fixture, count_queries):
    pass

Integrating with GitHub

TBA.

Testing Locally

Simply install pytest-django-queries through pip and run your tests using pytest. A report will be generated in your current working directory, in a file called .pytest-queries.

Note: to override the save path, pass the --django-db-bench PATH option to pytest.
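For example, to write the report somewhere else (the path shown is an arbitrary example, not a required location):

```shell
# Run the suite and save the query report to a custom path
pytest --django-db-bench reports/queries.json
```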

Visualising Results

You can generate a table from the tests results by using the show command:

django-queries show

You will get something like this to represent the results:

+---------+--------------------------------------+
| Module  |          Tests                       |
+---------+--------------------------------------+
| module1 | +-----------+---------+------------+ |
|         | | Test Name | Queries | Duplicated | |
|         | +-----------+---------+------------+ |
|         | |   test1   |    0    |     0      | |
|         | +-----------+---------+------------+ |
|         | |   test2   |    1    |     0      | |
|         | +-----------+---------+------------+ |
+---------+--------------------------------------+
| module2 | +-----------+---------+------------+ |
|         | | Test Name | Queries | Duplicated | |
|         | +-----------+---------+------------+ |
|         | |   test1   |   123   |     0      | |
|         | +-----------+---------+------------+ |
+---------+--------------------------------------+
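For context on the Duplicated column: it reflects queries whose SQL occurs more than once within a test. The sketch below is purely illustrative of that idea and is not the plugin's actual implementation; tally_queries is a hypothetical helper, and the exact definition of "duplicated" may differ in the plugin.

```python
from collections import Counter


def tally_queries(captured_sql):
    """Return (total, duplicated) for a list of captured SQL strings.

    Here a query counts as duplicated each time its exact SQL text
    occurs more than once in the capture.
    """
    counts = Counter(captured_sql)
    total = sum(counts.values())
    duplicated = sum(n for n in counts.values() if n > 1)
    return total, duplicated


# Two identical SELECTs and one distinct one: 3 queries, 2 duplicated.
print(tally_queries(["SELECT 1", "SELECT 1", "SELECT 2"]))  # (3, 2)
```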

Exporting the Results (HTML)

For a nicer presentation, use the html command to export the results as HTML.

django-queries html

It will generate something like this.

Comparing Results

After running your tests, run django-queries backup (it optionally takes a path), then rerun the tests. Afterwards, run django-queries diff to generate results looking like this:
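A sketch of that workflow (assuming the default report location is used throughout):

```shell
pytest                  # first run, writes the .pytest-queries report
django-queries backup   # snapshot the current results (a path may be given)
pytest                  # rerun the tests after your changes
django-queries diff     # compare the new results against the backup
```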


Development

First, clone the project locally, then install it in development mode using the command below.

./setup.py develop

After that, install the development and testing requirements:

pip install -e .[test]