loki-benchmarks


This suite consists of Loki benchmark tests for multiple scenarios. Each scenario asserts recorded measurements against a selected profile from the config directory:

  1. Write benchmarks:

    • High Volume Writes: Measure CPU, memory, QPS, and the p99, p50, and avg request duration for all 2xx write requests to all Loki distributor and ingester pods.
  2. Read benchmarks:

    • High Volume Reads: Measure QPS, p99, p50, and avg request duration for all 2xx read requests to all Loki query-frontend, querier and ingester pods.
    • High Volume Aggregate: Measure QPS, p99, p50, and avg request duration for all 2xx read requests to all Loki query-frontend, querier and ingester pods.
    • Dashboard queries: Measure QPS, p99, p50, and avg request duration for all 2xx read requests to all Loki query-frontend, querier and ingester pods.

Prerequisites

  • Software: gnuplot

Note: Install on a Linux environment, e.g. on Fedora: sudo dnf install gnuplot

Kubernetes

Note: Clone the required git repositories into sibling directories of the loki-benchmarks directory.

Note: cAdvisor is only required when measuring container CPU and memory. In that case, set the enableCadvisorMetrics key in the configuration to true (it is false by default).
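A minimal sketch of that setting; only the enableCadvisorMetrics key is taken from the text above, and the surrounding structure of the real config files may differ:

```yaml
# Benchmark configuration fragment (illustrative): enable cAdvisor-based
# CPU and memory measurements. The key defaults to false.
enableCadvisorMetrics: true
```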

Deployment

  1. Configure the parameters (config/loki-parameters), then deploy Loki and configure Prometheus: make deploy-obs-loki
  2. Run the benchmarks: make bench-dev

OCP AWS Cluster

  • Required software: oc, aws
  • Cluster Size: m4.16xlarge

Deployment

  1. Configure the benchmark parameters: config/loki-parameters
  2. Create the S3 bucket: make deploy-s3-bucket
  3. Deploy Prometheus: make deploy-ocp-prometheus
  4. Download the Loki observatorium template locally: make download-obs-loki-template
  5. Deploy Loki: make deploy-ocp-loki
  6. Run the benchmarks: make ocp-run-benchmarks
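Taken together, steps 2-6 can be run back to back; these are exactly the make targets listed above (see make help for the suite's own all-in-one variants):

```
$ make deploy-s3-bucket
$ make deploy-ocp-prometheus
$ make download-obs-loki-template
$ make deploy-ocp-loki
$ make ocp-run-benchmarks
```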

Note: For additional details and all-in-one commands use: make help

When the benchmark run completes, results are available in the reports/<date+time> folder.
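Because each report directory name encodes the date and time, the newest report sorts last as a plain string. A minimal self-contained sketch (the two timestamps below are fabricated for the demo):

```shell
# Create two fake report directories, then pick the newest one.
# Date-time names like 2020-08-12-10-33-31 sort chronologically
# as plain strings, so `sort | tail` finds the latest run.
mkdir -p reports/2020-08-12-10-33-31 reports/2020-08-13-09-00-00
latest="$(ls -d reports/*/ | sort | tail -n 1)"
echo "latest report: ${latest}"
```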

Uninstall using: make ocp-all-cleanup.

How to add new benchmarks to this suite

Developing

Run the tests

$ make bench-dev

Example output:

Running Suite: Benchmarks Suite
===============================
Random Seed: 1597237201
Will run 1 of 1 specs

• [MEASUREMENT]
Scenario: High Volume Writes
/home/username/dev/loki-benchmarks/benchmarks/high_volume_writes_test.go:18
  should result in measurements of p99, p50 and avg for all successful write requests to the distributor
  /home/username/dev/loki-benchmarks/benchmarks/high_volume_writes_test.go:32

  Ran 10 samples:
  All distributor 2xx Writes p99:
    Smallest: 0.087
     Largest: 0.096
     Average: 0.092 ± 0.003
  All distributor 2xx Writes p50:
    Smallest: 0.003
     Largest: 0.003
     Average: 0.003 ± 0.000
  All distributor 2xx Writes avg:
    Smallest: 0.370
     Largest: 0.594
     Average: 0.498 ± 0.085
------------------------------

Inspecting the benchmark report

On each run a new time-based report directory is created under the reports directory. Each report includes:

  • Summary README.md with all benchmark measurements.
  • A CSV file for each specific measurement.
  • A GNUPlot file for each specific measurement to transform the data into a PNG graph.
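The CSV files can also be inspected without gnuplot. The sketch below assumes a simple sample,value layout (an assumption; check a generated file for the exact columns) and recomputes the smallest, largest, and average sample with awk:

```shell
# Write a tiny CSV in the assumed sample,value layout, then summarize it.
cat > sample.csv <<'EOF'
1,0.087
2,0.092
3,0.096
EOF
# Track the minimum and maximum of column 2 and accumulate the sum,
# then print min/max/avg in the END block.
awk -F, '{ sum += $2; if (min == "" || $2 < min) min = $2;
           if ($2 > max) max = $2 }
         END { printf "min=%s max=%s avg=%.3f\n", min, max, sum / NR }' sample.csv
```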

Example output:

reports
└── 2020-08-12-10-33-31
    ├── All-distributor-2xx-Writes-avg.csv
    ├── All-distributor-2xx-Writes-avg.gnuplot
    ├── All-distributor-2xx-Writes-avg.gnuplot.png
    ├── All-distributor-2xx-Writes-p50.csv
    ├── All-distributor-2xx-Writes-p50.gnuplot
    ├── All-distributor-2xx-Writes-p50.gnuplot.png
    ├── All-distributor-2xx-Writes-p99.csv
    ├── All-distributor-2xx-Writes-p99.gnuplot
    ├── All-distributor-2xx-Writes-p99.gnuplot.png
    ├── junit.xml
    └── README.md
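To regenerate a graph by hand, a single gnuplot script can be replayed; this assumes each generated script sets its own output file, as the .gnuplot.png names above suggest:

```
$ cd reports/2020-08-12-10-33-31
$ gnuplot All-distributor-2xx-Writes-avg.gnuplot
```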

Troubleshooting

During benchmark execution, use hack/ocp-deploy-grafana.sh to deploy Grafana and connect it to Loki as a datasource:

  • Use a web browser to access the Grafana UI. The URL, username, and password are printed by the script.
  • In the UI, under Settings -> Data sources, hit Save & test to verify that the Loki data source is connected and that there are no errors.
  • In the Explore tab, change the data source to Loki and use the {client="promtail"} query to visualize log lines.
  • Use additional queries such as rate({client="promtail"}[1m]) to verify the behaviour of Loki and the benchmark.
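The two queries above are standard LogQL; a few more that can help when eyeballing a run (the client="promtail" label is the one this document uses, everything else is stock LogQL syntax):

```logql
{client="promtail"}                       # raw benchmark log lines
rate({client="promtail"}[1m])             # per-stream log rate over the last minute
sum(rate({client="promtail"}[1m]))        # total ingest rate across all streams
count_over_time({client="promtail"}[5m])  # number of lines in the last 5 minutes
```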
