Adding Benchmarks #165

Open · wants to merge 9 commits into base: main
5 changes: 5 additions & 0 deletions .gitignore
Original file line number Diff line number Diff line change
Expand Up @@ -165,3 +165,8 @@ dmypy.json

# Cython debug symbols
cython_debug/

# Benchmark artifacts created by asv
pystack/**
.asv/**
.vscode/**
78 changes: 0 additions & 78 deletions .vscode/settings.json

This file was deleted.

21 changes: 21 additions & 0 deletions asv.conf.json
Original file line number Diff line number Diff line change
@@ -0,0 +1,21 @@
{
    "version": 1,
    "benchmark_dir": "./benchmarks",
    "repo": "git@github.com:bloomberg/pystack.git",
    "project": "pystack",
    "project_url": "https://github.com/bloomberg/pystack",
    "env_dir": ".asv/env",
    "results_dir": ".asv/results",
    "html_dir": ".asv/html",
    "environment_type": "conda",
    "dvcs": "git",
    "branches": ["main"],
    "install_command": [
        "python -mpip install -r requirements-test.txt -r requirements-extra.txt",
        "python -mpip install -e ."
    ],
    "build_command": [
        "python -mpip install pkgconfig",
        "python -mpip install dbg"
    ]
}
34 changes: 34 additions & 0 deletions benchmarks/README.md
Original file line number Diff line number Diff line change
@@ -0,0 +1,34 @@
# Using the Benchmarking Tool

The benchmarks require the following system libraries to be installed:

- libdw
- libelf

On Debian/Ubuntu, they can be installed with `apt-get install libdw-dev libelf-dev`.

The benchmarks are run with `airspeed velocity` (`asv`). To install it, run:

```pip install asv```

From the repository root, run the following command to get a quick benchmark of the current code:

```asv run```

Use the `-v` flag to get a verbose output.

To run the benchmarks for every commit in the repository's history:

```asv run ALL```

To run benchmarks starting from a particular commit or tag, pass a range using the hash or tag:

```asv run [TAG|HASH]..[branch]```

To compare results between tags:

```asv show [TAG]..[branch]```

To build the HTML report with all the graphs (serve it locally with `asv preview`):

```asv publish```
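
For reference, asv discovers any module in the configured `benchmark_dir` and times every method whose name starts with `time_`. A minimal example suite (a hypothetical sketch, not part of this PR) looks like:

```python
class TimeSuiteExample:
    """Minimal asv benchmark suite; `setup` runs before each timed method."""

    def setup(self):
        # State prepared here is excluded from the measured time.
        self.data = list(range(1000))

    def time_sum(self):
        # asv reports the wall-clock time of this call.
        sum(self.data)
```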
Empty file added benchmarks/__init__.py
Empty file.
60 changes: 60 additions & 0 deletions benchmarks/benchmark_colors.py
Original file line number Diff line number Diff line change
@@ -0,0 +1,60 @@
from pystack.colors import colored, format_colored

RANGE = 100


class ColorsBenchmarkSuite:
    def setup(self):
        pass

    def time_colored(self):
        colors = ["red", "green", "yellow", "blue", "magenta", "cyan", "white"]
        highlights = [
            "on_red",
            "on_green",
            "on_yellow",
            "on_blue",
            "on_magenta",
            "on_cyan",
            "on_white",
        ]
        attributes = ["bold", "dark", "underline", "blink", "reverse", "concealed"]
        for _ in range(RANGE):
            for color in colors:
                for highlight in highlights:
                    colored("Benchmark Colored", color, highlight, attributes)

    def time_format_colored(self):
        colors = [
            "grey",
            "red",
            "green",
            "yellow",
            "blue",
            "magenta",
            "cyan",
            "white",
        ]
        highlights = [
            "on_grey",
            "on_red",
            "on_green",
            "on_yellow",
            "on_blue",
            "on_magenta",
            "on_cyan",
            "on_white",
        ]
        attributes = [
            "bold",
            "faint",
            "italized",
            "underline",
            "blink",
            "reverse",
            "concealed",
        ]
        for _ in range(RANGE):
            for color in colors:
                for highlight in highlights:
                    format_colored(
                        "Benchmark Format Colored", color, highlight, attributes
                    )
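
As a possible refactor (a sketch, not part of this PR), asv's `params`/`param_names` mechanism can replace the hand-rolled nested loops, so each color/highlight combination is timed and reported as a separate result. The `ColoredParamSuite` name and its placeholder body are hypothetical:

```python
class ColoredParamSuite:
    # asv runs `time_colored` once per entry in the cross product of `params`,
    # labeling each result with the names in `param_names`.
    params = (["red", "green", "blue"], ["on_red", "on_green"])
    param_names = ["color", "highlight"]

    def time_colored(self, color, highlight):
        # Placeholder work; a real suite would call pystack.colors.colored here.
        "Benchmark Colored".join((color, highlight))
```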