Add tests using ludeeus/integration_blueprint#50 as base (#112)
* add tests using ludeeus/integration_blueprint#50 as base

* move tests folder to the right place

* add test files to files table

* try to fix

* move requirements to appropriate folder

* add a note to contributing about tests

* add a note to contributing about tests

* Update docs/guide.rst

Co-authored-by: Oncleben31 <oncleben31@users.noreply.github.com>

* Update {{cookiecutter.project_name}}/CONTRIBUTING.md

Co-authored-by: Oncleben31 <oncleben31@users.noreply.github.com>

* Update {{cookiecutter.project_name}}/tests/test_init.py

Co-authored-by: Oncleben31 <oncleben31@users.noreply.github.com>

* address review comments

* add more comments

* fix description for requirements files

* add more comments

* update gitignore and fix CONTRIBUTING

* add dummy init file so that pytest works

* Update {{cookiecutter.project_name}}/requirements_test.txt

Co-authored-by: Oncleben31 <oncleben31@users.noreply.github.com>

* fix import statements

* Fix spelling

* bypass setup in config flow

* fix comment

* incorporate comments that were in integration_blueprint PR

* add tests to isort known_first_party

Co-authored-by: Oncleben31 <oncleben31@users.noreply.github.com>
raman325 and oncleben31 committed Jan 13, 2021
1 parent b48bb70 commit b5c30b5
Showing 16 changed files with 402 additions and 25 deletions.
3 changes: 3 additions & 0 deletions .gitignore
@@ -3,3 +3,6 @@
__pycache__/
/.python-version
/.vscode/
.coverage
venv
.venv
55 changes: 33 additions & 22 deletions docs/guide.rst
@@ -52,28 +52,36 @@ This repository contains multiple files; here is an overview:
.. table:: Files list
:widths: auto

====================================================== ======================================================================================================================
``.devcontainer/*`` Used for development/testing with VSCODE, more info in the readme file in that dir
``.github/ISSUE_TEMPLATE/feature_request.md`` Template for Feature Requests
``.github/ISSUE_TEMPLATE/issue.md`` Template for issues
``.github/settings.yml`` Probot settings to control the repository settings.
``.vscode/tasks.json`` Tasks for the devcontainer
``custom_components/[DOMAIN NAME]/translations/*`` `Translation files`_
``custom_components/[DOMAIN NAME]/__init__.py`` The component file for the integration
``custom_components/[DOMAIN NAME]/api.py`` This is a sample API client
``custom_components/[DOMAIN NAME]/binary_sensor.py`` Binary sensor platform for the integration
``custom_components/[DOMAIN NAME]/config_flow.py`` Config flow file, this adds the UI configuration possibilities
``custom_components/[DOMAIN NAME]/const.py`` A file to hold shared variables/constants for the entire integration
``custom_components/[DOMAIN NAME]/manifest.json`` A `manifest file`_ for Home Assistant.
``custom_components/[DOMAIN NAME]/sensor.py`` Sensor platform for the integration
``custom_components/[DOMAIN NAME]/switch.py`` Switch sensor platform for the integration
``CONTRIBUTING.md`` Guidelines on how to contribute
``example.png`` Screenshot that demonstrate how it might look in the UI
``info.md`` An example on a info file (used by HACS_)
``LICENSE`` The license file for the project
``README.md`` The file you are reading now, should contain info about the integration, installation and configuration instructions
``requirements.txt`` Python packages used by this integration
====================================================== ======================================================================================================================
============================================================= ======================================================================================================================
``.devcontainer/*``                                           Used for development/testing with VS Code; more info in the readme file in that dir
``.github/ISSUE_TEMPLATE/feature_request.md`` Template for Feature Requests
``.github/ISSUE_TEMPLATE/issue.md`` Template for issues
``.github/settings.yml`` Probot settings to control the repository settings.
``.vscode/tasks.json`` Tasks for the devcontainer
``custom_components/[DOMAIN NAME]/translations/*`` `Translation files`_
``custom_components/[DOMAIN NAME]/__init__.py`` The component file for the integration
``custom_components/[DOMAIN NAME]/api.py`` This is a sample API client
``custom_components/[DOMAIN NAME]/binary_sensor.py`` Binary sensor platform for the integration
``custom_components/[DOMAIN NAME]/config_flow.py`` Config flow file, this adds the UI configuration possibilities
``custom_components/[DOMAIN NAME]/const.py`` A file to hold shared variables/constants for the entire integration
``custom_components/[DOMAIN NAME]/manifest.json`` A `manifest file`_ for Home Assistant
``custom_components/[DOMAIN NAME]/sensor.py`` Sensor platform for the integration
``custom_components/[DOMAIN NAME]/switch.py`` Switch sensor platform for the integration
``custom_components/[DOMAIN NAME]/tests/__init__.py`` Makes the `tests` folder a Python package
``custom_components/[DOMAIN NAME]/tests/conftest.py`` Global fixtures_ used in tests to patch_ functions
``custom_components/[DOMAIN NAME]/tests/test_api.py`` Tests for `custom_components/[DOMAIN NAME]/api.py`
``custom_components/[DOMAIN NAME]/tests/test_config_flow.py`` Tests for `custom_components/[DOMAIN NAME]/config_flow.py`
``custom_components/[DOMAIN NAME]/tests/test_init.py`` Tests for `custom_components/[DOMAIN NAME]/__init__.py`
``custom_components/[DOMAIN NAME]/tests/test_switch.py`` Tests for `custom_components/[DOMAIN NAME]/switch.py`
``CONTRIBUTING.md`` Guidelines on how to contribute
``example.png``                                               Screenshot that demonstrates how it might look in the UI
``info.md``                                                   An example of an info file (used by HACS_)
``LICENSE``                                                   The license file for the project
``README.md``                                                 The file you are reading now; it should contain info about the integration, installation and configuration instructions
``requirements.txt`` Python packages used by this integration
``requirements_dev.txt``                                      Python packages used to provide IntelliSense_/code hints during development of this integration; typically includes the packages in ``requirements.txt`` plus additional ones
``requirements_test.txt``                                     Python packages required to run the tests for this integration; typically includes the packages in ``requirements_dev.txt`` plus additional ones
============================================================= ======================================================================================================================

If you want to use all the potential and features of this blueprint template you
should use Visual Studio Code to develop in a container. In this container you
@@ -204,3 +212,6 @@ Deploy on HACS
.. _Remote Python Debugger integration documentation: https://www.home-assistant.io/integrations/debugpy/
.. _Translation files: https://developers.home-assistant.io/docs/internationalization/custom_integration
.. _Visual Studio code: https://code.visualstudio.com/
.. _fixtures: https://docs.pytest.org/en/stable/fixture.html
.. _patch: https://docs.python.org/3/library/unittest.mock.html#unittest.mock.patch
.. _IntelliSense: https://code.visualstudio.com/docs/editor/intellisense
17 changes: 17 additions & 0 deletions {{cookiecutter.project_name}}/CONTRIBUTING.md
@@ -63,6 +63,23 @@ file.
You can use the `pre-commit` settings implemented in this repository to have
linting tools check your contributions (see the dedicated section below).

You should also verify that the existing [tests](./tests) still pass,
and you are encouraged to add new ones.
You can run the tests using the following commands from the root folder:

```bash
# Create a virtual environment
python3 -m venv venv
source venv/bin/activate
# Install requirements
pip install -r requirements_test.txt
# Run tests and get a summary of successes/failures and code coverage
pytest --durations=10 --cov-report term-missing --cov=custom_components.{{cookiecutter.domain_name}} tests
```

If any of the tests fail, make the necessary changes to the tests as part of
your changes to the integration.

## Pre-commit

You can use the [pre-commit](https://pre-commit.com/) settings included in the
2 changes: 1 addition & 1 deletion {{cookiecutter.project_name}}/README.md
@@ -14,7 +14,7 @@
[![Discord][discord-shield]][discord]
[![Community Forum][forum-shield]][forum]

**TO BE REMOVED: If you need help, as a developper, to use this custom component tempalte,
**TO BE REMOVED: If you need help, as a developer, to use this custom component template,
please look at the [User Guide in the Cookiecutter documentation](https://cookiecutter-homeassistant-custom-component.readthedocs.io/en/stable/quickstart.html)**

**This component will set up the following platforms.**
@@ -0,0 +1 @@
"""Dummy init so that pytest works."""
@@ -72,4 +72,4 @@ async def api_wrapper(
                exception,
            )
        except Exception as exception:  # pylint: disable=broad-except
            _LOGGER.error("Something really wrong happend! - %s", exception)
            _LOGGER.error("Something really wrong happened! - %s", exception)
1 change: 1 addition & 0 deletions {{cookiecutter.project_name}}/requirements_dev.txt
@@ -0,0 +1 @@
homeassistant
2 changes: 2 additions & 0 deletions {{cookiecutter.project_name}}/requirements_test.txt
@@ -0,0 +1,2 @@
-r requirements_dev.txt
pytest-homeassistant-custom-component==0.1.0
2 changes: 1 addition & 1 deletion {{cookiecutter.project_name}}/setup.cfg
@@ -31,5 +31,5 @@ not_skip = __init__.py
force_sort_within_sections = true
sections = FUTURE,STDLIB,INBETWEENS,THIRDPARTY,FIRSTPARTY,LOCALFOLDER
default_section = THIRDPARTY
known_first_party = custom_components.{{ cookiecutter.domain_name }}
known_first_party = custom_components.{{ cookiecutter.domain_name }}, tests
combine_as_imports = true
1 change: 1 addition & 0 deletions {{cookiecutter.project_name}}/tests/__init__.py
@@ -0,0 +1 @@
"""Tests for {{cookiecutter.friendly_name}} integration."""
40 changes: 40 additions & 0 deletions {{cookiecutter.project_name}}/tests/conftest.py
@@ -0,0 +1,40 @@
"""Global fixtures for {{cookiecutter.friendly_name}} integration."""
from unittest.mock import patch

import pytest

pytest_plugins = "pytest_homeassistant_custom_component"


# This fixture is used to prevent HomeAssistant from attempting to create and dismiss persistent
# notifications. These calls would fail without this fixture since the persistent_notification
# integration is never loaded during a test.
@pytest.fixture(name="skip_notifications", autouse=True)
def skip_notifications_fixture():
    """Skip notification calls."""
    with patch("homeassistant.components.persistent_notification.async_create"), patch(
        "homeassistant.components.persistent_notification.async_dismiss"
    ):
        yield


# This fixture, when used, will result in calls to async_get_data to return None. To have the call
# return a value, we would add the `return_value=<VALUE_TO_RETURN>` parameter to the patch call.
@pytest.fixture(name="bypass_get_data")
def bypass_get_data_fixture():
    """Skip calls to get data from API."""
    with patch(
        "custom_components.{{cookiecutter.domain_name}}.{{cookiecutter.class_name_prefix}}ApiClient.async_get_data"
    ):
        yield


# In this fixture, we are forcing calls to async_get_data to raise an Exception. This is useful
# for exception handling.
@pytest.fixture(name="error_on_get_data")
def error_get_data_fixture():
    """Simulate error when retrieving data from API."""
    with patch(
        "custom_components.{{cookiecutter.domain_name}}.{{cookiecutter.class_name_prefix}}ApiClient.async_get_data",
        side_effect=Exception,
    ):
        yield
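The three fixtures above all build on `unittest.mock.patch`. A minimal stdlib-only sketch of how `return_value` and `side_effect` change a patched call's behavior (the `ApiClient` class here is a hypothetical stand-in, not the generated client):

```python
from unittest.mock import patch


class ApiClient:
    """Hypothetical stand-in for the generated API client."""

    def get_data(self):
        return {"real": "data"}


client = ApiClient()

# Like bypass_get_data: the patch replaces the method with a mock, so the
# real implementation is never called; return_value controls the result.
with patch.object(ApiClient, "get_data", return_value={"test": "test"}):
    patched_result = client.get_data()

# Like error_on_get_data: side_effect=Exception makes every call raise.
with patch.object(ApiClient, "get_data", side_effect=Exception):
    try:
        client.get_data()
        raised = False
    except Exception:
        raised = True

# Outside the `with` blocks the original method is restored.
unpatched_result = client.get_data()
```

The same mechanics apply when the patched target is an async method, as in the fixtures; `patch` then substitutes an async-aware mock automatically.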
4 changes: 4 additions & 0 deletions {{cookiecutter.project_name}}/tests/const.py
@@ -0,0 +1,4 @@
"""Constants for {{cookiecutter.friendly_name}} tests."""
from custom_components.{{cookiecutter.domain_name}}.const import CONF_PASSWORD, CONF_USERNAME

MOCK_CONFIG = {CONF_USERNAME: "test_username", CONF_PASSWORD: "test_password"}
86 changes: 86 additions & 0 deletions {{cookiecutter.project_name}}/tests/test_api.py
@@ -0,0 +1,86 @@
"""Tests for {{cookiecutter.friendly_name}} api."""
import asyncio

import aiohttp
from homeassistant.helpers.aiohttp_client import async_get_clientsession

from custom_components.{{cookiecutter.domain_name}}.api import {{cookiecutter.class_name_prefix}}ApiClient


async def test_api(hass, aioclient_mock, caplog):
    """Test API calls."""

    # To test the api submodule, we first create an instance of our API client
    api = {{cookiecutter.class_name_prefix}}ApiClient("test", "test", async_get_clientsession(hass))

    # Use aioclient_mock which is provided by `pytest_homeassistant_custom_component`
    # to mock responses to aiohttp requests. In this case we are telling the mock to
    # return {"test": "test"} when a `GET` call is made to the specified URL. We then
    # call `async_get_data` which will make that `GET` request.
    aioclient_mock.get(
        "https://jsonplaceholder.typicode.com/posts/1", json={"test": "test"}
    )
    assert await api.async_get_data() == {"test": "test"}

    # We do the same for `async_set_title`. Note the difference in the mock call
    # between the previous step and this one. We use `patch` here instead of `get`
    # because we know that `async_set_title` calls `api_wrapper` with `patch` as the
    # first parameter.
    aioclient_mock.patch("https://jsonplaceholder.typicode.com/posts/1")
    assert await api.async_set_title("test") is None

    # In order to get 100% coverage, we need to test `api_wrapper` to test the code
    # that isn't already called by `async_get_data` and `async_set_title`. Because the
    # only logic that lives inside `api_wrapper` that is not being handled by a third
    # party library (aiohttp) is the exception handling, we also want to simulate
    # raising the exceptions to ensure that the function handles them as expected.
    # The caplog fixture allows access to log messages in tests. This is particularly
    # useful during exception handling testing since often the only action as part of
    # exception handling is a logging statement.
    caplog.clear()
    aioclient_mock.put(
        "https://jsonplaceholder.typicode.com/posts/1", exc=asyncio.TimeoutError
    )
    assert (
        await api.api_wrapper("put", "https://jsonplaceholder.typicode.com/posts/1")
        is None
    )
    assert (
        len(caplog.record_tuples) == 1
        and "Timeout error fetching information from" in caplog.record_tuples[0][2]
    )

    caplog.clear()
    aioclient_mock.post(
        "https://jsonplaceholder.typicode.com/posts/1", exc=aiohttp.ClientError
    )
    assert (
        await api.api_wrapper("post", "https://jsonplaceholder.typicode.com/posts/1")
        is None
    )
    assert (
        len(caplog.record_tuples) == 1
        and "Error fetching information from" in caplog.record_tuples[0][2]
    )

    caplog.clear()
    aioclient_mock.post("https://jsonplaceholder.typicode.com/posts/2", exc=Exception)
    assert (
        await api.api_wrapper("post", "https://jsonplaceholder.typicode.com/posts/2")
        is None
    )
    assert (
        len(caplog.record_tuples) == 1
        and "Something really wrong happened!" in caplog.record_tuples[0][2]
    )

    caplog.clear()
    aioclient_mock.post("https://jsonplaceholder.typicode.com/posts/3", exc=TypeError)
    assert (
        await api.api_wrapper("post", "https://jsonplaceholder.typicode.com/posts/3")
        is None
    )
    assert (
        len(caplog.record_tuples) == 1
        and "Error parsing information from" in caplog.record_tuples[0][2]
    )
111 changes: 111 additions & 0 deletions {{cookiecutter.project_name}}/tests/test_config_flow.py
@@ -0,0 +1,111 @@
"""Test {{cookiecutter.friendly_name}} config flow."""
from unittest.mock import patch

import pytest
from homeassistant import config_entries, data_entry_flow
from pytest_homeassistant_custom_component.common import MockConfigEntry

from custom_components.{{cookiecutter.domain_name}}.const import (
    BINARY_SENSOR,
    DOMAIN,
    PLATFORMS,
    SENSOR,
    SWITCH,
)

from .const import MOCK_CONFIG


# This fixture bypasses the actual setup of the integration
# since we only want to test the config flow. We test the
# actual functionality of the integration in other test modules.
@pytest.fixture(autouse=True)
def bypass_setup_fixture():
    """Prevent setup."""
    with patch(
        "custom_components.{{cookiecutter.domain_name}}.async_setup",
        return_value=True,
    ), patch(
        "custom_components.{{cookiecutter.domain_name}}.async_setup_entry",
        return_value=True,
    ):
        yield


# Here we simulate a successful config flow from the backend.
# Note that we use the `bypass_get_data` fixture here because
# we want the config flow validation to succeed during the test.
async def test_successful_config_flow(hass, bypass_get_data):
    """Test a successful config flow."""
    # Initialize a config flow
    result = await hass.config_entries.flow.async_init(
        DOMAIN, context={"source": config_entries.SOURCE_USER}
    )

    # Check that the config flow shows the user form as the first step
    assert result["type"] == data_entry_flow.RESULT_TYPE_FORM
    assert result["step_id"] == "user"

    # If a user were to enter `test_username` for username and `test_password`
    # for password, it would result in this function call
    result = await hass.config_entries.flow.async_configure(
        result["flow_id"], user_input=MOCK_CONFIG
    )

    # Check that the config flow is complete and a new entry is created with
    # the input data
    assert result["type"] == data_entry_flow.RESULT_TYPE_CREATE_ENTRY
    assert result["title"] == "test_username"
    assert result["data"] == MOCK_CONFIG
    assert result["result"]


# In this case, we want to simulate a failure during the config flow.
# We use the `error_on_get_data` fixture instead of `bypass_get_data`
# (note the function parameters) to raise an Exception during
# validation of the input config.
async def test_failed_config_flow(hass, error_on_get_data):
    """Test a failed config flow due to credential validation failure."""

    result = await hass.config_entries.flow.async_init(
        DOMAIN, context={"source": config_entries.SOURCE_USER}
    )

    assert result["type"] == data_entry_flow.RESULT_TYPE_FORM
    assert result["step_id"] == "user"

    result = await hass.config_entries.flow.async_configure(
        result["flow_id"], user_input=MOCK_CONFIG
    )

    assert result["type"] == data_entry_flow.RESULT_TYPE_FORM
    assert result["errors"] == {"base": "auth"}


# Our config flow also has an options flow, so we must test it as well.
async def test_options_flow(hass):
    """Test an options flow."""
    # Create a new MockConfigEntry and add to HASS (we're bypassing config
    # flow entirely)
    entry = MockConfigEntry(domain=DOMAIN, data=MOCK_CONFIG, entry_id="test")
    entry.add_to_hass(hass)

    # Initialize an options flow
    result = await hass.config_entries.options.async_init(entry.entry_id)

    # Verify that the first options step is a user form
    assert result["type"] == data_entry_flow.RESULT_TYPE_FORM
    assert result["step_id"] == "user"

    # Enter some fake data into the form
    result = await hass.config_entries.options.async_configure(
        result["flow_id"],
        user_input={platform: platform != SENSOR for platform in PLATFORMS},
    )

    # Verify that the flow finishes
    assert result["type"] == data_entry_flow.RESULT_TYPE_CREATE_ENTRY
    assert result["title"] == "test_username"

    # Verify that the options were updated
    assert entry.options == {BINARY_SENSOR: True, SENSOR: False, SWITCH: True}
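The `user_input` payload in the options-flow test is built with a dict comprehension. Assuming `PLATFORMS` is the list `[BINARY_SENSOR, SENSOR, SWITCH]` (which the final assertion implies), it expands as follows:

```python
# Assumed constant values; in the template these come from const.py.
BINARY_SENSOR, SENSOR, SWITCH = "binary_sensor", "sensor", "switch"
PLATFORMS = [BINARY_SENSOR, SENSOR, SWITCH]

# Every platform except SENSOR maps to True, so the options flow enables
# all platforms but the sensor, mirroring the test's final assertion.
user_input = {platform: platform != SENSOR for platform in PLATFORMS}
assert user_input == {"binary_sensor": True, "sensor": False, "switch": True}
```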
