
Add tests using custom-components/integration_blueprint#50 as base #112

Merged: 23 commits, Jan 13, 2021
55 changes: 33 additions & 22 deletions docs/guide.rst
@@ -52,28 +52,36 @@ This repository contains multiple files; here is an overview:
.. table:: Files list
:widths: auto

====================================================== ======================================================================================================================
``.devcontainer/*`` Used for development/testing with VSCODE, more info in the readme file in that dir
``.github/ISSUE_TEMPLATE/feature_request.md`` Template for Feature Requests
``.github/ISSUE_TEMPLATE/issue.md`` Template for issues
``.github/settings.yml`` Probot settings to control the repository settings.
``.vscode/tasks.json`` Tasks for the devcontainer
``custom_components/[DOMAIN NAME]/translations/*`` `Translation files`_
``custom_components/[DOMAIN NAME]/__init__.py`` The component file for the integration
``custom_components/[DOMAIN NAME]/api.py`` This is a sample API client
``custom_components/[DOMAIN NAME]/binary_sensor.py`` Binary sensor platform for the integration
``custom_components/[DOMAIN NAME]/config_flow.py`` Config flow file, this adds the UI configuration possibilities
``custom_components/[DOMAIN NAME]/const.py`` A file to hold shared variables/constants for the entire integration
``custom_components/[DOMAIN NAME]/manifest.json`` A `manifest file`_ for Home Assistant.
``custom_components/[DOMAIN NAME]/sensor.py`` Sensor platform for the integration
``custom_components/[DOMAIN NAME]/switch.py`` Switch sensor platform for the integration
``CONTRIBUTING.md`` Guidelines on how to contribute
``example.png`` Screenshot that demonstrate how it might look in the UI
``info.md`` An example on a info file (used by HACS_)
``LICENSE`` The license file for the project
``README.md`` The file you are reading now, should contain info about the integration, installation and configuration instructions
``requirements.txt`` Python packages used by this integration
====================================================== ======================================================================================================================
============================================================= ======================================================================================================================
``.devcontainer/*`` Used for development/testing with VSCODE, more info in the readme file in that dir
``.github/ISSUE_TEMPLATE/feature_request.md`` Template for Feature Requests
``.github/ISSUE_TEMPLATE/issue.md`` Template for issues
``.github/settings.yml`` Probot settings to control the repository settings.
``.vscode/tasks.json`` Tasks for the devcontainer
``custom_components/[DOMAIN NAME]/translations/*`` `Translation files`_
``custom_components/[DOMAIN NAME]/__init__.py`` The component file for the integration
``custom_components/[DOMAIN NAME]/api.py`` This is a sample API client
``custom_components/[DOMAIN NAME]/binary_sensor.py`` Binary sensor platform for the integration
``custom_components/[DOMAIN NAME]/config_flow.py`` Config flow file, this adds the UI configuration possibilities
``custom_components/[DOMAIN NAME]/const.py`` A file to hold shared variables/constants for the entire integration
``custom_components/[DOMAIN NAME]/manifest.json`` A `manifest file`_ for Home Assistant
``custom_components/[DOMAIN NAME]/sensor.py`` Sensor platform for the integration
``custom_components/[DOMAIN NAME]/switch.py`` Switch sensor platform for the integration
``custom_components/[DOMAIN NAME]/tests/__init__.py`` Makes the `tests` folder a Python package
``custom_components/[DOMAIN NAME]/tests/conftest.py`` Global fixtures_ used in tests to patch_ functions
``custom_components/[DOMAIN NAME]/tests/test_api.py`` Tests for `custom_components/[DOMAIN NAME]/api.py`
``custom_components/[DOMAIN NAME]/tests/test_config_flow.py`` Tests for `custom_components/[DOMAIN NAME]/config_flow.py`
``custom_components/[DOMAIN NAME]/tests/test_init.py`` Tests for `custom_components/[DOMAIN NAME]/__init__.py`
``custom_components/[DOMAIN NAME]/tests/test_switch.py`` Tests for `custom_components/[DOMAIN NAME]/switch.py`
``CONTRIBUTING.md`` Guidelines on how to contribute
``example.png``                                                  Screenshot that demonstrates how it might look in the UI
``info.md``                                                      An example of an info file (used by HACS_)
``LICENSE`` The license file for the project
``README.md``                                                    The file you are reading now; it should contain info about the integration, installation and configuration instructions
``requirements.txt`` Python packages used by this integration
``requirements_dev.txt`` Python packages used to provide IntelliSense_/code hints during development of this integration, typically includes packages in ``requirements.txt`` but may include additional packages
``requirements_test.txt`` Python packages required to run the tests for this integration, typically includes packages in ``requirements_dev.txt`` but may include additional packages
============================================================= ======================================================================================================================

If you want to use all the potential and features of this blueprint template you
should use Visual Studio Code to develop in a container. In this container you
@@ -204,3 +212,6 @@ Deploy on HACS
.. _Remote Python Debugger integration documentation: https://www.home-assistant.io/integrations/debugpy/
.. _Translation files: https://developers.home-assistant.io/docs/internationalization/custom_integration
.. _Visual Studio code: https://code.visualstudio.com/
.. _fixtures: https://docs.pytest.org/en/stable/fixture.html
.. _patch: https://docs.python.org/3/library/unittest.mock.html#unittest.mock.patch
.. _IntelliSense: https://code.visualstudio.com/docs/editor/intellisense
17 changes: 17 additions & 0 deletions {{cookiecutter.project_name}}/CONTRIBUTING.md
@@ -63,6 +63,23 @@ file.
You can use the `pre-commit` settings implemented in this repository to have
linting tools check your contributions (see the dedicated section below).

You should also verify that the existing [tests](./tests) still pass,
and you are encouraged to add new ones.
You can run the tests using the following commands from the root folder:

```bash
# Create a virtual environment
python3 -m venv venv
source venv/bin/activate
# Install requirements
pip install -r requirements_dev.txt
# Run tests and get a summary of successes/failures and code coverage
pytest --durations=10 --cov-report term-missing --cov=custom_components.{{cookiecutter.domain_name}} tests
```

If any of the tests fail, make the necessary changes to the tests as part of
your changes to the integration.
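New tests can follow the same patch-based pattern the template's fixtures use: replace the method that would touch the network, then exercise the code under test. A minimal, self-contained sketch (the `ApiClient` and `get_title` names here are hypothetical illustrations, not the template's classes):

```python
from unittest.mock import patch


class ApiClient:
    """Hypothetical client standing in for the integration's API client."""

    def get_data(self):
        raise RuntimeError("would hit the network")


def get_title(client):
    """Code under test: extracts the 'title' field from the API payload."""
    return client.get_data().get("title")


def test_get_title():
    # Patch the client method, as the template's fixtures do, so the
    # test never touches the real implementation.
    with patch.object(ApiClient, "get_data", return_value={"title": "test"}):
        assert get_title(ApiClient()) == "test"


test_get_title()
```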

## Pre-commit

You can use the [pre-commit](https://pre-commit.com/) settings included in the
2 changes: 1 addition & 1 deletion {{cookiecutter.project_name}}/README.md
@@ -14,7 +14,7 @@
[![Discord][discord-shield]][discord]
[![Community Forum][forum-shield]][forum]

**TO BE REMOVED: If you need help, as a developper, to use this custom component tempalte,
**TO BE REMOVED: If you need help, as a developer, to use this custom component template,
please look at the [User Guide in the Cookiecutter documentation](https://cookiecutter-homeassistant-custom-component.readthedocs.io/en/stable/quickstart.html)**

**This component will set up the following platforms.**
1 change: 1 addition & 0 deletions {{cookiecutter.project_name}}/requirements_dev.txt
@@ -0,0 +1 @@
homeassistant
3 changes: 3 additions & 0 deletions {{cookiecutter.project_name}}/requirements_test.txt
@@ -0,0 +1,3 @@
-r requirements_dev.txt
pytest
pytest-homeassistant-custom-component
1 change: 1 addition & 0 deletions {{cookiecutter.project_name}}/tests/__init__.py
@@ -0,0 +1 @@
"""Tests for {{cookiecutter.friendly_name}} integration."""
39 changes: 39 additions & 0 deletions {{cookiecutter.project_name}}/tests/conftest.py
@@ -0,0 +1,39 @@
"""Global fixtures for {{cookiecutter.friendly_name}} integration."""
import pytest
from pytest_homeassistant_custom_component.async_mock import patch

pytest_plugins = "pytest_homeassistant_custom_component"

# This fixture is used to prevent HomeAssistant from attempting to create and dismiss persistent
# notifications. These calls would fail without this fixture since the persistent_notification
# integration is never loaded during a test.
@pytest.fixture(name="skip_notifications", autouse=True)
def skip_notifications_fixture():
"""Skip notification calls."""
with patch("homeassistant.components.persistent_notification.async_create"), patch(
"homeassistant.components.persistent_notification.async_dismiss"
):
yield


# This fixture, when used, will result in calls to async_get_data to return None. To have the call
# return a value, we would add the `return_value=<VALUE_TO_RETURN>` parameter to the patch call.
@pytest.fixture(name="bypass_get_data")
def bypass_get_data_fixture():
"""Skip calls to get data from API."""
with patch(
"custom_components.{{cookiecutter.domain_name}}.{{cookiecutter.class_name_prefix}}ApiClient.async_get_data"
):
yield


# In this fixture, we are forcing calls to async_get_data to raise an Exception. This is useful
# for exception handling.
@pytest.fixture(name="error_on_get_data")
def error_get_data_fixture():
"""Simulate error when retrieving data from API."""
with patch(
"custom_components.{{cookiecutter.domain_name}}.{{cookiecutter.class_name_prefix}}ApiClient.async_get_data",
side_effect=Exception,
):
yield
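What these fixtures do can be demonstrated with plain `unittest.mock`. This is a synchronous sketch only — the real fixtures patch async methods via the plugin's `async_mock` — and the `ApiClient` here is a hypothetical stand-in:

```python
from unittest.mock import patch


class ApiClient:
    """Hypothetical stand-in for the template's API client."""

    def async_get_data(self):
        return {"real": "data"}


client = ApiClient()

# bypass_get_data equivalent: inside the patch the real method never runs;
# the call returns a MagicMock instead of real data.
with patch.object(ApiClient, "async_get_data"):
    assert client.async_get_data() != {"real": "data"}

# error_on_get_data equivalent: side_effect makes the patched call raise.
with patch.object(ApiClient, "async_get_data", side_effect=Exception):
    try:
        client.async_get_data()
    except Exception:
        raised = True
    else:
        raised = False
assert raised

# Outside the patches, the original behavior is restored.
assert client.async_get_data() == {"real": "data"}
```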
4 changes: 4 additions & 0 deletions {{cookiecutter.project_name}}/tests/const.py
@@ -0,0 +1,4 @@
"""Constants for {{cookiecutter.friendly_name}} tests."""
from custom_components.{{cookiecutter.domain_name}}.const import CONF_PASSWORD, CONF_USERNAME

MOCK_CONFIG = {CONF_USERNAME: "test_username", CONF_PASSWORD: "test_password"}
82 changes: 82 additions & 0 deletions {{cookiecutter.project_name}}/tests/test_api.py
@@ -0,0 +1,82 @@
"""Tests for {{cookiecutter.friendly_name}} api."""
import asyncio

import aiohttp
from homeassistant.helpers.aiohttp_client import async_get_clientsession

from custom_components.{{cookiecutter.domain_name}}.api import {{cookiecutter.class_name_prefix}}ApiClient


async def test_api(hass, aioclient_mock, caplog):
"""Test API calls."""

# To test the api submodule, we first create an instance of our API client
api = {{cookiecutter.class_name_prefix}}ApiClient("test", "test", async_get_clientsession(hass))

# We then try a call to `async_get_data` after mocking the response. This is useful
# for testing any logic that lives within the function, e.g. parsing or validating
# the return data
aioclient_mock.get(
"https://jsonplaceholder.typicode.com/posts/1", json={"test": "test"}
)
assert await api.async_get_data() == {"test": "test"}

# We do the same for `async_set_title`. Note the difference in the mock call
# between the previous step and this one. We use `patch` here instead of `get`
# because we know that `async_set_title` calls `api_wrapper` with `patch` as the
# first parameter
aioclient_mock.patch("https://jsonplaceholder.typicode.com/posts/1")
assert await api.async_set_title("test") is None
Comment on lines +29 to +30

Owner: I don't understand the objective of this test, can you explain?

Contributor (author): We call the function directly so that we can test any logic that exists within the function (in this case there is none). Since we don't want to actually fetch data, we are using aioclient_mock to set up a mock return value for a patch to the URL, which is what async_set_title does under the hood.

Contributor (author): I added some comments that hopefully help, let me know what you think.


# In order to get 100% coverage, we need to test `api_wrapper` to test the code
# that isn't already called by `async_get_data` and `async_set_title`. Because the
# only logic that lives inside `api_wrapper` that is not being handled by a third
# party library (aiohttp) is the exception handling, we also want to simulate
# raising the exceptions to ensure that the function handles them as expected.
caplog.clear()
aioclient_mock.put(
"https://jsonplaceholder.typicode.com/posts/1", exc=asyncio.TimeoutError
)
assert (
await api.api_wrapper("put", "https://jsonplaceholder.typicode.com/posts/1")
is None
)
assert (
len(caplog.record_tuples) == 1
and "Timeout error fetching information from" in caplog.record_tuples[0][2]
)

caplog.clear()
aioclient_mock.post(
"https://jsonplaceholder.typicode.com/posts/1", exc=aiohttp.ClientError
)
assert (
await api.api_wrapper("post", "https://jsonplaceholder.typicode.com/posts/1")
is None
)
assert (
len(caplog.record_tuples) == 1
and "Error fetching information from" in caplog.record_tuples[0][2]
)

caplog.clear()
aioclient_mock.post("https://jsonplaceholder.typicode.com/posts/2", exc=Exception)
assert (
await api.api_wrapper("post", "https://jsonplaceholder.typicode.com/posts/2")
is None
)
assert (
len(caplog.record_tuples) == 1
and "Something really wrong happened!" in caplog.record_tuples[0][2]
)

caplog.clear()
aioclient_mock.post("https://jsonplaceholder.typicode.com/posts/3", exc=TypeError)
assert (
await api.api_wrapper("post", "https://jsonplaceholder.typicode.com/posts/3")
is None
)
assert (
len(caplog.record_tuples) == 1
and "Error parsing information from" in caplog.record_tuples[0][2]
)
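The pattern used throughout this file — mock the HTTP transport, call the real wrapper, then assert on the captured log — can be sketched without Home Assistant's test helpers. All names below are hypothetical stand-ins, and the handler-based log capture is a stdlib substitute for pytest's `caplog`:

```python
import logging
from unittest.mock import Mock

LOGGER = logging.getLogger("demo_api")


def api_wrapper(session, method, url):
    """Hypothetical wrapper mirroring the template's exception handling."""
    try:
        return getattr(session, method)(url).json()
    except TimeoutError:
        LOGGER.error("Timeout error fetching information from %s", url)


# Happy path: the mocked transport returns canned JSON, so only the
# wrapper's own logic runs (as aioclient_mock.get does in the real test).
session = Mock()
session.get.return_value.json.return_value = {"test": "test"}
assert api_wrapper(session, "get", "https://example.com/posts/1") == {"test": "test"}

# Error path: capture log records, make the transport raise, and assert
# that the wrapper swallowed the exception and logged it.
records = []
handler = logging.Handler()
handler.emit = records.append
LOGGER.addHandler(handler)
LOGGER.setLevel(logging.ERROR)

session.put.side_effect = TimeoutError
assert api_wrapper(session, "put", "https://example.com/posts/1") is None
assert len(records) == 1
assert "Timeout error fetching information from" in records[0].getMessage()
```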
77 changes: 77 additions & 0 deletions {{cookiecutter.project_name}}/tests/test_config_flow.py
@@ -0,0 +1,77 @@
"""Test {{cookiecutter.friendly_name}} config flow."""
from homeassistant import config_entries, data_entry_flow
from pytest_homeassistant_custom_component.common import MockConfigEntry

from custom_components.{{cookiecutter.domain_name}}.const import (
BINARY_SENSOR,
DOMAIN,
PLATFORMS,
SENSOR,
SWITCH,
)

from .const import MOCK_CONFIG


# Here we simulate a successful config flow from the backend.
# Note that we use the `bypass_get_data` fixture here because
# we want the config flow validation to succeed during the test.
async def test_successful_config_flow(hass, bypass_get_data):
"""Test a successful config flow."""
result = await hass.config_entries.flow.async_init(
DOMAIN, context={"source": config_entries.SOURCE_USER}
)

assert result["type"] == data_entry_flow.RESULT_TYPE_FORM
assert result["step_id"] == "user"

result = await hass.config_entries.flow.async_configure(
result["flow_id"], user_input=MOCK_CONFIG
)

assert result["type"] == data_entry_flow.RESULT_TYPE_CREATE_ENTRY
assert result["title"] == "test_username"
assert result["data"] == MOCK_CONFIG
assert result["result"]


# In this case, we want to simulate a failure during the config flow.
# We use the `error_on_get_data` mock to raise an Exception during
# validation of the input config.
async def test_failed_config_flow(hass, error_on_get_data):
"""Test a failed config flow due to credential validation failure."""

result = await hass.config_entries.flow.async_init(
DOMAIN, context={"source": config_entries.SOURCE_USER}
)

assert result["type"] == data_entry_flow.RESULT_TYPE_FORM
assert result["step_id"] == "user"

result = await hass.config_entries.flow.async_configure(
result["flow_id"], user_input=MOCK_CONFIG
)

assert result["type"] == data_entry_flow.RESULT_TYPE_FORM
assert result["errors"] == {"base": "auth"}


# Our config flow also has an options flow, so we must test it as well.
async def test_options_flow(hass):
"""Test an options flow."""
entry = MockConfigEntry(domain=DOMAIN, data=MOCK_CONFIG, entry_id="test")
entry.add_to_hass(hass)
result = await hass.config_entries.options.async_init(entry.entry_id)

assert result["type"] == data_entry_flow.RESULT_TYPE_FORM
assert result["step_id"] == "user"

result = await hass.config_entries.options.async_configure(
result["flow_id"],
user_input={platform: platform != SENSOR for platform in PLATFORMS},
)

assert result["type"] == data_entry_flow.RESULT_TYPE_CREATE_ENTRY
assert result["title"] == "test_username"

assert entry.options == {BINARY_SENSOR: True, SENSOR: False, SWITCH: True}
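The `user_input` passed above is built with a dict comprehension that enables every platform except the sensor. In isolation, assuming the constants hold the usual Home Assistant platform strings:

```python
# Constants as assumed from the template's const.py (hypothetical values).
BINARY_SENSOR, SENSOR, SWITCH = "binary_sensor", "sensor", "switch"
PLATFORMS = [BINARY_SENSOR, SENSOR, SWITCH]

# Enable every platform except the sensor, as in the options-flow test.
user_input = {platform: platform != SENSOR for platform in PLATFORMS}

assert user_input == {BINARY_SENSOR: True, SENSOR: False, SWITCH: True}
```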
56 changes: 56 additions & 0 deletions {{cookiecutter.project_name}}/tests/test_init.py
@@ -0,0 +1,56 @@
"""Test {{cookiecutter.friendly_name}} setup process."""
from homeassistant.exceptions import ConfigEntryNotReady
import pytest
from pytest_homeassistant_custom_component.common import MockConfigEntry

from custom_components.{{cookiecutter.domain_name}} import (
{{cookiecutter.class_name_prefix}}DataUpdateCoordinator,
async_reload_entry,
async_setup_entry,
async_unload_entry,
)
from custom_components.{{cookiecutter.domain_name}}.const import DOMAIN

from .const import MOCK_CONFIG


# We can pass fixtures as defined in conftest.py to tell pytest to use the fixture
# for a given test. We can also leverage fixtures and mocks that are available in
# Home Assistant using the pytest_homeassistant_custom_component plugin.
# Assertions allow you to verify that the return value of whatever is on the left
# side of the assertion matches with the right side.
async def test_setup_unload_and_reload_entry(hass, bypass_get_data):
"""Test entry setup and unload."""
# Create a mock entry so we don't have to go through config flow
config_entry = MockConfigEntry(domain=DOMAIN, data=MOCK_CONFIG, entry_id="test")

# Set up the entry and assert that the values set during setup are where we expect
# them to be. Because we have patched the {{cookiecutter.class_name_prefix}}DataUpdateCoordinator.async_get_data
# call, no code from custom_components/{{cookiecutter.domain_name}}/api.py actually runs.
assert await async_setup_entry(hass, config_entry)
assert DOMAIN in hass.data and config_entry.entry_id in hass.data[DOMAIN]
assert (
type(hass.data[DOMAIN][config_entry.entry_id]) == {{cookiecutter.class_name_prefix}}DataUpdateCoordinator
)

# Reload the entry and assert that the data from above is still there
assert await async_reload_entry(hass, config_entry) is None
assert DOMAIN in hass.data and config_entry.entry_id in hass.data[DOMAIN]
assert (
type(hass.data[DOMAIN][config_entry.entry_id]) == {{cookiecutter.class_name_prefix}}DataUpdateCoordinator
)

# Unload the entry and verify that the data has been removed
assert await async_unload_entry(hass, config_entry)
assert config_entry.entry_id not in hass.data[DOMAIN]


async def test_setup_entry_exception(hass, error_on_get_data):
"""Test ConfigEntryNotReady when API raises an exception during entry setup."""
config_entry = MockConfigEntry(domain=DOMAIN, data=MOCK_CONFIG, entry_id="test")

# In this case we are testing the condition where async_setup_entry raises
# ConfigEntryNotReady using the `error_on_get_data` fixture which simulates
# an error.
with pytest.raises(ConfigEntryNotReady):
assert await async_setup_entry(hass, config_entry)
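`pytest.raises(ConfigEntryNotReady)` passes exactly when the block raises that exception. A stdlib-only sketch of the same check (the setup function here is a hypothetical stand-in, not the template's `async_setup_entry`):

```python
class ConfigEntryNotReady(Exception):
    """Stand-in for homeassistant.exceptions.ConfigEntryNotReady."""


def setup_entry_sketch():
    """Hypothetical setup that re-raises an API failure as ConfigEntryNotReady."""
    try:
        raise Exception("API unavailable")  # what error_on_get_data forces
    except Exception as err:
        raise ConfigEntryNotReady from err


# pytest.raises(ConfigEntryNotReady) passes exactly when this does:
try:
    setup_entry_sketch()
except ConfigEntryNotReady:
    raised = True
else:
    raised = False

assert raised
```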